Version: 2.6.0
April 2021

Introduction

SSIMPLUS® VOD Monitor Inspector is the only Viewer Experience measurement software built on the algorithm trusted by Hollywood. It helps R&D groups, engineers, and architects who set up VOD encoding and processing workflows, or who make purchasing recommendations, determine the best possible configuration. SSIMPLUS VOD Monitor Inspector accurately analyzes and predicts the end viewer’s perception of video quality with the adaptive SSIMPLUS Viewer Score and automated comparison. It presents complex algorithms in a simple, actionable interface, so you know exactly what video quality the end viewer will experience on each device and can locate the finest detail of distortion. You can also use VOD Monitor Inspector to understand how a specific encoder or transcoder behaves across different content types, resolutions, and frame rates.

VOD Monitor Inspector determines the Source Quality and the encoder/transcoder performance (how different the output is from the source) and creates the overall Viewer Score of the output.

SSIMPLUS VOD Monitor Inspector features

Supported formats

Category Supported
Media container formats AV1, AVI, AVR, AVS, DV, FLV, GXF, H261, H263, H264, HEVC, HLS, IFF, IVR, LVF, LXF, M4V, MJ2, Mjpeg, Mjpeg_2000, MOV, MP4, MPEG-PS, MPEG-TS, MKV, MPV, MVE, MXF, VP9, V210, WebM, YUV, 3G2, 3GP, Y4M
Video codecs Apple ProRes, AV1, Google VP9, H.261, H.263 / H.263-1996, H.263+ / H.263-1998 / H.263 version 2, H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10, HEVC, JPEG 2000, JPEG-LS, MPEG-1 video, MPEG-2 video, MPEG-4 part 2, On2 VP3, On2 VP5, On2 VP6, On2 VP7, On2 VP8, QuickTime Animation video, Theora, Windows Media Video 7, Windows Media Video 8, Windows Media Video 9
HDR formats HDR10, HLG*, Dolby Vision® (supported in .mp4*)

*other formats for Dolby Vision® content will be available in the future

Using VOD Monitor Inspector

Use the menu on the left to select the task that you want to perform. Then, use the work panes to refine the task and view information. As you work, use the toggles to expand or collapse the menu and work panes.

About search options

You can refine your search results in various ways, depending on where you are in the UI.

Managing jobs

About Reference files

When running a job, you can select a video to use as a reference file, meaning all of the other videos that you select for the job are compared against the reference file. When analyzing videos in full-reference mode, you can compare outputs across resolutions. Reference files are usually the highest quality videos that you have compressed. If you do not flag a reference file, the videos run in single-reference mode. Full-reference mode is available for Dolby Vision® content, as well as for HDR10.

Add a video folder

For both on-premises and virtual machine deployments, you must add paths to the videos stored on your network before you can add your first job. For information on mounting videos, refer to the SSIMPLUS® VOD Monitor Inspector deployment guide.

  1. From the menu, click + Add Job.
  2. In the left work pane, click + Add video folder.
  3. In the Add video folder dialog box, type the path for the folder location.
  4. Click Add.

Add an adaptive bitrate URL

You can add adaptive bitrate (ABR) URLs that contain the manifest of all of the profiles for a video stream.

  1. From the menu, click + Add Job.
  2. In the left work pane, click + Add ABR URL.
  3. In the Add ABR URL dialog box, type the network path.
  4. Click Add.

Add a job

You can run jobs concurrently.

  1. From the menu, click + Add Job.
  2. In the left work pane, navigate to the video folder or ABR URL folder.
  3. To select video files, click the file names. You can select multiple files.
  4. To select ABR URLs, click the file names. You can select multiple URLs.
  5. To remove a file from the job, perform one of the following actions:
    • In the left work pane, click the file name to clear the selection.
    • In the right work pane, beside the file name, click Remove.
  6. To set a reference file, in the right work pane, under Video, beside the file that you want to use as the reference file, click the Reference flag.
  7. In the right work pane, under Template, select a template or use the default template.
  8. Click Submit Job(s).
  9. In the right work pane, click the Jobs Page link or, from the menu, click Jobs. On the In progress tab, you can observe VOD Monitor Inspector processing the files. When the job is complete, it appears on the Job List tab.

Cancel an in-progress job

  1. From the menu, click Jobs.
  2. Click the In progress tab.
  3. In the work pane, select a job.
  4. Click Cancel Job.

Delete a job from the Job List

If you need to, you can delete jobs from the Job List. Deleting old jobs can make it easier to find the jobs that you need.

  1. From the menu, click Jobs.
  2. Click the Job List tab.
  3. In the work pane, select a job from the list.
  4. Click Delete.

Resubmit a job

If a job times out, you can resubmit it. You might also want to resubmit a job after making changes to the template, such as when adding more devices or changing the acceptable threshold.

  1. From the menu, click Jobs.
  2. Click the Job List tab.
  3. In the work pane, select a job from the list.
  4. Click Re-submit.

Related topic: Edit a template

Monitor jobs

  1. From the menu, click Jobs.

    On the In progress tab, the jobs currently in progress appear, along with the test video, the reference video, template, date submitted, status, time elapsed, and time remaining. To view more details on the template, click . While the videos are aligning, the status appears as Aligning. After the videos are aligned, the analysis starts.

    On the Job List tab, the completed jobs appear. This tab also shows the date and time that a job was submitted and how long it took to process. If a job fails, check the Note column for more information.

    Sometimes jobs fail because the frame rates of the reference and test video represent an unsupported cross-frame rate combination. In general terms, VOD Monitor Inspector supports the following cross-frame rate criteria:

    • The frame rate of the reference video is the same as the frame rate of the test video.
    • The frame rate of the reference video is two times the frame rate of the test video.
    • The difference between the frame rates of the reference video and the test video is less than 0.01.
    • The difference between the frame rate of the reference video and two times the frame rate of the test video is less than 0.01.

    In addition to the general cross-frame rate rules above, VOD Monitor Inspector has been enhanced to support a number of common cross-frame rate scenarios that arise when comparing Drop-Frame (DF) with Non Drop-Frame (NDF) videos (a scripted version of these rules follows the list below), including:

    • 23.98 vs 24
    • 24 vs 23.98
    • 29.97 vs 30
    • 30 vs 29.97
    • 59.94 vs 60
    • 60 vs 59.94
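
    Note: For scripting or pre-flight checks outside the UI, the rules above can be expressed directly in code. The following is a minimal illustrative Python sketch based only on the criteria listed in this section; the function name and the use of rounding for the DF/NDF pairs are assumptions, not part of the product.

    # Illustrative check of the cross-frame rate rules listed above.
    # Not part of VOD Monitor Inspector; names and structure are assumptions.
    DF_NDF_PAIRS = {
        (23.98, 24), (24, 23.98),
        (29.97, 30), (30, 29.97),
        (59.94, 60), (60, 59.94),
    }

    def cross_frame_rates_supported(ref_fps: float, test_fps: float) -> bool:
        """Return True if the reference/test frame-rate pair satisfies the
        general cross-frame rate criteria or a known DF/NDF combination."""
        # Same frame rate, or reference is exactly twice the test frame rate.
        if ref_fps == test_fps or ref_fps == 2 * test_fps:
            return True
        # Difference between reference and test frame rates is less than 0.01.
        if abs(ref_fps - test_fps) < 0.01:
            return True
        # Difference between reference and twice the test frame rate is less than 0.01.
        if abs(ref_fps - 2 * test_fps) < 0.01:
            return True
        # Common Drop-Frame vs Non Drop-Frame combinations listed above.
        return (round(ref_fps, 2), round(test_fps, 2)) in DF_NDF_PAIRS

    # Example: a 59.94 fps reference against a 29.97 fps test video is supported.
    print(cross_frame_rates_supported(59.94, 29.97))  # True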

Adjust the number of jobs that appear on the Jobs tabs

The number of jobs per page appears on the In Progress and Job List tabs below the table on the work pane. You can increase or decrease the number of jobs that appear in increments of 5, from a minimum of 5 to a maximum of 50. The default is 20. Jobs also appear on the System Admin page.

  1. On the In Progress or Job List tab, use the Jobs per page arrows at the bottom of the work pane to select the number of jobs that appear.

Managing results

When jobs are complete, you can view the analysis results on the Results page. The threshold (as defined in the template) appears on the SSIMPLUS Viewer Score (SVS) and Encoder Performance Score (EPS) tabs. See the Scoring Terminology section for an explanation of the scoring terms used throughout the Results page.

SSIMPLUS results now also include a SSIMPLUS® Banding Score (SBS), which appears as the last numeric column of the results table. The score also has its own SSIMPLUS Banding Score (SBS) tab, where you can explore the plotted scores in detail and click each frame to see its corresponding Banding Map. Banding results are also available on the Quality Buckets tab, where you can see what percentage of each asset is above the Color Banding threshold recommended by the SSIMWAVE research team. An SBS for each frame automatically appears in your exported CSV file. For full descriptions of the scoring terms, consult the Scoring Terminology section.

View the results of a full-reference job

  1. From the menu, click Results.
  2. In the left work pane, under Choose a Video, click the job on the FULL-REFERENCE tab. In the right work pane, the Asset SVS and Asset EPS scores appear in the table, and the reference file (represented as a white dotted line) and encoding outputs appear on the SSIMPLUS Viewer Score (SVS) tab.
  3. To view the Frame Index and Score for a specific frame, hover your mouse over a colored line segment.
  4. To open the Reference vs Test comparison pane, click the frame segment.
  5. To view how closely the encoded output matches the source, click the Encoder Performance Score (EPS) tab.
  6. To view video bitrates, click the Rate Quality Curve tab.
  7. To view the quality breakdown, click the Quality Buckets tab.
  8. To view the video’s statistics, click the Statistics tab. The gray box indicates the Viewer Score.
  9. To view the results for a different device, in the right work pane, in the table, select a device from the Select Device drop-down list. Click Apply.

View a full-reference job’s Comparison pane, Quality Maps and Banding Maps

By using the Comparison pane, you can see a visualization of the video’s score. You can also use the Comparison pane to compare the highest quality reference video to the test video.

In the Comparison pane, you can turn on Quality Maps to see the test video’s impairments compared to the reference video in a grayscale map. Quality Maps are grayscale presentations of pixel-level perceptual quality that show the spatial distribution of impairments within a frame and provide the reason behind the quality score. Dark pixels show the impairments compared to the reference file. Areas that are less important to the viewer, such as the area around text, might have more white pixels. Generally, the darker the image, the lower the score.

In the Comparison pane, you can also turn on Banding Maps to see the areas of color banding as compared to the reference video in a binary map. SSIMPLUS Banding Maps measure color banding presence at a pixel level as viewed by an “expert” on an OLED TV using a no-reference approach. The map is generated as part of one of several steps used in computing a SSIMPLUS Banding Score (SBS). The banding map is a binary map with white pixels showing banding presence, and does not reflect pixel-level variations in banding impairment visibility.
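
Note: The maps themselves are produced by the SSIMPLUS algorithm; the following sketch only illustrates how the two kinds of map are read, by rendering a hypothetical per-pixel quality array as a grayscale image (dark = impaired) and a hypothetical banding-presence array as a binary image (white = banding). The arrays, filenames, and scaling here are assumptions for illustration only.

# Illustration of how Quality Maps and Banding Maps are read, using
# hypothetical per-pixel arrays (not the SSIMPLUS algorithm itself).
import numpy as np
from PIL import Image

h, w = 270, 480
# Hypothetical per-pixel quality in [0, 100]; low values are impairments.
quality = np.random.default_rng(0).uniform(0, 100, size=(h, w))
# Hypothetical banding presence flags (True where banding is detected).
banding = np.random.default_rng(1).random((h, w)) > 0.95

# Quality Map: grayscale, darker pixels indicate stronger impairments.
quality_map = (quality / 100.0 * 255).astype(np.uint8)
Image.fromarray(quality_map).save("quality_map.png")

# Banding Map: binary, white pixels indicate banding presence.
banding_map = banding.astype(np.uint8) * 255
Image.fromarray(banding_map).save("banding_map.png")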

  1. From the menu, click Results.
  2. From the left work pane, on the FULL-REFERENCE tab, select a job.
  3. In the right work pane, on the SSIMPLUS Viewer Score (SVS) tab, click a frame segment in the graph. The reference file appears on the left and the comparison file appears on the right. At the top of the screen, the title bar shows the file information, including the Viewer Score and the encoder/transcoder performance for both files.
  4. To perform a manual comparison of the two files, click and drag the blue slider.
  5. To view a SSIMPLUS Quality Map in the comparison pane, perform the following actions:
    1. On the title bar, next to the file information, click the blue arrow.
    2. Click Quality Maps.
  6. To view a SSIMPLUS Banding Map in the comparison pane, perform the following actions:
    1. On the title bar, next to the file information, click the blue arrow.
    2. Click Banding Maps.
  7. To change which files appear in the comparison pane, perform the following actions:
    1. On the title bar, next to the file information, click the blue arrow.
    2. Select the files that you want to appear.

View the results of a single-reference job

  1. From the menu, click Results.
  2. In the left work pane, under Choose a Video, click the job on the SINGLE-ENDED tab. In the right work pane, the encoding output appears on the SSIMPLUS Viewer Score (SVS) tab.
  3. To view the Frame Index and Score for a specific frame, hover your mouse over a line segment.
  4. To view the quality breakdown, click the Quality Buckets tab.
  5. To view the video’s mean score, click the Statistics tab. The gray box indicates the Viewer Score.

Print job results

  1. From the menu, click Results.
  2. In the left work pane, select a job.
  3. In the right work pane, click Print page.
  4. In the print window, click Print.
  5. To create a PDF file, click Open PDF in Preview.

Add results

  1. From the menu, click Results.
  2. In the left work pane, select a job.
  3. Click + Add Results.
  4. Navigate to the file location.
  5. Click OK.

Remove results

  1. From the menu, click Results.
  2. In the left work pane, select a job.
  3. In the right work pane, in the table, click the checkbox next to a file.
  4. Click Remove Results.

Delete results

  1. From the menu, click Results.
  2. In the left work pane, select a job.
  3. In the right work pane, in the table, click the checkbox next to a file.
  4. Click Delete Results.

Download a CSV report

  1. From the menu, click Results.
  2. In the left work pane, select a job.
  3. In the right work pane, in the table, click the checkbox next to a file.
  4. Click Download CSV.
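
Note: Once downloaded, the per-frame scores in the CSV can be post-processed with standard tools. The sketch below assumes column names such as "Frame Index" and "SVS"; check the header row of your exported file first, because the exact column names may differ.

# Plot per-frame scores from a downloaded VOD Monitor Inspector CSV report.
# The column names below ("Frame Index", "SVS") are assumptions; inspect the
# header row of your own export and adjust accordingly.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("job_results.csv")
print(df.columns.tolist())                    # confirm the actual column names first

frame_col, score_col = "Frame Index", "SVS"   # adjust to match your file
plt.plot(df[frame_col], df[score_col])
plt.axhline(80, linestyle="--", label="Quality threshold (example: 80)")
plt.xlabel("Frame index")
plt.ylabel("SSIMPLUS Viewer Score")
plt.legend()
plt.savefig("svs_per_frame.png", dpi=150)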

Import a CSV report

You can import a CSV report into VOD Monitor Inspector to create graphical depictions of the data.

  1. From the menu, click Results.
  2. Perform one of the following actions:
    • In the right work pane, click Import CSV report. Browse to the file location. Click the file. Click Open.
    • On your computer, navigate to the file. Click and drag it onto the right work pane.

Export a CSV report

You can create a spreadsheet with the data from a job.

  1. From the menu, click Results.
  2. In the right work pane, click Export CSV report.
  3. Browse to the location where you want to save the file.
  4. Click Save.

Adjust Quality Buckets

The Quality Buckets tab shows the file information in a bar graph, as opposed to the Viewer Score line graph. In this view, you can see how the video frames stack up in quality buckets. You can change the size of the buckets to increase or decrease their granularity.

  1. From the menu, click Results.
  2. In the left work pane, select either a full-reference or single-reference job.
  3. In the right work pane, click the Quality Buckets tab.
  4. To select the size of the buckets that you want to show, at the bottom of the screen, select 5, 10, or 20.

Tip: Hover your mouse over a bar to see the percentage per video in each bucket.
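
Note: If you export the per-frame scores to CSV, the same bucketing can be reproduced outside the UI. The following sketch groups per-frame scores into buckets of size 5, 10, or 20 and reports the percentage of frames per bucket; it only mirrors the idea described above and is not the product's implementation.

# Group per-frame scores (0-100) into quality buckets and report the
# percentage of frames in each bucket. Mirrors the Quality Buckets idea;
# not the product's own implementation.
from collections import Counter

def quality_buckets(scores, bucket_size=10):
    """Return {bucket_lower_bound: percent_of_frames} for the given scores."""
    assert bucket_size in (5, 10, 20)
    counts = Counter()
    for s in scores:
        lower = min(int(s // bucket_size) * bucket_size, 100 - bucket_size)
        counts[lower] += 1
    total = len(scores)
    return {lo: 100.0 * n / total for lo, n in sorted(counts.items())}

frame_scores = [92.1, 88.4, 79.9, 85.0, 61.2, 95.7, 100.0]
size = 10
for lower, pct in quality_buckets(frame_scores, bucket_size=size).items():
    print(f"{lower}-{lower + size}: {pct:.1f}% of frames")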

Managing templates

Templates are applied to jobs and define how the job completes the video analysis.

As part of creating a template, you can set the Temporal Alignment options. Sometimes, encoding tools add or delete a frame on the output file, throwing off alignment between the source and the encoded file; Temporal Alignment corrects this disparity. When comparing multiple videos with different lengths, resolutions, and frame rates, VOD Monitor Inspector starts the analysis on the same frame in each video, ensuring an accurate score. Temporal alignment is on by default.
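
Note: The following is a minimal conceptual sketch of the idea behind temporal alignment, assuming per-frame brightness signatures and a simple best-offset search. It is purely illustrative and is not the alignment algorithm used by VOD Monitor Inspector.

# Conceptual illustration of temporal alignment: find the frame offset that
# best matches a per-frame signature (e.g., mean luma) of the test video
# against the reference. Not the product's actual alignment algorithm.
import numpy as np

def best_frame_offset(ref_sig, test_sig, max_offset=5):
    """Return the offset (in frames) that minimizes the mean absolute
    difference between the reference and test signatures."""
    ref_sig = np.asarray(ref_sig, dtype=float)
    test_sig = np.asarray(test_sig, dtype=float)
    best, best_err = 0, float("inf")
    for off in range(-max_offset, max_offset + 1):
        # A positive offset means reference frame `off` aligns with test frame 0.
        a = ref_sig[max(off, 0):]
        b = test_sig[max(-off, 0):]
        n = min(len(a), len(b))
        if n == 0:
            continue
        err = float(np.mean(np.abs(a[:n] - b[:n])))
        if err < best_err:
            best, best_err = off, err
    return best

# Example: the test signature is the reference shifted by two frames.
ref = [10, 12, 30, 31, 29, 50, 51, 49, 20, 21]
test = ref[2:]
print(best_frame_offset(ref, test))  # 2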

Create a template

  1. From the menu, select Settings > Templates.
  2. In the left work pane, click + New Template.
  3. In the right work pane, type a template name.
  4. Under Viewer Mode, select either the Regular or Expert checkbox.
  5. From the Devices table, select the checkboxes for the devices that you want to include. To select all devices, select the Device checkbox.
  6. Perform one or more of the following actions:
    • To add a device that is not included in the main Device list, click + Add Device. In the list, select the applicable devices. Click Apply.
    • To remove a device from the list, select the device. Click X Remove Devices.
  7. In the right work pane, click Advanced Settings.
  8. Define the Quality Threshold. The baseline for excellence is 80.
  9. Under Settings, select the options for the template, including Peak Signal-to-Noise Ratio (PSNR), Mean Squared Error (MSE), and High Dynamic Range (HDR) for elementary streams and HLG. The PSNR and MSE do not display in the UI but are available in the downloaded CSV file.
  10. Under Temporal Alignment, select the options for the template, including Auto Temporal Alignment, Start Frame Reference, Start Frame Test, Frames to Process.
  11. Under Region of Interest (ROI), if you have videos with borders that you want to exclude, define the Reference and Test x and y axis and the width and height. The job analysis provides a score for that ROI only. You must know exactly how big the borders are on the top, bottom, and sides so that the reference and source match for a full-reference comparison.
  12. To perform a manual analysis of the source file of either full-reference or single-reference raw, uncompressed videos, under Raw Video Parameters, define the Reference and Test width, height, frame rate (f), color (c), and bit depth (b).
  13. For Multi Program Transport Stream, or MPTS, under Program ID (PID), define the PID for both the Reference and Test videos.
  14. To include a comment in the results, perform the following actions:
    1. Select Remark.
    2. Type the comment in the field.
  15. If required (especially if you are using Watch Folders), to define where you want to store the video files after processing, perform the following actions:
    1. Under Output Folders, click Error Folder.
    2. Select the location where you want to store the files. VOD Monitor Inspector moves the video files to the selected folders. A yellow triangle means the folder does not have write permissions; make sure to select a folder with write permissions.
    3. In the Acceptable Quality Threshold field, set the target percentage.
    4. Click Apply.
    5. Repeat steps 1 to 3 for the Acceptable Quality Folder and Unacceptable Quality Folder. The Acceptable Quality Folder is only available on premises (not in the Cloud version).
  16. Click Save.

Edit a template

  1. From the menu, select Settings > Templates.
  2. From the left work pane, select a template.
  3. Click Edit.
  4. Make your changes in the right work pane.
  5. Perform one of the following actions:
    • To save the template with the same name, click Save.
    • To save the template with a new name, click Save As. Type the new name for the template. Click Save.

Export a template as XML or JSON

You can export your templates if you want to create a backup version.

  1. From the menu, select Settings > Templates.
  2. Select a template.
  3. In the left work pane, click XML or JSON.
  4. Click OK.

Import a template

If you exported your template files, you can import them back into the system.

  1. From the menu, select Settings > Templates.
  2. In the left work pane, click Import.
  3. Navigate to the file.
  4. Click Open.

Delete a template

  1. From the menu, select Settings > Templates.
  2. From the left work pane, select a template.
  3. Click Delete.
  4. Click OK.

How to activate the SSIMPLUS® Banding Score

  1. Choose a new template.
  2. In the right work pane, under Advanced Settings, select Color Banding Detection.
  3. Define the Color Banding Threshold.
    This threshold has been preset to 20 as SSIMWAVE's research has shown that SSIMPLUS® Banding Scores below that value are imperceptible. You can learn more about the SSIMPLUS® Banding Score and its scale in the Scoring Terminology section.

Managing comparisons

VOD Monitor Inspector uses the Comparison page to compare different setups, including encoders, codecs, encoder configurations, and content types. The Rate SSIMPLUS Curve shows the video Quality Score.

For example, if you have a source video that was encoded using two (or more) different encoders into a full ladder (with different resolutions and bitrates), you can compare the quality scores visually to make informed decisions.

You can also use the Comparison page to calculate SSIMPLUS gains. The calculation uses the Bjontegaard metric for curve fitting.
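
Note: The following is a rough sketch of a Bjontegaard-style calculation: quality scores are fitted against log bitrate with a cubic polynomial, and the average quality difference between two rate-quality curves is computed over the overlapping bitrate range. This is the textbook form only and is purely illustrative; the product's exact curve-fitting details are not documented here, and the function name and sample ladders are assumptions.

# Illustrative Bjontegaard-style average quality gain between two
# rate-quality curves (bitrate in kbps, quality on the 0-100 scale).
# Textbook form only; the product's exact calculation may differ.
import numpy as np

def bd_quality_gain(rates_a, quality_a, rates_b, quality_b):
    """Average quality difference of curve B over curve A across the
    overlapping log-bitrate range (positive means B scores higher)."""
    log_a, log_b = np.log10(rates_a), np.log10(rates_b)
    poly_a = np.polyfit(log_a, quality_a, 3)
    poly_b = np.polyfit(log_b, quality_b, 3)
    lo = max(log_a.min(), log_b.min())
    hi = min(log_a.max(), log_b.max())
    int_a = np.polyval(np.polyint(poly_a), hi) - np.polyval(np.polyint(poly_a), lo)
    int_b = np.polyval(np.polyint(poly_b), hi) - np.polyval(np.polyint(poly_b), lo)
    return (int_b - int_a) / (hi - lo)

# Example ladders for two encoders (bitrate in kbps, quality score).
rates_x, qual_x = np.array([1000, 2000, 4000, 8000]), np.array([62, 74, 83, 90])
rates_y, qual_y = np.array([1000, 2000, 4000, 8000]), np.array([66, 78, 86, 92])
print(f"Average quality gain of Y over X: {bd_quality_gain(rates_x, qual_x, rates_y, qual_y):.2f}")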

  1. From the menu, click Comparison.
  2. In the left work pane, select the source video.
  3. In the right work pane, type a category name for a codec.
  4. From the Device list, select the target viewing device.
  5. From the Results list, under the category, select the videos that belong to the category.
  6. Click Apply.
  7. For each codec, perform the following actions:
    1. Click + Add Category.
    2. Repeat steps 3 to 6.
  8. For a quantitative comparison, from the drop-down Compare menus at the bottom of the right work pane, select two categories.

About Player Test

Using Player Test, you can test how a standard, open-source streaming video player responds to network conditions during video playout of your HLS streams. Use the player to see how network conditions affect the Viewer Score, bitrate, variant switches, and buffer size. Player Test enables a three-point monitoring approach in SSIMPLUS VOD Monitor Inspector.

Test video playout with Player Test

  1. From the menu, click Player Test.
  2. In the left work pane, select a video from the Full-Reference tab.
  3. In the right work pane, select a Viewer Device from the list. The devices in the list are populated from the template used to run the initial job for the video. You can switch devices during playout.
  4. On the video, click Play.
  5. Select one of the following options:
    • To display the immediate score of the video played thus far, click Current.
    • To display the average score for the entire viewing session, click Overall.
  6. The SSIMPLUS Viewer Score, Playout Bitrate, Download Rate, Buffer Length, and Variant display to the right of the video and graphically on the tabs below the video. Select a tab to view the graph.
  7. To view information for a specific frame, hover your mouse over a line segment.
  8. To view the video in full screen, click . The graph information overlays the video playout.
  9. In full screen mode, mouse over the video to change the graph transparency and switch between tabs.

About Watch Folders

Add a watch folder to automatically run jobs when files are added. Watch folders detect when you add a new file to the identified folders.

As with manually-run jobs, VOD Monitor Inspector requires reference and test video folders with corresponding reference videos and test videos. Create a new template or edit an existing template and identify the output folder for the videos. After VOD Monitor Inspector runs the jobs for the videos in the Watch Folder, the videos are removed from the Watch Folder.

While VOD Monitor Inspector processes the jobs, they display on the Jobs page, first on the In Progress tab and then on the Job List tab, like manually-run jobs. VOD Monitor Inspector continues to run jobs until the Watch Folder is empty, scanning the folders every 30 seconds for new files. To automate more jobs, add more video files to the Watch Folder.

You can add multiple Watch Folders.

After VOD Monitor Inspector runs the jobs, you can check the Results folder. If you encounter errors, confirm that you have appropriate write and move permissions, both of which are set during deployment. Contact SSIMWAVE support for further assistance at support@ssimwave.com.

Add a Watch Folder

  1. Click Settings > Watch Folder.
  2. In the work pane, click + Add watch folder.
  3. In the left pane, to select a reference video folder from the existing folder directory, perform the following actions:
    • Click the folder icon to expand the directory.
    • Click the folder name to add it to the watch folder.
    • Repeat this step for the test video folder.
  4. In the right work pane, click the reference file icon to identify the reference file. If you do not select a reference file, the job will run as a Single-Ended job.
  5. Select a template from the Template drop-down list.
  6. Click Add Watch Folder.

Manage system settings

Set the maximum number of concurrent jobs

  1. From the menu, click Settings > System Admin.
  2. In the left work pane, select the node.
  3. In the right work pane, under Settings, set the number of Concurrent Jobs.

Set the job timeout threshold

The job timeout threshold is not the duration defined to run a job, but the time that can elapse before a job starts. If the job does not start within the threshold, the job fails.

  1. From the menu, click Settings > System Admin.
  2. In the right work pane, under Settings, set the Job Timeout field.

Monitor jobs in cluster environments

If you are working in a cluster environment, you can use the System Admin page to monitor jobs on different nodes.

  1. From the menu, click Settings > System Admin.
  2. In the left work pane, select the node.

In the right work pane, the jobs appear. Error messages display at the top of the screen.

Support Package

VOD Monitor Inspector can fetch various information from your running system and collect it into a compressed zip file called a support package. Typically, you generate a support package to attach it to a problem ticket created in SSIMWAVE's VOD Monitor Support system. Support packages are comprehensive enough, however, that you may also find them useful for data backups and system migrations.

Currently, the following items can be included in a support package:

Scoring Terminology

Within VOD Monitor Inspector, you have the following SSIMPLUS measurements:

SSIMPLUS Viewer Score (SVS)

A SSIMPLUS Viewer Score (SVS) is a measurement of the overall quality experienced by the viewer, as it considers both the quality of the original asset and the impact of the encoding process as perceived by the human visual system. It is the single, authoritative value by which all video can be measured.

SSIMPLUS Encoder Performance Score (EPS)

A SSIMPLUS Encoder Performance Score (EPS) assumes that the source asset is pristine and focuses solely on measuring the degradation introduced by encoder impairments. EPSs are best suited to judging the comparative quality of various encoders and/or encoder settings.

Both SVS and EPS scores take into account the human visual system and the device on which the asset is being viewed.

Asset SVS and EPS

In addition to frame-level measurements, SSIMPLUS VOD Monitor Inspector provides asset-level scores, which employ additional intelligence beyond the simple mathematical average of frame scores to arrive at a single score for an entire asset. An asset SSIMPLUS Viewer Score (Asset SVS), for example, provides a value that can be used to judge the overall quality of an entire asset, taking into account the lingering effect of poor quality frames, as perceived by the human visual system. The asset SVS is particularly useful when your goal is to perform source selection on a population of assets by first removing all those with unacceptably low overall quality. Similarly, an asset Encoder Performance Score (Asset EPS) can provide a single judgement of the encoder’s performance, across an entire asset.
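
Note: The exact asset-level pooling intelligence is not documented here. The following sketch only illustrates the general idea that an asset-level score should penalize lingering poor-quality segments more than a plain mean would; the weighting used is an assumption for illustration and is not SSIMWAVE's actual asset SVS/EPS pooling.

# Illustrative asset-level pooling that weights poor-quality frames more
# heavily than a plain mean. The weighting below is an assumption for
# illustration only, not SSIMWAVE's asset SVS/EPS pooling.
import numpy as np

def illustrative_asset_score(frame_scores, severity=2.0):
    """Pool per-frame scores (0-100) into one asset score, giving extra
    weight to low-scoring frames so quality dips are not averaged away."""
    s = np.asarray(frame_scores, dtype=float)
    weights = 1.0 + severity * (100.0 - s) / 100.0   # lower score => larger weight
    return float(np.sum(weights * s) / np.sum(weights))

frames = [88, 90, 91, 35, 40, 89, 92, 90]            # a short quality dip
print(round(float(np.mean(frames)), 1))              # plain mean: 76.9
print(round(illustrative_asset_score(frames), 1))    # pooled score is lower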

SSIMPLUS Banding Score (SBS)

A SSIMPLUS Banding Score (SBS) is a measurement of the amount of color banding that is present within the asset. Banding, otherwise known as color contouring, is a common artifact of the video compression process and is due to the quantization introduced by the video codec. It is most obvious in areas where the content calls for the rendering of smooth gradient regions, such as the sky, water, or ground. In these regions, pixel values that are meant to be represented uniquely, so as to provide a gentle transition from one shade or color to another, are instead aggregated into a much smaller set of values, creating discrete bands of color. The SSIMPLUS algorithm analyzes each frame of your asset for the presence of banding and records a score indicating the severity of the banding, according to the following scale:

The following frame is an example of an SBS of 0, meaning that there is no discernible banding present:
This next frame is an example of an SBS of 68, which would qualify as annoying. Notice the discrete bands of color in both the sky and the asphalt track.

SSIMPLUS Algorithm

Introduction

SSIMPLUS stands for the “Structural Similarity Plus index” for video quality assessment (VQA). It was developed based on the Structural Similarity (SSIM) index, which has been widely used in both academia and industry for perceptual quality assessment. The SSIM algorithm was recognized by the Television Academy and awarded the 67th Primetime Engineering Emmy Award in 2015. SSIMPLUS has improved upon SSIM in terms of video QoE accuracy, speed, applicability, and flexibility. It has been the driving force behind SSIMWAVE's development of innovative software solutions that are revolutionizing the delivery of video from end to end, allowing users to measure, maximize, and monetize their efforts.

The algorithm was developed with the end user in mind and the objective of preserving creative intent. It enables users to access video QoE scores at any point within the video delivery chain in order to pinpoint the cause of QoE degradation.

SSIMPLUS possesses unique features that create the most accurate and most complete QoE scores, including cross-content, cross-resolution, and cross-frame rate video quality assessment. It produces three quality scores so users can gain a full understanding of any modification that causes a change in overall video QoE. These scores include the source QoE score, a device-adaptive perceptual fidelity score, and a device-adaptive uniform output QoE score, as shown in Fig. 2. The source score is a quality measurement for input/source video of an encoder/transcoder. The perceptual fidelity score is a relative quality measurement for output/test video compared with input/source video. The uniform output score is an absolute quality description for output/test video that predicts the human opinion towards its quality.

Fig. 2 A framework of SSIMPLUS

Single ended measurement

The source video quality assessment is a single-ended, no-reference VQA metric that takes the source video of an encoder/transcoder as its only input and provides QoE scores based on a designed perceptual quality model that simulates the human visual system. SSIMPLUS® VOD Monitor Inspector makes use of a hybrid VQA framework to capture the QoE degradation and artifacts that are most distracting to human eyes. The source QoE score allows users to make an objective evaluation of the video quality they receive from content providers and to flag potential issues further up the chain.

Reference measurement

The reference-based perceptual fidelity measurement is a double-ended, full-reference VQA metric that takes both the source and output videos and provides QoE scores for the output video based on the designed SSIM measure with enhanced capabilities. It not only takes into account the SSIM, but also considers the spatial and temporal masking effects of the human visual system and a cognitive visual attention model. The perceptual fidelity scores can be used to evaluate the performance of adopted encoders or transcoders. The perceptual fidelity score is device-adaptive, allowing it to account for the varying characteristics of playback devices including, but not limited to, viewing distance, screen resolution, luminance, and size. Without these considerations, perceptual fidelity scores could vary drastically for the same video.

Absolute measurement

The perceptual fidelity score assumes pristine quality of the source video; however, most users have no control over the source video quality. Thus, it is problematic to use perceptual fidelity scores as a quality indicator for output videos. The uniform output QoE score is designed to measure the output video quality without assuming that the source video quality is perfect. It combines the source QoE and perceptual fidelity scores to describe the absolute perceptual quality of the test video. The uniform output QoE score is also device-adaptive, making it the best quality indicator for examining the final output of adopted encoders or transcoders.

Detailed quality maps

SSIMPLUS accomplishes a remarkable degree of accurate localization and provides an inspection of quality down to a pixel-level granularity. The algorithm also allows its users to perform precise and in-depth evaluation, comparison, design, and optimization of their video acquisition, processing, compression, storage, reproduction, delivery, and display systems.

Cross-content video quality measurement

The major drawback of conventionally used video quality indicators, peak signal-to-noise ratio (PSNR) or mean squared error (MSE), is their inability to account for content. For example, when looking at video compression using PSNR as the measure of quality, a higher bitrate does mean better quality when comparing the same content. However, when we compare simple content, such as talking heads on a news station, to complex content, such as a fast-moving football game, the same PSNR can correspond to drastically different perceptual quality. SSIMPLUS scores are content-agnostic and easier to use and understand, as they are set on a scale from 0 to 100.
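
Note: The PSNR limitation described above is easy to see numerically: the same mean squared error always maps to the same PSNR value, regardless of whether the content is simple or complex. The snippet below is a standard PSNR computation for 8-bit video (peak value 255), included only to illustrate that point; the sample frames are synthetic placeholders.

# Standard PSNR for 8-bit frames: identical MSE always yields identical PSNR,
# regardless of how complex the content is (the limitation described above).
import numpy as np

def psnr(reference, test, peak=255.0):
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

rng = np.random.default_rng(0)
flat_frame = np.full((64, 64), 128, dtype=np.uint8)          # "simple" content
busy_frame = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # "complex" content
noise = rng.integers(-3, 4, (64, 64))

# The same distortion applied to both frames gives (nearly) the same PSNR,
# even though its visibility to a viewer would differ greatly.
print(psnr(flat_frame, np.clip(flat_frame + noise, 0, 255).astype(np.uint8)))
print(psnr(busy_frame, np.clip(busy_frame + noise, 0, 255).astype(np.uint8)))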

Cross-resolution video quality measurement

Both conventional (PSNR, MSE) and state-of-the-art (VQM, MOVIE, SSIM) full-reference VQA methods require that reference and test videos have exactly the same resolution. However, this is not always the case, especially during encoding ladder design, where the source video has the highest resolution and is transcoded into multiple test videos with different resolutions and bitrates. The traditional way to calculate video quality in this case is to either up-sample the test video or down-sample the source video. However, from the quality evaluation perspective, neither of these options is correct, because both up-sampled and down-sampled videos introduce extra artifacts. Instead of signal matching, SSIMPLUS approaches this problem from the human visual system modeling perspective toward perceptual quality estimation. It has the capability to accurately and efficiently determine the device-adaptive QoE for the test video.

Cross-frame rate video quality measurement

Similar to the cross-resolution feature, SSIMPLUS’ cross-frame rate capability solves the problem of reference and test videos having different frame rates. It supports quality measurement of test videos at different frame rates against source videos of up to 120 fps. This feature not only allows users to compare the performance of encoders/transcoders with different configurations, but also enables users to design a brand-new encoding ladder that takes frame rate into consideration, in addition to resolution and bitrate.

Consumer and expert mode for all supported devices

Consumers and experts have drastically different behavior during video QoE inspection. Consumers look at the overall impression of video quality, but experts examine the details of quality degradation, such as loss of detail, blur, noise, false contour, blockiness and flickering. The viewing environment is also different as consumers watch videos on a TV in their living room and experts inspect videos by standing very close to a monitor. SSIMPLUS provides the consumer and expert modes for any of the supported viewing devices. For the same video and device, the expert scores are generally lower than the consumer scores.

Client-side QoE monitoring

The three scores (source QoE, perceptual fidelity, and uniform output QoE) mentioned above are primarily generated on the server side to assess the performance of the video delivery workflow. SSIMPLUS can also be deployed on the client side to collect the final video QoE of consumers. This is a new module with the capability to capture the artifacts introduced by video delivery throughout the network, such as initial waiting, playback stalling, packet/frame drops after error concealment, and quality switching.

Features

Understanding of the human visual system obtained through psychophysical studies performed by vision scientists has been evolving. The results of these studies are often empirical and cannot be directly employed in real-world applications. SSIMWAVE builds and employs advanced computational models, based on state-of-the-art research conducted in the field of vision science, to analyze the perceptual quality of video content. SSIMPLUS employs these advanced computational models to offer many features currently not provided by any other product in the industry:

Accurate & straightforward

SSIMPLUS automates real-time, accurate and easy-to-use video QoE evaluation for quality monitoring, tracking, assurance, auditing, and control. The solution provides straightforward predictions on what an average consumer would say about the quality of delivered video content on a scale of 0-100 and categorizes the quality as either bad, poor, fair, good, or excellent.

Incredible speed

Video QoE measurement is a computationally demanding task, as the models that perform reasonably well are considerably slow [4, 5, 9] and cannot be employed to perform real-time video QoE measurement. Our deep understanding of perceptual video quality assessment and the human visual system enables us to design and develop accurate and fast video QoE measurement algorithms. SSIMPLUS provides an optimized monitor that performs QoE measurement of a 4K-resolution video faster than real time.

Device adaptive

Viewing conditions and the properties of a display device have a major impact on the perceived video quality. Our computational models consider the display device and viewing conditions as input before predicting the perceptual quality of a video. SSIMPLUS provides support for the most commonly used devices. A full list of available devices can be found in Appendix B. Please note that we can always add your target device to the list, because SSIMPLUS is able to handle any display device.

Cross-resolution QoE measurement

Often, the reference and distorted video inputs are not of the same resolution. SSIMPLUS has the capability to accurately and efficiently determine device-adaptive QoE when a distorted video does not have the same resolution as the reference video.

Detailed quality maps

SSIMPLUS accomplishes a remarkable degree of accurate localization and empowers its users to perform precise and in-depth evaluation, comparison, design, and optimization of their video acquisition, processing, compression, storage, reproduction, delivery, and display systems.

Reliable, robust & easy-to-use

The product has gone through extensive, detailed testing, validation, and comparison using in-house and publicly available subject-rated video quality assessment databases. SSIMPLUS can be easily embedded into existing visual communication systems for the determination and evaluation of resource allocation strategies based on desired visual QoE and available network resources.

Appendix

Appendix A: Comparative results

The report compares the performance of SSIMPLUS to the following most popular and widely used video quality assessment measures in industry & academia:

Perceptual quality prediction accuracy

The ultimate goal of VQA algorithms is to predict the subjective quality evaluation of a video. Therefore, the most important test is to evaluate how well they predict subjective scores. Recently, a subjective study was conducted by JCT-VC members to quantify the rate-distortion gain of the HEVC codec against a similarly configured H.264/AVC codec [1]. The database is very relevant for the evaluation of video quality assessment algorithms developed for the media & entertainment industry because it contains videos distorted by the most commonly used video compression standard along with the recently developed H.265 codec. We use this independent and challenging subjective database to compare the performance of the VQA algorithms in predicting perceptual quality. The performance comparison results are provided in Table 1. For this purpose, we employ five evaluation metrics to assess the performance of the VQA measures: the Pearson linear correlation coefficient (PLCC), mean absolute error (MAE), root mean squared error (RMS), Spearman rank-order correlation coefficient (SRCC), and Kendall rank-order correlation coefficient (KRCC).

Among the above metrics, PLCC, MAE, and RMS are adopted to evaluate prediction accuracy [7], and SRCC and KRCC are employed to assess prediction monotonicity [7]. A better objective VQA measure should have higher PLCC, SRCC, and KRCC values and lower MAE and RMS values. All of these evaluation metrics are adopted from previous VQA studies [7, 6]. We can observe from the results provided in Table 1 that SSIMPLUS not only outperforms the popular VQA quality measures in terms of perceptual quality prediction accuracy but also in terms of computation time. Additionally, SSIMPLUS has many exclusive features (listed in Section 2) not offered by any other VQA measure.
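
Note: These five criteria are standard and can be reproduced with common statistics libraries. The sketch below computes PLCC, MAE, RMS, SRCC, and KRCC for a set of objective scores against subjective MOS values; the input arrays are placeholders. Published studies typically apply a nonlinear regression to the objective scores before computing PLCC, MAE, and RMS; that step is omitted here for brevity.

# Compute the five evaluation criteria used above for a set of objective
# scores against subjective MOS values (placeholder data).
import numpy as np
from scipy.stats import pearsonr, spearmanr, kendalltau

mos = np.array([82.0, 75.5, 63.0, 41.0, 90.5, 58.0])        # subjective scores
predicted = np.array([80.0, 78.0, 60.5, 45.0, 88.0, 55.0])  # objective scores

plcc, _ = pearsonr(predicted, mos)      # prediction accuracy (linearity)
mae = np.mean(np.abs(predicted - mos))  # mean absolute error
rms = np.sqrt(np.mean((predicted - mos) ** 2))
srcc, _ = spearmanr(predicted, mos)     # prediction monotonicity
krcc, _ = kendalltau(predicted, mos)

print(f"PLCC={plcc:.4f} MAE={mae:.4f} RMS={rms:.4f} SRCC={srcc:.4f} KRCC={krcc:.4f}")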

Table 1: Perceptual quality prediction performance comparison


Model PLCC MAE RMS SRCC KRCC Computation time (normalized)
PSNR 0.5408 1.1318 1.4768 0.5828 0.3987 1
MOVIE 0.7164 0.9711 1.2249 0.6897 0.4720 3440.27
VQM 0.8302 0.7771 0.9768 0.8360 0.6243 174.53
SSIM 0.8422 0.8102 0.9467 0.8344 0.6279 22.65
MS-SSIM 0.8527 0.7802 0.9174 0.8409 0.6350 48.49
SSIMPLUS 0.8678 0.7160 0.8724 0.8745 0.6737 7.83

Device-adaptation capability

The above test results assume a single fixed viewing device, which is a common assumption made by existing state-of-the-art VQA models. The capability of SSIMPLUS goes beyond the limitations of existing models. In particular, SSIMPLUS is designed to inherently consider viewing conditions such as the display device and viewing distance. Due to the unavailability of public subject-rated video quality assessment databases that contain video sequences watched under varying viewing conditions, we performed a subjective study in order to test the device-adaptive capability of the SSIMPLUS algorithm.

Subjective study

The main purpose of the study is to observe how the state-of-the-art VQA algorithms adapt to varying viewing conditions. A set of raw video sequences, at 1080p and 640p resolutions, was compressed at various distortion levels to obtain bitstreams compliant with the H.264 video compression standard. The decompressed distorted video sequences were rated by subjects under the viewing conditions listed in Tables 2 to 6 (device and viewing distance combinations ranging from an iPhone 5S at 10 inches to a 55” Samsung TV at 90 inches).

Performance comparison

The mean opinion scores (MOS) provided by subjects were used to compare the performance of SSIMPLUS with state-of-the-art VQA measures. The scatter plots of the VQA algorithms under comparison are shown in Figures 31 - 40. The superior performance of the SSIMPLUS algorithm compared to the other VQA algorithms is evident from the figures.

Comparisons between the VQA algorithms using PLCC, MAE, RMS, SRCC, and KRCC are provided in Tables 2 - 7. We can observe from the results that the SSIMPLUS algorithm outperforms the other state-of-the-art VQA algorithms. The main purpose of the subjective study (refer to section A) is to observe the adaptation behavior of the state-of-the-art VQA measures when deployed for predicting the perceptual quality of video content viewed under different viewing conditions. Table 6 compares the performance of the VQA measures when the TV viewing distance is reduced to 20 inches (referred to as expert mode). SSIMPLUS adapts to the changes in the viewing conditions better than the VQA algorithms under comparison. SSIMPLUS is also considerably faster than the other quality measures proposed to predict perceptual quality of video content and meets the requirements for real-time computation of perceptual video QoE and the detailed quality map.



Table 2: Performance comparison between PSNR, SSIM, MS-SSIM, VQM, PQR-Tek, DMOS-Tek, JND-VC, DMOS-VC and SSIMPLUS (device: iPhone 5S, viewing distance: 10 inches)

Model Resolution PLCC MAE RMS SRCC KRCC
PSNR 640p & 1080p 0.8974 6.2667 9.0641 0.9277 0.7633
SSIM 640p & 1080p 0.9498 4.1694 6.4252 0.9604 0.8249
MS-SSIM 640p & 1080p 0.9186 5.2874 8.1157 0.9438 0.7941
VQM 640p & 1080p 0.8939 6.2125 9.2098 0.9324 0.7736
MOVIE 640p & 1080p 0.9030 6.1677 8.8268 0.9318 0.7710
PQR-Tek 640p & 1080p 0.4605 13.853 18.234 0.4694 0.3323
DMOS-Tek 640p & 1080p 0.4645 13.864 18.191 0.4694 0.3323
JND-VC 640p & 1080p 0.9423 4.8685 6.8740 0.9448 0.7941
DMOS-VC 640p & 1080p 0.8729 6.9934 10.022 0.9116 0.7377
SSIMPLUS 640p & 1080p 0.9781 3.0251 4.2715 0.9529 0.8275

Table 3: Performance comparison between PSNR, SSIM, MS-SSIM, VQM, PQR-Tek, DMOS-Tek, JND-VC, DMOS-VC and SSIMPLUS (device: iPad Air, viewing distance: 16 inches)

Model Resolution PLCC MAE RMS SRCC KRCC
PSNR 640p & 1080p 0.9097 7.7111 9.4030 0.8616 0.6684
SSIM 640p & 1080p 0.9332 6.5561 8.1391 0.8860 0.7146
MS-SSIM 640p & 1080p 0.8986 8.3154 9.9370 0.8364 0.6427
VQM 640p & 1080p 0.8971 8.2887 10.003 0.8457 0.6479
MOVIE 640p & 1080p 0.9114 7.8819 9.3206 0.8709 0.6812
PQR-Tek 640p & 1080p 0.8656 8.2467 11.337 0.8730 0.7146
DMOS-Tek 640p & 1080p 0.8681 8.1190 11.239 0.8730 0.7146
JND-VC 640p & 1080p 0.9395 6.2170 7.7568 0.9193 0.7556
DMOS-VC 640p & 1080p 0.8420 9.4575 12.216 0.7812 0.5863
SSIMPLUS 640p & 1080p 0.9701 4.5263 5.4991 0.9131 0.7659

Table 4: Performance comparison between PSNR, SSIM, MS-SSIM, VQM, PQR-Tek, DMOS-Tek, JND-VC, DMOS-VC and SSIMPLUS (device: Lenovo W530 laptop, viewing distance: 20 inches)

Model Resolution PLCC MAE RMS SRCC KRCC
PSNR 640p & 1080p 0.9122 7.6379 9.6722 0.8751 0.6940
SSIM 640p & 1080p 0.9216 7.4738 9.1659 0.8876 0.7146
MS-SSIM 640p & 1080p 0.8883 8.5300 10.841 0.8388 0.6427
VQM 640p & 1080p 0.8981 8.5620 10.383 0.8560 0.6607
MOVIE 640p & 1080p 0.9175 7.5530 9.3934 0.8852 0.7017
PQR-Tek 640p & 1080p 0.9350 6.3503 8.3737 0.9304 0.7890
DMOS-Tek 640p & 1080p 0.9446 5.7194 7.7513 0.9304 0.7890
JND-VC 640p & 1080p 0.9312 7.0154 8.6077 0.9177 0.7428
DMOS-VC 640p & 1080p 0.8248 10.5753 13.349 0.7772 0.5786
SSIMPLUS 640p & 1080p 0.9698 4.7388 5.7593 0.9227 0.7813

Table 5: Performance Comparison between PSNR, SSIM, MS-SSIM, VQM, PQR-Tek, DMOS-Tek, JND-VC, DMOS-VC and SSIMPLUS (device: Samsung TV 55”, viewing distance: 90 inches)

Model Resolution PLCC MAE RMS SRCC KRCC
PSNR 640p & 1080p 0.9343 6.4934 8.2855 0.9034 0.7248
SSIM 640p & 1080p 0.9438 6.1363 7.6822 0.9140 0.7505
MS-SSIM 640p & 1080p 0.9126 7.3825 9.5003 0.8742 0.6786
VQM 640p & 1080p 0.9242 7.3915 8.8743 0.8914 0.6992
MOVIE 640p & 1080p 0.9345 6.6421 8.2690 0.9108 0.7377
PQR-Tek 640p & 1080p 0.9572 5.3377 6.7291 0.9435 0.8044
DMOS-Tek 640p & 1080p 0.9462 5.9821 7.5147 0.9435 0.8044
JND-VC 640p & 1080p 0.9512 5.8066 7.1664 0.9394 0.7864
DMOS-VC 640p & 1080p 0.8485 9.4566 12.295 0.8150 0.6171
SSIMPLUS 640p & 1080p 0.9856 3.2147 3.9271 0.9464 0.8172

Table 6: Performance Comparison between PSNR, SSIM, MS-SSIM, VQM, PQR-Tek, DMOS-Tek, JND-VC, DMOS-VC and SSIMPLUS (device: Samsung TV 55”, viewing distance: 20 inches)

Model Resolution PLCC MAE RMS SRCC KRCC
PSNR 640p & 1080p 0.9204 7.5788 9.7918 0.8891 0.7077
SSIM 640p & 1080p 0.9322 7.5625 9.0674 0.9113 0.7487
MS-SSIM 640p & 1080p 0.9019 8.9489 10.820 0.8709 0.6872
VQM 640p & 1080p 0.9185 8.3203 9.9051 0.8777 0.6821
MOVIE 640p & 1080p 0.9240 7.6532 9.5777 0.9000 0.7205
PQR-Tek 640p & 1080p 0.7472 14.312 16.647 0.7227 0.5205
DMOS-Tek 640p & 1080p 0.7632 13.773 16.185 0.7221 0.5196
JND-VC 640p & 1080p 0.9357 7.1399 8.8356 0.9300 0.7692
DMOS-VC 640p & 1080p 0.8481 10.3025 13.272 0.8174 0.6205
SSIMPLUS 640p & 1080p 0.9708 5.1424 6.0055 0.9311 0.7897

Table 7: Performance Comparison between PSNR, SSIM, MS-SSIM, VQM, PQR-Tek, DMOS-Tek, JND-VC, DMOS-VC and SSIMPLUS including all devices

Model PLCC MAE RMS SRCC KRCC Computation time (normalized)
PSNR 0.9062 7.4351 9.8191 0.8804 0.6886 1
SSIM 0.9253 6.9203 8.8069 0.9014 0.7246 22.65
MS-SSIM 0.8945 8.1969 10.384 0.8619 0.6605 48.49
VQM 0.8981 8.0671 10.214 0.8703 0.6711 174.53
MOVIE 0.9096 7.4761 9.6493 0.8892 0.7001 3440.27
JND-Tek 0.7615 11.372 15.052 0.6972 0.5241 54.22
DMOS-Tek 0.7568 11.478 15.180 0.6969 0.5236 54.22
JND-VC 0.9289 6.7096 8.5986 0.9206 0.7469 443.15
DMOS-VC 0.8365 9.9292 12.724 0.8090 0.6027 13.78
SSIMPLUS 0.9732 4.3192 5.3451 0.9349 0.7888 7.83

Appendix B: Supported devices

DEVICE OPTION DEVICE NAME MANUFACTURER
ssimpluscore SSIMPLUSCore SSIMPLUS
iphone6plus iPhone 6 Plus Apple
iphone6 iPhone 6 Apple
iphone5s iPhone 5S Apple
galaxys5 Galaxy S5 Samsung
galaxynote4 Galaxy Note 4 Samsung
onem8 One (M8) HTC
nexus6 Nexus 6 Motorola
ipadmini2 iPad Mini 2 Apple
ipadair2 iPad Air 2 Apple
galaxytabs Galaxy Tab S Samsung
nexus7 Nexus 7 Asus
surface Surface Microsoft
f8500 F8500 Samsung
vt60 VT60 Panasonic
oled65g6p OLED65G6P LG
h7150 H7150 Samsung
as600 AS600 Panasonic
ea9800 EA9800 LG
x9 X9 Sony
hu9000 HU9000 Samsung
27mp35hq 27MP35HQ LG
xl2420t XL2420T BenQ
b296cl B296CL Acer
vg27he VG27HE Asus
lt3053 LT3053 Lenovo
pa242w PA242W NEC
u2713hm U2713HM Dell
macbookpro Macbook Pro Apple
thinkpadw540 ThinkPad W540 Lenovo
aspires7 Aspire S7 Acer
xps15 XPS 15 Dell
economyclasshds1 Economy Class HD S1 Panasonic
economyclassfhds2 Economy Class FHD S2 Panasonic
economyclassfhds3 Economy Class FHD S3 Panasonic
businessclassfhdm1 Business Class FHD M1 Panasonic
businessclassfhdm2 Business Class FHD M2 Panasonic
businessclassuhdm3 Business Class UHD M3 Panasonic
firstclassfhdl1 First Class FHD L1 Panasonic
firstclassuhdl2 First Class UHD L2 Panasonic
tx-40cx680b TX-40CX680B Panasonic
up3216q UP3216Q Dell
imac275k iMac 27 5K Apple
imac2154k iMac 21.5 4K Apple
ue40ju6400 UE40JU6400 Samsung
50put6400 50PUT6400 Philips
kd-55x8509c KD-55X8509C Sony
ue55js9000t UE55JS9000T Samsung
oled55e7n OLED55E7N LG
galaxys6edge Galaxy S6 Edge Samsung
nexus9 Nexus 9 HTC
macbookair13inch Macbook Air 13inch Apple
smallscreen SmallScreen SSIMPLUS
oled65c7p OLED65C7P LG
oled55c7p OLED55C7P LG
iphonex iPhone X Apple
ipadpro iPad Pro Apple
oled55c8pua OLED55C8PUA LG
xbr-55a8f XBR-55A8F Sony
qn55q8fnbfxza QN55Q8FNBFXZA Samsung
ipad2017 iPad 2017 Apple

Appendix C: Supported video formats

SSIMPLUS supports most of the popular video compression formats and video containers. The details of supported video codecs and containers can be found below.

Media Container Formats:

AVI Audio Video Interleaved
AVR Audio Visual Research
AVS AVISynth
DV Digital Video
FLV Flash Video
GXF General eXchange Format
H261 Raw H.261 video
H263 Raw H.263 video
H264 Raw H.264 video
HEVC Raw HEVC video
IFF Interchange File Format
MVE Interplay MVE
IVR Internet Video Recording
LVF DVR
LXF VR native stream
M4V Raw MPEG-4 video
WebM Matroska/WebM
Mjpeg Raw MJPEG video
Mjpeg_2000 Raw MJPEG 2000 video
MOV Apple QuickTime
MP4 MPEG-4 Part 14
3GP 3GPP Multimedia File (GSM)
3G2 3GPP Multimedia File (CDMA)
MJ2 Motion JPEG 2000
MPEG-PS MPEG-2 Program Stream
MPEG-TS MPEG-2 Transport Stream
MKV Matroska Multimedia Container
MPV MPEG-2 Manzanita MP2TSME/MP2TSAE
MXF Material eXchange Format
YUV Raw Video
V210 Uncompressed 4:2:2 10-bit

HDR Formats:

HDR10 HDR10 Media Profile

Codecs:

H.261
H.263 / H.263-1996, H.263+ / H.263-1998 / H.263 version 2
H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10
HEVC (High Efficiency Video Coding)
JPEG 2000
JPEG-LS
MPEG-1 video
MPEG-2 video
MPEG-4 part 2
Apple ProRes
QuickTime Animation (RLE) video
Theora
On2 VP3
On2 VP5
On2 VP6
On2 VP6 (Flash version, with alpha channel)
On2 VP6 (Flash version)
On2 VP7
On2 VP8 (decoders: vp8, libvpx, vp8_cuvid)
Google VP9 (decoders: vp9, libvpx-vp9, vp9_cuvid)
Windows Media Video 7
Windows Media Video 9

No-Reference Scores:

Raw/Uncompressed Video
H.264
H.265
MPEG-2 video
MPEG-4
Apple ProRes

About SSIMWAVE

At SSIMWAVE, science meets art to make sure each video stream delivered makes its way to a happy subscriber. SSIMWAVE tunes video content quality to balance feasibility with the best experience possible. Its entrepreneurial group of scientists, software engineers, and business professionals are fascinated with perfecting digital video delivery. SSIMWAVE leverages its Primetime Emmy® Award winning technology to make the unknown, known; the subjective, objective; to reliably deliver video quality levels subscribers expect, and are willing to pay for. Viewers see and hear the results of SSIMWAVE’s work everywhere – while streaming the season finale on the big screen or watching the game-winning shot on their smartphones.

Copyright © 2020 SSIMWAVE Inc. All Rights Reserved.
SSIMWAVE, SSIMPLUS, and VOD Monitor Inspector are all trademarks of SSIMWAVE Inc.
Google Chrome is a trademark of Google Inc.

Dolby, Dolby Vision and the double-D symbol are all trademarks of Dolby Laboratories. Manufactured under license from Dolby Laboratories. Confidential unpublished works. Copyright ©2013-2015 Dolby Laboratories. All rights reserved.

All other trademarks are the property of their respective owners.

References

  1. V. Baroncini, J. R. Ohm, and G. J. Sullivan. Report on preliminary subjective testing of HEVC compression capability. In JCT-VC of ITU-T SG16 WP3 and ISO/IEC JTC1/SC29/WG11, San Jose, CA, February 2012.
  2. Video Clarity. Understanding MOS, JND and PSNR. http://videoclarity.com/wpunderstandingjnddmospsnr/, 2014. [Online; accessed September 07, 2014].
  3. Tektronix Inc. Understanding PQR, DMOS and PSNR Measurements. http://www.tek.com/document/fact-sheet/understanding-pqr-dmos-and-psnr-measurements/, 2014. [Online; accessed September 12, 2014].
  4. M. H. Pinson and S. Wolf. A new standardized method for objectively measuring video quality. IEEE Trans. Broadcasting, 50(3):312–322, 2004.
  5. K. Seshadrinathan and A. C. Bovik. Motion tuned spatio-temporal quality assessment of natural videos. IEEE Trans. Image Processing, 19(2):335–350, February 2010.
  6. H. R. Sheikh, M. Sabir, and A. C. Bovik. A statistical evaluation of recent full reference image quality assessment algorithms. IEEE Trans. Image Processing, 15(11):3440–3451, November 2006.
  7. VQEG. Final report from the video quality experts group on the validation of objective models of video quality assessment. Technical report, available at http://www.vqeg.org/, Apr 2000.
  8. Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Processing, 13(4):600–612, April 2004.
  9. Z. Wang, E. P. Simoncelli, and A. C. Bovik. Multi-scale structural similarity for image quality assessment. In Proc. IEEE Asilomar Conf. on Signals, Systems, and Computers, pages 1398–1402, Pacific Grove, CA, November 2003.

Additional Help

If you require assistance, email SSIMWAVE Support at support@ssimwave.com or call 519-489-2688.