
Introduction
SSIMPLUS® VOD Monitor Inspector is the only Viewer Experience
measurement software with the algorithm trusted by Hollywood.
It helps R&D groups, engineers, and architects who set up VOD
encoding and processing workflows, or who make purchasing
recommendations, determine the best possible configuration.
SSIMPLUS VOD Monitor Inspector accurately analyzes and
predicts the end viewer’s perception of video quality with the
adaptive SSIMPLUS Viewer Score and automated comparison. It
presents complex algorithms in a simple, actionable interface,
so you can know exactly what video quality the end viewer will
experience on each device and locate the finest details of
distortion. You can also use VOD Monitor Inspector to
understand how a specific encoder or transcoder behaves across
different content types, resolutions, and frame rates.
VOD Monitor Inspector determines the Source Quality and the
encoder/transcoder performance (how different the output is
from the source) and creates the overall Viewer Score of the
output.
SSIMPLUS VOD Monitor Inspector features
- Single-ended measurement
- Reference measurement
- Per-frame scores
- Detailed quality maps
- Cross-content video quality measurement
- Cross-resolution video quality measurement
- Cross-frame rate video quality measurement
- Regular and expert mode for all supported devices
Supported formats
Category | Supported |
---|---|
Media container formats | AV1, AVI, AVR, AVS, DV, FLV, GXF, H261, H263, H264, HEVC, HLS, IFF, IVR, LVF, LXF, M4V, MJ2, Mjpeg, Mjpeg_2000, MOV, MP4, MPEG-PS, MPEG-TS, MKV, MPV, MVE, MXF, VP9, V210, WebM, YUV, 3G2, 3GP, Y4M |
Video codecs | Apple ProRes, AV1, Google VP9, H.261, H.263 / H.263-1996, H.263+ / H.263-1998 / H.263 version 2, H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10, HEVC, JPEG 2000, JPEG-LS, MPEG-1 video, MPEG-2 video, MPEG-4 part 2, On2 VP3, On2 VP5, On2 VP6, On2 VP7, On2 VP8, QuickTime Animation video, Theora, Windows Media Video 7, Windows Media Video 8, Windows Media Video 9 |
HDR formats | HDR10, HLG, Dolby Vision® (supported in .mp4*) |
*Other formats for Dolby Vision® content will be available in the future.
Using VOD Monitor Inspector
Navigate VOD Monitor Inspector
Use the menu on the left to select the task that you want to
perform. Then, use the work panes to refine the task and view
information. As you work, use the toggles to expand or
collapse the menu and work panes.
About search options
You can refine your search results in various ways, depending on where you are in the UI:
- Filter by keyword in folders that are currently expanded.
- Sort by date or file name.
- Use the Advanced Search to search jobs by file name, template name, date submitted, and date processed. Clear your search filters when you no longer need them.
Managing jobs
About Reference files
When running a job, you can select a video to use as a
reference file; all of the other videos that you select for
the job are then compared against the reference file. When
analyzing videos in full-reference mode, you can compare
quality across resolutions. Reference files are usually the
highest-quality videos that you have. If you do not flag a
reference file, the videos run in single-ended mode.
Full-reference mode is available for Dolby Vision® content, as
well as for HDR10.
Add a video folder
For both on-premises and virtual machine deployments, you must add paths to the videos stored on your network before you can add your first job. For information on mounting videos, refer to the SSIMPLUS® VOD Monitor Inspector deployment guide.
- From the menu, click + Add Job.
- In the left work pane, click + Add video folder.
- In the Add video folder dialog box, type the path for the folder location.
- Click Add.
Add an adaptive bitrate URL
You can add adaptive bitrate (ABR) URLs that contain the manifest of all of the profiles for a video stream.
- From the menu, click + Add Job.
- In the left work pane, click + Add ABR URL.
- In the Add ABR URL dialog box, type the network path.
- Click Add.
Add a job
You can run jobs concurrently.
- From the menu, click + Add Job.
- In the left work pane, navigate to the video folder or ABR URL folder.
- To select video files, click the file names. You can select multiple files.
- To select ABR URLs, click the file names. You can select multiple URLs.
- To remove a file from the job, perform one of the
following actions:
- In the left work pane, click the file name to clear the selection.
- In the right work pane, beside the file name, click Remove.
- To set a reference file, in the right work pane, under Video, beside the file that you want to use as the reference file, click the Reference flag.
- In the right work pane, under Template, select a template or use the default template.
- Click Submit Job(s).
- In the right work pane, click the Jobs Page link or, from the menu, click Jobs. On the In Progress tab, you can observe VOD Monitor Inspector processing the files. When the job is complete, it appears on the Job List tab.
Cancel an in-progress job
- From the menu, click Jobs.
- Click the In Progress tab.
- In the work pane, select a job.
- Click Cancel Job.
Delete a job from the Job List
If you need to, you can delete jobs from the Job List. Deleting old jobs can make it easier to find the jobs that you need.
- From the menu, click Jobs.
- Click the Job List tab.
- In the work pane, select a job from the list.
- Click Delete.
Resubmit a job
If a job times out, you can resubmit it. You might also want to resubmit a job after making changes to the template, such as when adding more devices or changing the acceptable threshold.
- From the menu, click Jobs.
- Click the Job List tab.
- In the work pane, select a job from the list.
- Click Re-submit.
Related topic: Edit a template
Monitor jobs
- From the menu, click Jobs.
On the In Progress tab, the jobs currently in progress appear, along with the test video, the reference video, the template, the date submitted, the status, the time elapsed, and the time remaining. To view more details on the template, click the template icon. While the videos are aligning, the status appears as Aligning. After the videos are aligned, the analysis starts.
On the Job List tab, the completed jobs appear. This tab also shows the date and time that a job was submitted and how long it took to process. If a job fails, check the Note column for more information.
Sometimes jobs fail because the frame rates of the reference and test videos represent an unsupported cross-frame rate combination. In general terms, VOD Monitor Inspector supports the following cross-frame rate criteria:
- The frame rate of the reference video is the same as the frame rate of the test video.
- The frame rate of the reference video is two times the frame rate of the test video.
- The difference between the frame rates of the reference video and the test video is less than 0.01.
- The difference between the frame rate of the reference video and two times the frame rate of the test video is less than 0.01.
In addition to the general cross-frame rate rules above, VOD Monitor Inspector also supports a number of common cross-rate scenarios that arise when comparing Drop-Frame (DF) with Non-Drop-Frame (NDF) videos (see the sketch after this list), including:
- 23.98 vs 24
- 24 vs 23.98
- 29.97 vs 30
- 30 vs 29.97
- 59.94 vs 60
- 60 vs 59.94
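The following minimal sketch shows one way these rules could be checked. The function name, the rounding of the DF/NDF pairs, and the tolerance handling are our assumptions, not the product's actual validation logic.

```python
def cross_frame_rate_supported(ref_fps: float, test_fps: float) -> bool:
    """Hypothetical check mirroring the cross-frame rate rules above."""
    # Common Drop-Frame vs Non-Drop-Frame pairs are accepted explicitly.
    df_ndf_pairs = {
        (23.98, 24.0), (24.0, 23.98),
        (29.97, 30.0), (30.0, 29.97),
        (59.94, 60.0), (60.0, 59.94),
    }
    if (round(ref_fps, 2), round(test_fps, 2)) in df_ndf_pairs:
        return True
    # General rules: equal rates, or the reference at twice the test rate,
    # each within a 0.01 tolerance (exact equality is subsumed by this).
    return (abs(ref_fps - test_fps) < 0.01
            or abs(ref_fps - 2 * test_fps) < 0.01)
```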
Adjust the number of jobs that appear on the Jobs tabs
The number of jobs per page appears on the In Progress and Job List tabs below the table on the work pane. You can increase or decrease the number of jobs that appear in increments of 5, from a minimum of 5 to a maximum of 50. The default is 20. Jobs also appear on the System Admin page.
- On the In Progress or Job List tab, use the Jobs per page arrows at the bottom of the work pane to select the number of jobs that appear.
Managing results
When jobs are complete, you can view the analysis results on the Results page. The threshold (as defined in the template) appears on the SSIMPLUS Viewer Score (SVS) and Encoder Performance Score (EPS) tabs. See the Scoring Terminology section for an explanation of the scoring terms used throughout the Results page.
SSIMPLUS results also include a SSIMPLUS® Banding Score (SBS), which appears as the last numeric column of the results table. The SBS also has its own SSIMPLUS Banding Score (SBS) tab, where you can explore the plotted scores in detail and click each frame to see its corresponding Banding Map. Banding results are also available on the Quality Buckets tab, where you can see what percentage of each asset is above the Color Banding threshold recommended by the SSIMWAVE research team. An SBS for each frame automatically appears in your exported CSV file. For full descriptions of the scoring terms, consult the Scoring Terminology section.
View the results of a full-reference job
- From the menu, click Results.
- In the left work pane, under Choose a Video, click the job on the FULL-REFERENCE tab. In the right work pane, the Asset SVS and Asset EPS scores appear in the table, and the reference file (represented as a white dotted line) and encoding outputs appear on the SSIMPLUS Viewer Score (SVS) tab.
- To view the Frame Index and Score for a specific frame, hover your mouse over a colored line segment.
- To open the Reference vs Test comparison pane, click the frame segment.
- To view the similarity between the source file and the quality of the encoding, click the Encoder Performance Score (EPS) tab.
- To view video bitrates, click the Rate Quality Curve tab.
- To view the quality breakdown, click the Quality Buckets tab.
- To view the video’s statistics, click the Statistics tab. The gray box indicates the Viewer Score.
- To view the results for a different device, in the right work pane, in the table, select a device from the Select Device drop-down list. Click Apply.
View a full-reference job’s Comparison pane, Quality Maps and Banding Maps
By using the Comparison pane, you can see a visualization of
the video’s score. You can also use the Comparison pane to
compare the highest quality reference video to the test video.
In the Comparison pane, you can turn on Quality Maps to
see the test video's impairments compared to the reference
video in a grayscale map. Quality Maps are grayscale
presentations of pixel-level perceptual quality that show the
spatial distribution of impairments within a frame, providing
the reason behind the quality score. Dark pixels show the
impairments compared to the reference file. Areas that are
less important, such as the area around text, might have more
white pixels. Generally, the darker the image, the lower the
score.
In the Comparison pane, you can also turn on Banding Maps to
see the areas of color banding compared to the reference video
in a binary map. SSIMPLUS Banding Maps measure color banding
presence at a pixel level, as viewed by an “expert” on an OLED
TV, using a no-reference approach. The map is generated as one
of several steps in computing a SSIMPLUS Banding Score (SBS).
The banding map is a binary map with white pixels showing
banding presence; it does not reflect pixel-level variations
in banding impairment visibility.
- From the menu, click Results.
- From the left work pane, on the FULL-REFERENCE tab, select a job.
- In the right work pane, on the SSIMPLUS Viewer Score (SVS) tab, click a frame segment in the graph. The reference file appears on the left and the comparison file appears on the right. At the top of the screen, the title bar shows the file information, including the Viewer Score and the encoder/transcoder performance for both files.
- To perform a manual comparison of the two files, click and drag the blue slider.
- To view a SSIMPLUS Quality Map in the
comparison pane, perform the following actions:
- On the title bar, next to the file information, click the blue arrow.
- Click Quality Maps.
- To view a SSIMPLUS Banding Map in the
comparison pane, perform the following actions:
- On the title bar, next to the file information, click the blue arrow.
- Click Banding Maps.
- To change which files appear in the comparison pane,
perform the following actions:
- On the title bar, next to the file information, click the blue arrow.
- Select the files that you want to appear.
View the results of a single-ended job
- From the menu, click Results.
- In the left work pane, under Choose a Video, click the job on the SINGLE-ENDED tab. In the right work pane, the encoding output appears on the SSIMPLUS Viewer Score (SVS) tab.
- To view the Frame Index and Score for a specific frame, hover your mouse over a line segment.
- To view the quality breakdown, click the Quality Buckets tab.
- To view the video’s mean score, click the Statistics tab. The gray box indicates the Viewer Score.
Print job results
- From the menu, click Results.
- In the left work pane, select a job.
- In the right work pane, click Print page.
- In the print window, click Print.
- To create a PDF file, click Open PDF in Preview.
Add results
- From the menu, click Results.
- In the left work pane, select a job.
- Click + Add Results.
- Navigate to the file location.
- Click OK.
Remove results
- From the menu, click Results.
- In the left work pane, select a job.
- In the right work pane, in the table, click the checkbox next to a file.
- Click Remove Results.
Delete results
- From the menu, click Results.
- In the left work pane, select a job.
- In the right work pane, in the table, click the checkbox next to a file.
- Click Delete Results.
Download a CSV report
- From the menu, click Results.
- In the left work pane, select a job.
- In the right work pane, in the table, click the checkbox next to a file.
- Click Download CSV.
Import a CSV report
You can import a CSV report into VOD Monitor Inspector to create graphical depictions of the data.
- From the menu, click Results.
- Perform one of the following actions:
- In the right work pane, click Import CSV report. Browse to the file location. Click the file. Click Open.
- On your computer, navigate to the file. Click and drag it onto the right work pane.
Export a CSV report
You can create a spreadsheet with the data from a job.
- From the menu, click Results.
- In the right work pane, click Export CSV report.
- Browse to the location where you want to save the file.
- Save the file.
Adjust Quality Buckets
The Quality Buckets tab shows the file information in a bar graph, as opposed to the Viewer Score line graph. In this view, you can see how the video frames stack up in quality buckets. You can change the size of the buckets to increase or decrease their granularity.
- From the menu, click Results.
- In the left work pane, select either a full-reference or single-ended job.
- In the right work pane, click the Quality Buckets tab.
- To select the size of the buckets that you want to show, at the bottom of the screen, select 5, 10, or 20.
Tip: Hover your mouse over a bar to see the percentage per video in each bucket.
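For readers scripting against exported per-frame scores, the sketch below shows how such bucketing can be reproduced from a list of frame scores. The function and its bucket handling are illustrative assumptions, not the product's implementation.

```python
import numpy as np

def quality_buckets(frame_scores, bucket_size=20):
    """Percentage of frames per score bucket on the 0-100 scale."""
    edges = np.arange(0, 100 + bucket_size, bucket_size)
    counts, _ = np.histogram(frame_scores, bins=edges)
    return {f"{int(lo)}-{int(lo) + bucket_size}": 100.0 * c / len(frame_scores)
            for lo, c in zip(edges[:-1], counts)}

# Example: bucket per-frame SVS values taken from an exported CSV report.
print(quality_buckets([72, 81, 88, 90, 45, 79], bucket_size=20))
```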
Managing templates
Templates are applied to jobs and define how the job analyzes the videos.
As part of creating a template, you can set the Temporal Alignment options. Sometimes, encoding tools add or delete a frame on the output file, throwing off the alignment between the source and the encoded file; Temporal Alignment corrects this disparity. When comparing multiple videos with different lengths, resolutions, and frame rates, VOD Monitor Inspector starts the analysis on the same frame in each video, ensuring an accurate score. Temporal alignment is on by default.
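To illustrate the general idea of temporal alignment (not SSIMWAVE's actual algorithm), the toy sketch below searches for the frame offset that best aligns two per-frame signatures, such as mean frame luminance. The signature choice and the brute-force search are our assumptions.

```python
import numpy as np

def best_temporal_offset(ref_sig, test_sig, max_shift=10):
    # Slide the test signature over the reference signature and keep the
    # shift with the smallest mean absolute difference.
    ref = np.asarray(ref_sig, dtype=float)
    test = np.asarray(test_sig, dtype=float)
    best_shift, best_err = 0, np.inf
    for s in range(-max_shift, max_shift + 1):
        a = ref[max(0, s):]
        b = test[max(0, -s):]
        n = min(len(a), len(b))
        if n == 0:
            continue
        err = np.mean(np.abs(a[:n] - b[:n]))
        if err < best_err:
            best_shift, best_err = s, err
    return best_shift  # test frame i corresponds to reference frame i + shift
```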
Create a template
- From the menu, select Settings > Templates.
- In the left work pane, click + New Template.
- In the right work pane, type a template name.
- Under Viewer Mode, select either the Regular or Expert checkbox.
- From the Devices table, select the checkboxes for the devices that you want to include. To select all devices, select the Device checkbox.
- Perform one or more of the following actions:
- To add a device that is not included in the main Device list, click + Add Device. In the list, select the applicable devices. Click Apply.
- To remove a device from the list, select the device. Click X Remove Devices.
- In the right work pane, click Advanced Settings.
- Define the Quality Threshold. The baseline for excellence is 80.
- Under Settings, select the options for the template, including Peak Signal-to-Noise Ratio (PSNR), Mean Squared Error (MSE), and High Dynamic Range (HDR) for elementary streams and HLG. The PSNR and MSE do not display in the UI but are available in the downloaded CSV file.
- Under Temporal Alignment, select the options for the template, including Auto Temporal Alignment, Start Frame Reference, Start Frame Test, Frames to Process.
- Under Region of Interest (ROI), if you have videos with borders that you want to exclude, define the Reference and Test x and y axis and the width and height. The job analysis provides a score for that ROI only. You must know exactly how big the borders are at the top, bottom, and sides so that the reference and source match for a full-reference comparison.
- To perform a manual analysis of the source file of either full-reference or single-reference raw, uncompressed videos, under Raw Video Parameters, define the Reference and Test width, height, frame rate (f), color (c), and bit depth (b).
- For Multi Program Transport Stream, or MPTS, under Program ID (PID), define the PID for both the Reference and Test videos.
- To include a comment in the results, perform the following
actions:
- Select Remark.
- Type the comment in the field.
- If required (especially if you are using Watch Folders), to define where you want to store the videos after processing them, perform the following actions:
- Under Output Folders, click Error Folder.
- Select the location where you want to store the files. VOD Monitor Inspector moves the video files to the selected folders. A yellow triangle means the folder does not have write permissions; make sure to select a folder with write permissions.
- In the Acceptable Quality Threshold field, set the target percentage.
- Click Apply.
- Repeat steps a to c for the Acceptable Quality Folder and Unacceptable Quality Folder. The Acceptable Quality Folder is only available on-premises (not in the Cloud version).
- Click Save.
Edit a template
- From the menu, select Settings > Templates.
- From the left work pane, select a template.
- Click Edit.
- Make your changes in the right work pane.
- Perform one of the following actions:
- To save the template with the same name, click Save.
- To save the template with a new name, click Save As. Type the new name for the template. Click Save.
Export a template as XML or JSON
You can export your templates if you want to create a backup version.
- From the menu, select Settings > Templates.
- Select a template.
- In the left work pane, click XML or JSON.
- Click OK.
Import a template
If you exported your template files, you can import them back into the system.
- From the menu, select Settings > Templates.
- In the left work pane, click Import.
- Navigate to the file.
- Click Open.
Delete a template
- From the menu, select Settings > Templates.
- From the left work pane, select a template.
- Click Delete.
- Click OK.
Activate the SSIMPLUS® Banding Score
- Choose a new template.
- In the right work pane, under Advanced Settings, select Color Banding Detection.
- Define the Color Banding Threshold.
This threshold is preset to 20 because SSIMWAVE's research has shown that SSIMPLUS® Banding Scores below that value are imperceptible. You can learn more about the SSIMPLUS® Banding Score and its scale in the Scoring Terminology section.
Managing comparisons
VOD Monitor Inspector uses the Comparison page to compare different setups, including encoders, codecs, encoder configurations, and content types. The Rate SSIMPLUS Curve shows the video Quality Score.
For example, if you have a source video that was encoded using two (or more) different encoders into a full ladder (with different resolutions and bitrates), you can compare the quality scores visually to make informed decisions.
You can also use the Comparison page to calculate SSIMPLUS gains. The calculation uses the Bjontegaard metric for curve fitting, as sketched after the procedure below.
- From the menu, click Comparison.
- In the left work pane, select the source video.
- In the right work pane, type a category name for a codec.
- From the Device list, select the target viewing device.
- From the Results list, under the category, select the videos that belong to the category.
- Click Apply.
- For each codec, perform the following actions:
- Click + Add Category.
- Repeat steps 3 to 6.
- For a quantitative comparison, from the drop-down Compare menus at the bottom of right work pane, select two categories.
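For context, the classic Bjontegaard (BD-rate) computation fits each rate-quality curve with a cubic polynomial of log-bitrate versus quality and averages the gap between the two fits over their common quality range. The sketch below is a standard-form illustration, not VOD Monitor Inspector's exact curve-fitting code; each ladder needs at least four points for the cubic fit.

```python
import numpy as np

def bd_rate_percent(rates_a, scores_a, rates_b, scores_b):
    """Average bitrate difference (%) of curve B versus curve A over the
    overlapping quality range. Negative values mean B needs fewer bits."""
    p_a = np.polyfit(scores_a, np.log(rates_a), 3)
    p_b = np.polyfit(scores_b, np.log(rates_b), 3)
    lo = max(min(scores_a), min(scores_b))
    hi = min(max(scores_a), max(scores_b))
    # Average each fitted log-bitrate curve over the common interval.
    int_a, int_b = np.polyint(p_a), np.polyint(p_b)
    avg_a = (np.polyval(int_a, hi) - np.polyval(int_a, lo)) / (hi - lo)
    avg_b = (np.polyval(int_b, hi) - np.polyval(int_b, lo)) / (hi - lo)
    return (np.exp(avg_b - avg_a) - 1.0) * 100.0
```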
About Player Test
Using Player Test, you can test how a standard, open-source streaming video player responds to network conditions during video playout of your HLS streams. Use the player to see how network conditions affect the Viewer Score, bitrate, variant switches, and buffer size. Player Test enables a three-point monitoring approach in SSIMPLUS VOD Monitor Inspector.
Test video playout with Player Test
- From the menu, click Player Test.
- In the left work pane, select a video from the Full-Reference tab.
- In the right work pane, select a Viewer Device from the list. The devices in the list are populated from the template used to run the initial job for the video. You can switch devices during playout.
- On the video, click Play.
- Select one of the following options:
- To display the immediate score of the video played thus far, click Current.
- To display the average score for the entire viewing session, click Overall.
- The SSIMPLUS Viewer Score, Playout Bitrate, Download Rate, Buffer Length, and Variant are displayed to the right of the video and graphically on the tabs below the video. Select a tab to view the graph.
- To view information for a specific frame, hover your mouse over a line segment.
- To view the video in full screen, click the full-screen icon. The graph information overlays the video playout.
- In full-screen mode, mouse over the video to change the graph transparency and switch between tabs.
About Watch Folders
Add a watch folder to automatically run jobs when files are added. Watch folders detect when you add a new file to the identified folders.
As with manually run jobs, VOD Monitor Inspector requires reference and test video folders with corresponding reference and test videos. Create a new template or edit an existing template and identify the output folder for the videos. After VOD Monitor Inspector runs the jobs for the videos in the Watch Folder, the videos are removed from the Watch Folder.
While VOD Monitor Inspector processes the jobs, they appear on the Jobs page, first on the In Progress tab and then in the Job List, like manually run jobs. VOD Monitor Inspector continues to run jobs until the Watch Folder is empty, scanning the folders every 30 seconds for new files. To automate more jobs, add more video files to the Watch Folder.
You can add multiple Watch Folders.
After VOD Monitor Inspector runs the jobs, you can check the Results folder. If you encounter errors, confirm that you have the appropriate write and move permissions, both of which are set during deployment. Contact SSIMWAVE support for further assistance at support@ssimwave.com.
Add a Watch Folder
- Click Settings > Watch Folder.
- In the work pane, click + Add watch folder.
- In the left pane, to select a reference video folder from the existing folder directory, perform the following actions:
- Click the folder icon to expand the directory.
- Click the folder name to add it to the watch folder.
- Repeat this step for the test video folder.
- In the right work pane, click the reference file icon to identify the reference file. If you do not select a reference file, the job will run as a Single-Ended job.
- Select a template from the Template drop-down list.
- Click Add Watch Folder.
Manage system settings
Set the maximum number of concurrent jobs
- From the menu, click Settings > System Admin.
- In the left work pane, select the node.
- In the right work pane, under Settings, set the number of Concurrent Jobs.
Set the job timeout threshold
The job timeout threshold is not the duration defined to run a job, but the time that can elapse before a job starts. If the job does not start within the threshold, the job fails.
- From the menu, click Settings > System Admin.
- In the right work pane, under Settings, set the Job Timeout field.
Monitor jobs in cluster environments
If you are working in a cluster environment, you can use the System Admin page to monitor jobs on different nodes.
- From the menu, click Settings > System Admin.
- In the left work pane, select the node.
In the right work pane, the jobs appear. Error messages display at the top of the screen.
Support Package
VOD Monitor Inspector can fetch various information from your running system and collect it into a compressed zip file called a support package. Typically, you generate a support package to attach it to a problem ticket created in SSIMWAVE's VOD Monitor Support system. Support packages are comprehensive enough, however, that you may also find them useful for performing data backups and system migrations.
Currently, the following items can be included in a support package:
- Log files (active and archived)
- Configuration files
- Database export (schema and data)
- Runtime configuration
- System hardware information
Scoring Terminology
Within VOD Monitor Inspector, you have the following SSIMPLUS measurements:
- a SSIMPLUS Viewer Score (SVS),
- a SSIMPLUS Encoder Performance Score (EPS),
- an Asset SSIMPLUS Viewer Score (SVS) and Asset Encoder Performance Score (EPS),
- and a SSIMPLUS Banding Score (SBS).
SSIMPLUS Viewer Score (SVS)
A SSIMPLUS Viewer Score (SVS) is a measurement of the overall quality experienced by the viewer, as it considers both the quality of the original asset and the impact of the encoding process as perceived by the human visual system. It is the single, authoritative value by which all video can be measured.
SSIMPLUS Encoder Performance Score (EPS)
A SSIMPLUS Encoder Performance Score (EPS) assumes that the source asset is pristine and focuses solely on measuring the degradation introduced by encoder impairments. EPSs are best suited to judging the comparative quality of various encoders and/or encoder settings.
Both SVS and EPS scores take into account the human visual system and the device on which the asset is being viewed.
Asset SVS and EPS
In addition to frame-level measurements, SSIMPLUS VOD Monitor Inspector provides asset-level scores, which employ additional intelligence beyond the simple mathematical average of frame scores to arrive at a single score for an entire asset. An asset SSIMPLUS Viewer Score (Asset SVS), for example, provides a value that can be used to judge the overall quality of an entire asset, taking into account the lingering effect of poor quality frames, as perceived by the human visual system. The asset SVS is particularly useful when your goal is to perform source selection on a population of assets by first removing all those with unacceptably low overall quality. Similarly, an asset Encoder Performance Score (Asset EPS) can provide a single judgement of the encoder’s performance, across an entire asset.
SSIMPLUS Banding Score (SBS)
A SSIMPLUS Banding Score (SBS) is a measurement of the amount of color banding that is present within the asset. Banding, otherwise known as color contouring, is a common artifact of the video compression process and is due to the quantization introduced by the video codec. It is most obvious in areas where the content calls for the rendering of smooth gradient regions, such as the sky, water, or ground. In these regions, pixel values that are meant to be represented uniquely, so as to provide a gentle transition from one shade or color to another, are instead aggregated into a much smaller set of values, creating discrete bands of colors. The SSIMPLUS algorithm analyzes each frame of your asset for the presence of banding and records a score indicating the severity of the banding, according to the following scale:
- [ 0- 20] Imperceptible
- [21- 40] Perceptible but not annoying
- [41- 60] Slightly annoying
- [61- 80] Annoying
- [81-100] Very annoying
The following frame is an example of an SBS of 0, meaning that there
is no discernible banding present:
This next frame is an example of an SBS of 68, which would qualify as
annoying. Notice the discrete bands of colors both in the sky and the
asphalt track.
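Because the scale above is fixed, mapping a score to its severity label is a direct range check, as in this small helper (the function name is ours):

```python
def banding_category(sbs: float) -> str:
    """Map a SSIMPLUS Banding Score (0-100) to the severity label
    defined by the scale above."""
    if sbs <= 20:
        return "Imperceptible"
    if sbs <= 40:
        return "Perceptible but not annoying"
    if sbs <= 60:
        return "Slightly annoying"
    if sbs <= 80:
        return "Annoying"
    return "Very annoying"
```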
SSIMPLUS Algorithm
Introduction
SSIMPLUS stands for the “Structural Similarity Plus index” for video quality assessment (VQA). It was developed based on the Structural Similarity (SSIM) index, which has been widely used in both academia and industry for perceptual quality assessment. The SSIM algorithm was recognized by the Television Academy and awarded the 67th Primetime Engineering Emmy® Award in 2015. SSIMPLUS improves upon SSIM in terms of video QoE accuracy, speed, applicability, and flexibility. It has been the driving force behind SSIMWAVE's development of innovative software solutions, which are revolutionizing the delivery of video from end to end, allowing users to measure, maximize, and monetize their efforts.
The algorithm was developed with the end user in mind and the objective of preserving creative intent. It enables users to access video QoE scores at any point within the video delivery chain in order to pinpoint the cause of QoE degradation.
SSIMPLUS possesses unique features that create the most accurate and most complete QoE scores, including cross-content, cross-resolution, and cross-frame rate video quality assessment. It produces three quality scores so users can gain a full understanding of any modification that causes a change in overall video QoE. These scores include the source QoE score, a device-adaptive perceptual fidelity score, and a device-adaptive uniform output QoE score, as shown in Fig. 2. The source score is a quality measurement for input/source video of an encoder/transcoder. The perceptual fidelity score is a relative quality measurement for output/test video compared with input/source video. The uniform output score is an absolute quality description for output/test video that predicts the human opinion towards its quality.
Fig. 2 A framework of SSIMPLUS
Single ended measurement
The source video quality assessment is a single-ended, no-reference VQA metric that takes the source video of an encoder/transcoder as the only input and provides QoE scores based on a designed perceptual quality model that simulates the human visual system. SSIMPLUS® VOD Monitor Inspector makes use of a hybrid VQA framework to capture the QoE degradation and artifacts that are most distracting to human eyes. The source QoE score allows users to make an objective evaluation of the video quality they received from content providers and to flag potential issues further up the chain.
Reference measurement
The reference-based perceptual fidelity measurement is a double-ended, full-reference VQA metric that takes both the source and output videos and provides QoE scores for the output video based on the designed SSIM measure with enhanced capabilities. It not only takes into account the SSIM, but also considers the spatial and temporal masking effects of the human visual system and a cognitive visual attention model. The perceptual fidelity scores can be used to evaluate the performance of adopted encoders or transcoders. The perceptual fidelity score is device-adaptive, allowing it to account for the varying characteristics of playback devices including, but not limited to, viewing distance, screen resolution, luminance, and size. Without these considerations, perceptual fidelity scores could vary drastically for the same video.
Absolute measurement
The perceptual fidelity score assumes pristine quality of the source video; however, most users have no control over the source video quality. Thus, it is problematic to use perceptual fidelity scores alone as a quality indicator for output videos. The uniform output QoE score is designed to measure the output video quality without assuming that the source video quality is perfect. It combines the source QoE and perceptual fidelity scores to describe the absolute perceptual quality of the test video. The uniform output QoE score is also device-adaptive, making it the best quality indicator for examining the final output of adopted encoders or transcoders.
Detailed quality maps
SSIMPLUS accomplishes a remarkable degree of accurate localization and provides an inspection of quality down to a pixel-level granularity. The algorithm also allows its users to perform precise and in-depth evaluation, comparison, design, and optimization of their video acquisition, processing, compression, storage, reproduction, delivery, and display systems.
Cross-content video quality measurement
The major drawback of the conventionally used video quality indicators, peak signal-to-noise ratio (PSNR) and mean squared error (MSE), is their inability to account for content. For example, when evaluating video compression using PSNR as the measure of quality, a higher bitrate does correspond to better quality when comparing the same content. However, when we compare simple content, such as talking heads on a news station, to complex content, such as a fast-moving football game, the same PSNR can correspond to drastically different perceptual quality. SSIMPLUS scores are content-agnostic and easier to use and understand, as they are set on a scale from 0-100.
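For context, the conventional PSNR the paragraph refers to is a pure signal-error measure. This minimal sketch makes the limitation concrete: two frames with the same numeric error always get the same PSNR, regardless of how visible that error is in the content.

```python
import numpy as np

def psnr(ref: np.ndarray, test: np.ndarray, max_val: float = 255.0) -> float:
    """Peak signal-to-noise ratio in dB between two same-sized frames."""
    mse = np.mean((ref.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames
    return 10.0 * np.log10(max_val ** 2 / mse)
```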
Cross-resolution video quality measurement
Both conventional (PSNR, MSE) and state-of-the-art (VQM, MOVIE, SSIM) full-reference VQA methods require that reference and test videos have exactly the same resolutions. However, this is not always the case, especially during the encoding ladder design. The source video has the highest resolution and has been transcoded to multiple test videos with different resolutions and bitrates. The traditional way to calculate video quality in this case is by either up-sampling the test video or down-sampling the source video. However, from the quality evaluation perspective, neither of these options are correct because both up-sampled and down-sampled videos would introduce extra artifacts. Instead of signal matching, SSIMPLUS solves this problem from the human visual system modeling perspective towards perceptual quality estimation. It has the capability to accurately and efficiently determine the device-adaptive QoE for test video.
Cross-frame rate video quality measurement
Similar to the cross-resolution feature, SSIMPLUS’ cross-frame rate capability solves the problem of reference and test videos having different frame rates. It supports quality measurement of test videos at different frame rates, compared against source videos of up to 120 fps. This feature not only allows users to compare the performance of encoders/transcoders with different configurations, but also enables users to design a brand new encoding ladder that takes frame rate into consideration, in addition to resolution and bitrate.
Consumer and expert mode for all supported devices
Consumers and experts have drastically different behavior during video QoE inspection. Consumers look at the overall impression of video quality, but experts examine the details of quality degradation, such as loss of detail, blur, noise, false contours, blockiness, and flickering. The viewing environment is also different: consumers watch videos on a TV in their living room, while experts inspect videos standing very close to a monitor. SSIMPLUS provides consumer and expert modes for any of the supported viewing devices. For the same video and device, the expert scores are generally lower than the consumer scores.
Client-side QoE monitoring
The three scores (source QoE, perceptual fidelity, and uniform output QoE) mentioned above are primarily generated on the server side to assess the performance of the video delivery workflow. SSIMPLUS can also be deployed on the client side to collect the final video QoE of consumers. This is a new module with the capability to capture the artifacts introduced by video delivery throughout the network, such as initial waiting, playback stalling, packet/frame drops after error concealment, and quality switching.
Features
Understanding of the human visual system obtained through psychophysical studies performed by vision scientists has been evolving. The results of these studies are often empirical and cannot be directly employed in real-world applications. SSIMWAVE builds and employs advanced computational models, based on state-of-the-art research conducted in the field of vision science, to analyze the perceptual quality of video content. SSIMPLUS employs these advanced computational models to offer many features currently not provided by any other product in the industry:
Accurate & straightforward
SSIMPLUS automates real-time, accurate, and easy-to-use video QoE evaluation for quality monitoring, tracking, assurance, auditing, and control. The solution provides straightforward predictions of what an average consumer would say about the quality of delivered video content on a scale of 0-100 and categorizes the quality as bad, poor, fair, good, or excellent.
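As a sketch of that categorization, the helper below assumes evenly spaced 20-point bands; the guide states only that 80 is the baseline for excellence, so the lower boundaries are illustrative assumptions.

```python
def quality_category(score: float) -> str:
    """Map a 0-100 SSIMPLUS score to a quality category (assumed bands)."""
    if score >= 80:
        return "excellent"  # matches the stated baseline for excellence
    if score >= 60:
        return "good"
    if score >= 40:
        return "fair"
    if score >= 20:
        return "poor"
    return "bad"
```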
Incredible speed
Video QoE measurement is a computationally demanding task: the models that perform reasonably well are considerably slow [4, 5, 9] and cannot be employed to perform real-time video QoE measurement. Our deep understanding of perceptual video quality assessment and the human visual system enables us to design and develop accurate and fast video QoE measurement algorithms. SSIMPLUS provides an optimized monitor that performs QoE measurement of a 4K-resolution video faster than real time.
Device adaptive
Viewing conditions and the properties of a display device have a major impact on the perceived video quality. Our computational models consider the display device and viewing conditions as input before predicting the perceptual quality of a video. SSIMPLUS provides support for the most commonly used devices. A full list of available devices can be found in Appendix B. Please note that we can always add your target device into the list, because SSIMPLUS is able to handle any display device.
Cross-resolution QoE measurement
Often, the reference and distorted video inputs are not of the same resolution. SSIMPLUS has the capability to accurately and efficiently determine monitor device-adaptive QoE when a distorted video does not have the same resolution as that of the reference video.
Detailed quality maps
SSIMPLUS accomplishes a remarkable degree of accurate localization and empowers its users to perform precise and in-depth evaluation, comparison, design, and optimization of their video acquisition, processing, compression, storage, reproduction, delivery, and display systems.
Reliable, robust & easy-to-use
The product has gone through extensive, detailed testing, validation, and comparison using in-house and publicly available subject-rated video quality assessment databases. SSIMPLUS can be easily embedded into existing visual communication systems for the determination and evaluation of resource allocation strategies based on desired visual QoE and available network resources.
Appendix
Appendix A: Comparative results
The report compares the performance of SSIMPLUS to the following popular and widely used video quality assessment measures in industry and academia:
- Peak Signal-to-Noise Ratio (PSNR)
- Structural Similarity (SSIM) Index [8]
- Multi-Scale Structural Similarity (MS-SSIM) Index [9]
- Video Quality Metric (VQM) [4]
- MOtion-based Video Integrity Evaluation (MOVIE) Index [5]
- PQR measure by Tektronix (PQR-Tek) [3]
- DMOS measure by Tektronix (DMOS-Tek) [3]
- JND measure by Video Clarity (JND-VC) [2]
- DMOS measure by Video Clarity (DMOS-VC) [2]
Perceptual quality prediction accuracy
The ultimate goal of VQA algorithms is to predict the subjective quality evaluation of a video. Therefore, the most important test is to evaluate how well they predict subjective scores. Recently, a subjective study was conducted by JCT-VC members to quantify the rate-distortion gain of the HEVC codec against a similarly configured H.264/AVC codec [1]. The database is very relevant for the evaluation of video quality assessment algorithms developed for the media & entertainment industry because it contains videos distorted by the most commonly used video compression standard along with the recently developed H.265 codec. We use this independent and challenging subjective database to compare the performance of the VQA algorithms in predicting perceptual quality. The performance comparison results are provided in Table 1. For this purpose, we employ five evaluation metrics to assess the performance of VQA measures:
- Pearson linear correlation coefficient (PLCC) after a nonlinear mapping between the subjective and objective scores. For the i-th image in an image database of size N, given its subjective score $o_i$ (mean opinion score (MOS) or difference of MOS (DMOS) between reference and distorted images) and its raw objective score $r_i$, we first apply a nonlinear function to $r_i$ given by [6]

  $$q_i = a_1 \left( \frac{1}{2} - \frac{1}{1 + e^{a_2 (r_i - a_3)}} \right) + a_4 r_i + a_5,$$

  where $a_1$ to $a_5$ are model parameters found numerically using a nonlinear regression process to maximize the correlations between subjective and objective scores. The PLCC value can then be computed as

  $$\mathrm{PLCC} = \frac{\sum_i (q_i - \bar{q})(o_i - \bar{o})}{\sqrt{\sum_i (q_i - \bar{q})^2} \sqrt{\sum_i (o_i - \bar{o})^2}}.$$

- Mean absolute error (MAE) is calculated using the converted objective scores after the nonlinear mapping described above:

  $$\mathrm{MAE} = \frac{1}{N} \sum_i |q_i - o_i|.$$

- Root mean-squared (RMS) error is computed similarly as

  $$\mathrm{RMS} = \sqrt{\frac{1}{N} \sum_i (q_i - o_i)^2}.$$

- Spearman’s rank correlation coefficient (SRCC) is defined as

  $$\mathrm{SRCC} = 1 - \frac{6 \sum_i d_i^2}{N (N^2 - 1)},$$

  where $d_i$ is the difference between the i-th image’s ranks in subjective and objective evaluations. SRCC is a nonparametric rank-based correlation metric, independent of any monotonic nonlinear mapping between subjective and objective scores.

- Kendall’s rank correlation coefficient (KRCC) is another non-parametric rank correlation metric given by

  $$\mathrm{KRCC} = \frac{N_c - N_d}{\frac{1}{2} N (N - 1)},$$

  where $N_c$ and $N_d$ are the numbers of concordant and discordant pairs in the dataset, respectively.
Among the above metrics, PLCC, MAE, and RMS are adopted to evaluate prediction accuracy [7], and SRCC and KRCC are employed to assess prediction monotonicity [7]. A better objective VQA measure should have higher PLCC, SRCC, and KRCC values, and lower MAE and RMS values. All of these evaluation metrics are adopted from previous VQA studies [7, 6]. We can observe from the results provided in Table 1 that SSIMPLUS not only outperforms the popular VQA quality measures in terms of perceptual quality prediction accuracy but also in terms of computation time. Additionally, SSIMPLUS has many exclusive features (listed in Section 2) not offered by any other VQA measure.
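The evaluation protocol above can be reproduced in a few lines; this sketch uses SciPy for the nonlinear regression and rank correlations, with initial parameter guesses of our own choosing.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import kendalltau, pearsonr, spearmanr

def nonlinear_map(r, a1, a2, a3, a4, a5):
    # Five-parameter logistic mapping from [6].
    return a1 * (0.5 - 1.0 / (1.0 + np.exp(a2 * (r - a3)))) + a4 * r + a5

def vqa_performance(objective, subjective):
    """Compute PLCC, MAE, RMS, SRCC, and KRCC as defined above."""
    r, o = np.asarray(objective, float), np.asarray(subjective, float)
    p0 = [np.max(o) - np.min(o), 1.0, np.mean(r), 0.0, np.mean(o)]
    params, _ = curve_fit(nonlinear_map, r, o, p0=p0, maxfev=20000)
    q = nonlinear_map(r, *params)  # converted objective scores
    return {
        "PLCC": pearsonr(q, o)[0],
        "MAE": float(np.mean(np.abs(q - o))),
        "RMS": float(np.sqrt(np.mean((q - o) ** 2))),
        "SRCC": spearmanr(r, o)[0],  # rank metrics need no mapping
        "KRCC": kendalltau(r, o)[0],
    }
```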
Table 1: Perceptual quality prediction performance comparison
Measure | PLCC | MAE | RMS | SRCC | KRCC | Computation time (normalized) |
---|---|---|---|---|---|---|
PSNR | 0.5408 | 1.1318 | 1.4768 | 0.5828 | 0.3987 | 1 |
MOVIE | 0.7164 | 0.9711 | 1.2249 | 0.6897 | 0.4720 | 3440.27 |
VQM | 0.8302 | 0.7771 | 0.9768 | 0.8360 | 0.6243 | 174.53 |
SSIM | 0.8422 | 0.8102 | 0.9467 | 0.8344 | 0.6279 | 22.65 |
MS-SSIM | 0.8527 | 0.7802 | 0.9174 | 0.8409 | 0.6350 | 48.49 |
SSIMPLUS | 0.8678 | 0.7160 | 0.8724 | 0.8745 | 0.6737 | 7.83 |
Device-adaptation capability
The above test results assume a single fixed viewing device, which is a common assumption made by existing state-of-the-art VQA models. The capability of SSIMPLUS extends beyond the limitations of existing models. In particular, SSIMPLUS is designed to inherently consider viewing conditions such as the display device and viewing distance. Due to the unavailability of public subject-rated video quality assessment databases that contain video sequences watched under varying viewing conditions, we performed a subjective study in order to test the device-adaptive capability of the SSIMPLUS algorithm.
Subjective study
The main purpose of the study is to observe how the state-of-the-art VQA algorithms adapt to varying viewing conditions. A set of raw video sequences, at 1080p and 640p resolutions, was compressed at various distortion levels to obtain bitstreams compliant with the H.264 video compression standard. The decompressed distorted video sequences were rated by subjects under the following viewing conditions:
- Display Device: iPhone 5S, viewing distance: 10 inches;
- Display Device: iPad Air, viewing distance: 16 inches;
- Display Device: Lenovo W530 laptop, viewing distance: 20 inches;
- Display Device: Sony 55” TV, viewing distance: 90 inches;
- Display Device: Sony 55” TV, viewing distance: 20 inches (referred to as TV-Expert).
Performance comparison
The mean opinion scores (MOS) provided by subjects were used to compare the performance of SSIMPLUS with state-of-the-art VQA measures. The scatter plots of the VQA algorithms under comparison are shown in Figures 31 - 40. The superior performance of the SSIMPLUS algorithm compared to the other VQA algorithms is evident from the figures.
Comparisons between the VQA algorithms using PLCC, MAE, RMS, SRCC, and KRCC are provided in Tables 2-7. We can observe from the results that the SSIMPLUS algorithm outperforms the other state-of-the-art VQA algorithms. The main purpose of the subjective study (refer to section A) is to observe the adaptation behavior of the state-of-the-art VQA measures when deployed for predicting the perceptual quality of video content viewed under different viewing conditions. Table 6 compares the performance of the VQA measures when the TV viewing distance is reduced to 20 inches (referred to as expert mode). SSIMPLUS adapts to the changes in viewing conditions better than the VQA algorithms under comparison. SSIMPLUS is considerably faster than the other quality measures proposed to predict the perceptual quality of video content, and it meets the requirements for real-time computation of perceptual video QoE and the detailed quality map.
Table 2: Performance comparison between PSNR, SSIM, MS-SSIM, VQM, PQR-Tek, DMOS-Tek, JND-VC, DMOS-VC and SSIMPLUS (device: iPhone 5S, viewing distance: 10 inches)
Table 3: Performance comparison between PSNR, SSIM, MS-SSIM, VQM, PQR-Tek, DMOS-Tek, JND-VC, DMOS-VC and SSIMPLUS (device: iPad Air, viewing distance: 16 inches)
Table 4: Performance comparison between PSNR, SSIM, MS-SSIM, VQM, PQR-Tek, DMOS-Tek, JND-VC, DMOS-VC and SSIMPLUS (device: Lenovo W530 laptop, viewing distance: 20 inches)
Table 5: Performance Comparison between PSNR, SSIM, MS-SSIM, VQM, PQR-Tek, DMOS-Tek, JND-VC, DMOS-VC and SSIMPLUS (device: Samsung TV 55” viewing distance: 90 inches)
Table 6: Performance Comparison between PSNR, SSIM, MS-SSIM, VQM, PQR-Tek, DMOS-Tek, JND-VC, DMOS-VC and SSIMPLUS (device: Samsung TV 55”, viewing distance: 20 inches)
Table 7: Performance Comparison between PSNR, SSIM, MS-SSIM, VQM, PQR-Tek, DMOS-Tek, JND-VC, DMOS-VC and SSIMPLUS including all devices
Appendix B: Supported devices
DEVICE OPTION | DEVICE NAME | MANUFACTURER |
---|---|---|
ssimpluscore | SSIMPLUSCore | SSIMPLUS |
iphone6plus | iPhone 6 Plus | Apple |
iphone6 | iPhone 6 | Apple |
iphone5s | iPhone 5S | Apple |
galaxys5 | Galaxy S5 | Samsung |
galaxynote4 | Galaxy Note 4 | Samsung |
onem8 | One (M8) | HTC |
nexus6 | Nexus 6 | Motorola |
ipadmini2 | iPad Mini 2 | Apple |
ipadair2 | iPad Air 2 | Apple |
galaxytabs | Galaxy Tab S | Samsung |
nexus7 | Nexus 7 | Asus |
surface | Surface | Microsoft |
f8500 | F8500 | Samsung |
vt60 | VT60 | Panasonic |
oled65g6p | OLED65G6P | LG |
h7150 | H7150 | Samsung |
as600 | AS600 | Panasonic |
ea9800 | EA9800 | LG |
x9 | X9 | Sony |
hu9000 | HU9000 | Samsung |
27mp35hq | 27MP35HQ | LG |
xl2420t | XL2420T | BenQ |
b296cl | B296CL | Acer |
vg27he | VG27HE | Asus |
lt3053 | LT3053 | Lenovo |
pa242w | PA242W | NEC |
u2713hm | U2713HM | Dell |
macbookpro | MacBook Pro | Apple |
thinkpadw540 | ThinkPad W540 | Lenovo |
aspires7 | Aspire S7 | Acer |
xps15 | XPS 15 | Dell |
economyclasshds1 | Economy Class HD S1 | Panasonic |
economyclassfhds2 | Economy Class FHD S2 | Panasonic |
economyclassfhds3 | Economy Class FHD S3 | Panasonic |
businessclassfhdm1 | Business Class FHD M1 | Panasonic |
businessclassfhdm2 | Business Class FHD M2 | Panasonic |
businessclassuhdm3 | Business Class UHD M3 | Panasonic |
firstclassfhdl1 | First Class FHD L1 | Panasonic |
firstclassuhdl2 | First Class UHD L2 | Panasonic |
tx-40cx680b | TX-40CX680B | Panasonic |
up3216q | UP3216Q | Dell |
imac275k | iMac 27 5K | Apple |
imac2154k | iMac 21.5 4K | Apple |
ue40ju6400 | UE40JU6400 | Samsung |
50put6400 | 50PUT6400 | Philips |
kd-55x8509c | KD-55X8509C | Sony |
ue55js9000t | UE55JS9000T | Samsung |
oled55e7n | OLED55E7N | LG |
galaxys6edge | Galaxy S6 Edge | Samsung |
nexus9 | Nexus 9 | HTC |
macbookair13inch | MacBook Air 13-inch | Apple |
smallscreen | SmallScreen | SSIMPLUS |
oled65c7p | OLED65C7P | LG |
oled55c7p | OLED55C7P | LG |
iphonex | iPhone X | Apple |
ipadpro | iPad Pro | Apple |
oled55c8pua | OLED55C8PUA | LG |
xbr-55a8f | XBR-55A8F | Sony |
qn55q8fnbfxza | QN55Q8FNBFXZA | Samsung |
ipad2017 | iPad 2017 | Apple |
Appendix C: Supported video formats
SSIMPLUS supports most of the popular video compression formats and video containers. The details of supported video codecs and containers can be found below.
Media Container Formats:
AVI | Audio Video Interleaved |
AVR | Audio Visual Research |
AVS | AVISynth |
DV | Digital Video |
FLV | Flash Video |
GXF | General eXchange Format |
H261 | Raw H.261 video |
H263 | Raw H.263 video |
H264 | Raw H.264 video |
HEVC | Raw HEVC video |
IFF | Interchange File Format |
MVE | Interplay MVE |
IVR | Internet Video Recording |
LVF | DVR |
LXF | VR native stream |
M4V | Raw MPEG-4 video |
WebM | Matroska/WebM |
Mjpeg | Raw MJPEG video |
Mjpeg_2000 | Raw MJPEG 2000 video |
MOV | Apple Quicktime |
MP4 | MPEG-4 Part 14 |
3GP | 3GPP Multimedia File (GSM) |
3G2 | 3GPP Multimedia File (CDMA) |
MJ2 | Motion JPEG 2000 |
MPEG-PS | MPEG-2 Program Stream |
MPEG-TS | MPEG-2 Transport Stream |
MKV | Matroska Multimedia Container |
MPV | MPEG-2 Manzanita MP2TSME/MP2TSAE |
MXF | Material eXchange Format |
YUV | Raw Video |
V210 | Uncompressed 4:2:2 10-bit |
HDR Formats:
HDR10 | HDR10 Media Profile |
Codecs:
H.261 |
H.263 / H.263-1996, H.263+ / H.263-1998 / H.263 version 2 |
H.264 / AVC / MPEG-4 AVC / MPEG-4 part 10 |
HEVC (High Efficiency Video Coding) |
JPEG 2000 |
JPEG-LS |
MPEG-1 video |
MPEG-2 video |
MPEG-4 part 2 |
Apple ProRes |
QuickTime Animation (RLE) video |
Theora |
On2 VP3 |
On2 VP5 |
On2 VP6 |
On2 VP6 (Flash version, with alpha channel) |
On2 VP6 (Flash version) |
On2 VP7 |
On2 VP8 (decoders: vp8, libvpx, vp8_cuvid) |
Google VP9 (decoders: vp9, libvpx-vp9, vp9_cuvid) |
Windows Media Video 7 |
Windows Media Video 9 |
No-Reference Scores:
Raw/Uncompressed Video |
H.264 |
H.265 |
MPEG-2 video |
MPEG-4 |
Apple ProRes |
About SSIMWAVE
At SSIMWAVE, science meets art to make sure each video
stream delivered makes its way to a happy subscriber. SSIMWAVE
tunes video content quality to balance feasibility with the
best experience possible. Its entrepreneurial group of scientists,
software engineers, and business professionals is fascinated
with perfecting digital video delivery. SSIMWAVE leverages its
Primetime Emmy® Award winning technology to make the unknown,
known; the subjective, objective; to reliably deliver video
quality levels subscribers expect, and are willing to pay for.
Viewers see and hear the results of SSIMWAVE’s work everywhere
– while streaming the season finale on the big screen or watching
the game-winning shot on their smartphones.
Copyright © 2020 SSIMWAVE Inc. All Rights Reserved.
SSIMWAVE, SSIMPLUS, and VOD Monitor Inspector are all trademarks
of SSIMWAVE Inc.
Google Chrome is a trademark of Google Inc.
Dolby, Dolby Vision and the double-D symbol are all trademarks
of Dolby Laboratories. Manufactured under license from Dolby
Laboratories. Confidential unpublished works. Copyright
©2013-2015 Dolby Laboratories. All rights reserved.
All other trademarks are the property of their respective owners.
References
- V. Baroncini, J. R. Ohm, and G. J. Sullivan. Report on preliminary subjective testing of HEVC compression capability. In JCT-VC of ITU-T SG16 WP3 and ISO/IEC JTC1/SC29/WG11, San Jose, CA, February 2012.
- Video Clarity. Understanding MOS, JND and PSNR. http://videoclarity.com/wpunderstandingjnddmospsnr/, 2014. [Online; accessed September 07, 2014].
- Tektronix Inc. Understanding PQR, DMOS and PSNR Measurements. http://www.tek.com/document/fact-sheet/understanding-pqr-dmos-and-psnr-measurements/, 2014. [Online; accessed September 12, 2014].
- M. H. Pinson and S. Wolf. A new standardized method for objectively measuring video quality. IEEE Trans. Broadcasting, 50(3):312–322, 2004.
- K. Seshadrinathan and A. C. Bovik. Motion tuned spatio-temporal quality assessment of natural videos. IEEE Trans. Image Processing, 19(2):335–350, February 2010.
- H. R. Sheikh, M. Sabir, and A. C. Bovik. A statistical evaluation of recent full reference image quality assessment algorithms. IEEE Trans. Image Processing, 15(11):3440–3451, November 2006.
- VQEG. Final report from the video quality experts group on the validation of objective models of video quality assessment. Technical report, available at http://www.vqeg.org/, April 2000.
- Z. Wang, A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli. Image quality assessment: From error visibility to structural similarity. IEEE Trans. Image Processing, 13(4):600–612, April 2004.
- Z. Wang, E. P. Simoncelli, and A. C. Bovik. Multi-scale structural similarity for image quality assessment. In Proc. IEEE Asilomar Conf. on Signals, Systems, and Computers, pages 1398–1402, Pacific Grove, CA, November 2003.
Additional Help
If you require assistance, email SSIMWAVE Support at support@ssimwave.com or call 519-489-2688.