What’s the score?

Remember that day your high school teacher handed back your test paper? You held it in your hands and your eye immediately went to the big number scrawled at the top and circled in red ink: “85,” it said.

Relief instantly washed over you. You relaxed. Because you knew: a good mark. You’d done well.

That one number said it all. It told you everything.

And that was the thinking when SSIMWAVE designed its video quality scoring system: One number that would say it all. One number that would say whether a viewer’s experience was a good one – or not.

“We wanted simplicity, something very intuitive,” says Dr. Zhou Wang, SSIMWAVE’s co-founder and Chief Science Officer.

Broadcasters, cable companies, video producers and the like all depend on SSIMWAVE’s SSIMPLUS product to help them consistently deliver high-quality, visually satisfying video to their downstream customers.

SSIMWAVE’s product is their distant early warning system, seeing what their viewers would see. SSIMPLUS probes sit along the video delivery chain, watching for problems that would leave a viewer disappointed. Together, the probes and software generate a number between zero and 100 and display it on a dashboard. A high number indicates things are working well. A low number indicates problems and points the content owner toward a solution.

The dashboard runs on an ordinary laptop and can be configured in a variety of ways to help users display the status of their delivery systems. Like your high school test paper, that one number says it all.

“That number quickly tells you how things are working,” says Dr. Wang.

An 85 displayed on the dashboard, for example, would indicate to a technician that a downstream user viewing a video was happy, that they were receiving a good-quality product.

By contrast, if the dashboard Viewer Score was low – 30, for example – the technician would immediately know problems were unfolding and that viewers might soon begin calling their cable companies to report problems. The SSIMWAVE software would help quickly identify where in the delivery chain the snags were occurring and highlight the steps that could be taken to achieve the needed repairs. All of it in real time. All of it before disappointed customers began logging complaints.
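The alerting behavior described above — one score, with low values flagging trouble before complaints arrive — can be sketched as a simple threshold check. The function name and the 80/50 cutoffs below are illustrative assumptions, not SSIMWAVE’s actual configuration:

```python
def viewer_score_status(score: float) -> str:
    """Map a 0-100 Viewer Score to a dashboard status.

    The 80/50 cutoffs are illustrative assumptions, not
    SSIMWAVE's actual alerting thresholds.
    """
    if not 0 <= score <= 100:
        raise ValueError("Viewer Score must be between 0 and 100")
    if score >= 80:
        return "good"      # viewers are seeing a high-quality stream
    if score >= 50:
        return "degraded"  # quality is slipping; worth investigating
    return "alert"         # viewers will likely notice and complain

# An 85 means happy viewers; a 30 means trouble upstream.
print(viewer_score_status(85))  # -> good
print(viewer_score_status(30))  # -> alert
```

In practice a dashboard would evaluate this continuously per channel and per probe, but the decision logic reduces to reading that one number.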

“Our product can show you everything,” says Dr. Wang.

The unique, valuable thing about the SSIMPLUS system is that the Viewer Score is relevant, and accurate, no matter what type of device the viewer is using, be it a large, TV-like display, or a tablet or a cell phone.

That’s important. Traditional methods of video quality assessment didn’t take the viewing device into account, which skewed the assessment and made it less relevant.

“We found that different displays and different display environments would generate different Viewer Scores,” explains Dr. Wang. “We saw this as a major problem.”

“Starting with the human visual system, you build a model that applies to all of these different devices, even to devices you have never seen before.”

Dr. Wang’s colleague, Dr. Abdul Rehman, SSIMWAVE’s CEO and co-founder, conducted experiments in the lab: He had groups of people watch the same video stream. Each group viewed the identical stream on a different kind of device. Different devices generated different results, even though the video being watched was the same.

“The thing is, it has been pretty challenging to tackle this problem,” says Dr. Wang. “Traditionally, if you look at the way people have measured video quality, it’s based on the pixels or the streams and frames per second. And then you decide if the quality is good or bad. But this is regardless of the final display on which you’re going to show the content.

“As long as you receive those pixels, everything is decided in terms of quality. That’s the traditional way people do video quality assessment. Obviously, this is not going to be very good, because with different displays and different display environments, you’ll get different Viewer Scores.

“So this motivated us to work on this display-dependent quality assessment. Basically, we wanted to make our algorithms adaptable to the display and viewing conditions. Ideally, you want to adapt the quality prediction score to match exactly that viewing environment and that particular display.”

The key to the solution SSIMWAVE designed was understanding the human visual system, and how the human visual system reacts when confronted with displays that have different resolutions, frame rates and brightness qualities.

“Starting with the human visual system, you build a model that applies to all of these different devices, even to devices you have never seen before,” says Dr. Wang.

“This is a huge, huge advantage. No one else has done that, built a universal model that applies to all kinds of devices. That’s the big advantage of the SSIMPLUS measurement.”
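SSIMPLUS itself is proprietary, but it grows out of the structural similarity (SSIM) index that Dr. Wang co-developed, which compares luminance, contrast and structure between a reference image and a distorted one. As a rough illustration only — real implementations use a local sliding window, and SSIMPLUS adds the device and viewing-condition modeling discussed above, which this formula does not capture — here is the classic single-window SSIM computation:

```python
import numpy as np

def ssim_global(x, y, data_range=255.0):
    """Single-window SSIM index between two grayscale images.

    A teaching sketch of the classic SSIM formula (Wang et al., 2004);
    production code applies it in a sliding window, and SSIMPLUS layers
    display and viewing-condition models on top.
    """
    x = np.asarray(x, dtype=np.float64)
    y = np.asarray(y, dtype=np.float64)
    c1 = (0.01 * data_range) ** 2  # stabilizing constants from the paper
    c2 = (0.03 * data_range) ** 2
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )

rng = np.random.default_rng(0)
clean = rng.integers(0, 256, size=(64, 64))
noisy = clean + rng.normal(0, 25, size=(64, 64))  # simulated distortion

print(ssim_global(clean, clean))  # identical frames score a perfect 1.0
print(ssim_global(clean, noisy))  # distortion pushes the score below 1.0
```

A perceptual score like this, remapped to a 0–100 scale and adapted per device, is the general shape of the one number the dashboard displays.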

So, many devices, but one number – a number that simply, quickly and accurately captures the quality of the experience from the user’s perspective.

“We’re running this 24/7, for thousands and thousands of channels and millions of users,” says Dr. Wang. “We’ve put so many eyes everywhere in the system to tell you exactly what is happening.

“We convert eyes into software. That one number tells you everything.”