Participants in the video industry like to use the term “Quality of Experience,” but what does that term mean? Whose experience are they talking about?

Is it that of the engineers who monitor the video delivery chain? Is it the executives and content makers who decide what to deliver?

Or is it that of the end-viewer, the person sitting on the couch in their living room?

SSIMWAVE believes QoE starts and ends with the latter. However, there is so much confusion about what the term really means (many conflate it with QoS and network metrics) that we stopped using the term “QoE” and instead call it a Viewer Score. After all, it’s for the benefit of the viewer that a video exists, and it’s their satisfaction that ultimately drives the entire industry.

A true Viewer Score must therefore reflect the experience of end-viewers. It must “see” and predict video quality the way that the human visual system “sees.”

Many intermediate variables measurable at different points along the video delivery chain can influence the end-viewer’s QoE, but none of them should count as a true metric unless its impact is properly translated into a quantitative measure of how subscribers’ visual experience is affected: a score, in other words.

Moreover, any automated system aiming to measure a true Viewer Score must perceive what end-viewers perceive and say what end-viewers say about the quality of their experience. For example, a true Viewer Score metric should tell whether a video is of excellent, good, fair, poor, or bad quality, just as a human would.
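As a rough illustration of that last point, a numeric score can be bucketed into those five categories. The sketch below is an assumption for illustration only: the thresholds are hypothetical, not SSIMWAVE’s actual mapping, and a real mapping would be calibrated against subjective viewer ratings.

```python
def quality_label(score: float) -> str:
    """Map a 0-100 viewer score to a categorical quality label.

    Thresholds are illustrative placeholders, not a calibrated mapping.
    """
    if not 0 <= score <= 100:
        raise ValueError("score must be between 0 and 100")
    if score >= 80:
        return "excellent"
    if score >= 60:
        return "good"
    if score >= 40:
        return "fair"
    if score >= 20:
        return "poor"
    return "bad"


print(quality_label(87))  # excellent
print(quality_label(35))  # poor
```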

Video quality measures that do not take into account perceptual differences due to viewing conditions, such as viewer device type and size, video frame rate, and pixel resolution versus device viewing resolution, are not true Viewer Score metrics. Measures based on video stream parameters and device playback behaviour, such as statistics of the average receiving bitrate and the duration and frequency of video freezing events, are merely QoS metrics at the device, or at best pseudo-QoE metrics: they do not “see” the actual pixels of each video frame as humans do, and they cannot capture the large impact of content variations and compression artifacts on visual quality.
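To make the distinction concrete, here is a minimal sketch of the kind of device-side QoS statistics such measures produce. The playback-event format and field names are assumptions made up for this example; the point is that nothing in the computation ever inspects a decoded frame, which is exactly why these numbers cannot stand in for a Viewer Score.

```python
from dataclasses import dataclass


@dataclass
class PlaybackEvent:
    # Hypothetical playback log entry from a player; field names are illustrative.
    duration_s: float         # seconds of playback covered by this event
    bitrate_kbps: float       # average receiving bitrate during the event
    freeze_duration_s: float  # time spent frozen/rebuffering during the event


def qos_summary(events: list[PlaybackEvent]) -> dict:
    """Compute device-side QoS statistics (average bitrate, freeze count/time).

    Derived purely from stream parameters and playback behaviour; no video
    frame is ever decoded or looked at.
    """
    total_play = sum(e.duration_s for e in events)
    avg_bitrate = (
        sum(e.bitrate_kbps * e.duration_s for e in events) / total_play
        if total_play else 0.0
    )
    return {
        "avg_bitrate_kbps": avg_bitrate,
        "freeze_count": sum(1 for e in events if e.freeze_duration_s > 0),
        "total_freeze_s": sum(e.freeze_duration_s for e in events),
    }


events = [
    PlaybackEvent(10.0, 4500, 0.0),
    PlaybackEvent(10.0, 2800, 1.2),
]
print(qos_summary(events))
```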

Quality of experience is about the quality of the viewers’ experience, not network measures. Viewers decide what’s good enough, or not, and our Viewer Score is the metric that represents their experience.

The original article was published on Medium.com on Jan 4, 2019.