Every moment of every day across the globe, Conviva provides measurement and analytics services to many of the world’s leading streaming TV publishers and distributors. At the heart of these measurements and services, Conviva’s rich body of Quality of Experience (QoE) and audience metrics empowers accurate, timely, and actionable intelligence to maximize viewer engagement and satisfaction.
Dissecting the common structure and body of each metric reduces the complexity of the metric definition and clarifies how the metric can be interpreted and correlated for actionable, data-driven insights. We have upgraded our metric definitions and compiled them into a single reference for customers (the Metric Dictionary in Conviva’s Experience Insights Help Center) that clarifies the definition of each key metric, along with the metric’s viewer experience impact, session calculation, data aggregation, and related distributions.
Viewer Experience Impact
On the surface, the most important aspect of any metric is the measured impact on viewer experience. Viewer impact may be clearly visible, as in the dreaded spinning wheel caused by rebuffering, or less perceptible, as in the correlations between play duration and rebuffering ratios.
As an example, Video Start Failures measures how often Attempts terminated during video startup, with a reported fatal error, before the first video frame was played.
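As a rough illustration of the idea (not Conviva’s actual implementation, and with a hypothetical per-attempt data shape), a Video Start Failures percentage could be sketched like this:

```python
def video_start_failures(attempts):
    """Percentage of attempts that reported a fatal error
    before the first video frame was played."""
    if not attempts:
        return 0.0
    failures = sum(
        1 for a in attempts
        if a["fatal_error"] and not a["first_frame_played"]
    )
    return 100.0 * failures / len(attempts)

# Hypothetical attempt records: one of four attempts failed during startup.
attempts = [
    {"fatal_error": False, "first_frame_played": True},
    {"fatal_error": True,  "first_frame_played": False},
    {"fatal_error": False, "first_frame_played": True},
    {"fatal_error": False, "first_frame_played": True},
]
print(video_start_failures(attempts))  # 25.0
```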
Once the data points of the viewer impact are captured, the brains of a metric process the data. For some metrics this calculation is basic summation (Attempts), while for others the computation is more complex, involving several stages of data processing (Average Bitrate).
Furthermore, to derive a complete metric value, the calculations aggregate individual video session data across multiple sessions. This aggregation often consists of summing the individual sessions and then computing a percentage or average that represents the metric value for all the sessions in the selected period.
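The sum-then-average pattern described above can be sketched for a bitrate-style metric. This is a hedged example with an assumed per-session record shape, weighting each session’s bitrate by its play duration; the exact weighting Conviva uses may differ:

```python
def average_bitrate(sessions):
    """Play-duration-weighted average bitrate (kbps) across sessions.

    Sums per-session data first, then computes the average that
    represents the metric value for the whole selected period.
    """
    total_bits = sum(s["bitrate_kbps"] * s["play_seconds"] for s in sessions)
    total_time = sum(s["play_seconds"] for s in sessions)
    return total_bits / total_time if total_time else 0.0

# Hypothetical sessions: a long 3000 kbps session dominates a shorter one.
sessions = [
    {"bitrate_kbps": 3000, "play_seconds": 600},
    {"bitrate_kbps": 1500, "play_seconds": 300},
]
print(average_bitrate(sessions))  # 2500.0
```

Weighting by play time keeps a brief low-bitrate session from skewing the aggregate as much as a simple per-session mean would.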
Distributions and Correlations
Distributions and correlations further enhance Conviva’s metrics to help you correlate viewer experience and video session performance, e.g., play duration and end plays with rebuffering tolerance.
Additional actionable intelligence may also come from grouping sets of individual metrics. For example, in many cases the pairing of Average Bitrate, Rebuffering Ratio, and Video Start Failure can illustrate an optimized balance between viewer experience and video delivery performance.
Join Conviva’s developer community, where we will present one of our key viewer engagement or QoE metrics as the metric-of-the-month so you can get to know our metrics better.