A billion sensor points. The number alone is remarkable, but when you consider that each sensor is continuously collecting and processing data, including content and ad metrics, from any device that can play video, that number becomes downright astonishing.
What is the sensor?
Conviva’s Stream Sensor is the basis for all our measurement. The sensor is an SDK, not an API, so it handles much of the heavy lifting and auto-collection without sacrificing data quality. It integrates directly into the video player of an application, without compromising performance, and measures second-by-second viewership of content and ads, including content, device, geo, and ad metadata.
All data is cleaned and normalized into a unified, consistent data model. We then compute a wide range of streaming-specific metrics and apply a household identity model to assign a Stream ID to each physical household. The data is optimized for data warehouses or data lakes and available for instant access across our platform and through our APIs.
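To make the idea concrete, here is a minimal sketch of the two steps described above: mapping vendor-specific field names onto one consistent schema, and deriving a stable pseudonymous household identifier. The field names, aliases, and hashing scheme are illustrative assumptions for this sketch, not Conviva’s actual data model or identity algorithm.

```python
import hashlib

# Hypothetical field aliases: different players and platforms report the
# same concept under different names, so normalization maps them onto one
# unified schema. These mappings are invented for illustration.
FIELD_ALIASES = {
    "deviceOS": "os", "operating_system": "os",
    "assetName": "content_title", "title": "content_title",
    "countryCode": "country", "geo_country": "country",
}

def normalize_event(raw: dict) -> dict:
    """Rename vendor-specific fields into the unified data model."""
    return {FIELD_ALIASES.get(k, k): v for k, v in raw.items()}

def stream_id(household_signals: tuple) -> str:
    """Derive a stable pseudonymous ID: the same signals always hash to
    the same ID, so sessions can be grouped by physical household."""
    digest = hashlib.sha256("|".join(household_signals).encode()).hexdigest()
    return f"stream-{digest[:12]}"

event = normalize_event({"deviceOS": "tvOS", "assetName": "Finale",
                         "countryCode": "US"})
sid = stream_id(("198.51.100.7", "tvOS"))
```

Because the ID is derived deterministically rather than stored per event, every session carrying the same household signals lands under the same Stream ID without any cross-event lookup.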
What problems does the sensor help overcome?
Content on the internet started simple, evolved into images, and now into video. Video is far more complicated than the content types that preceded it—and it generates a lot of data points.
Everything from watch time, play attempts, and video start failures to device, operating system, and ad abandonment—plus myriad other metrics—can be collected from video plays. That volume of information has a direct impact on how people process and consume data and, ultimately, make decisions.
The biggest problem is making sure the data is accurate and consistent. That’s where the Stream Sensor comes in. We collect and normalize nearly 2 trillion streaming data events daily across 3.3 billion applications streaming on devices—and keep it consistent.
How does Conviva solve these problems?
We use a heartbeat method, which is more resilient, especially when data is lost. Because we collect the sensor data at regular intervals with all the event states, if a message is lost, we can always recover the states from its neighbors, so the impact on processed metrics is minuscule.
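The loss tolerance comes from each heartbeat carrying cumulative session state rather than only deltas, so a dropped message costs timing resolution, not the underlying totals. The sketch below illustrates that property; the field names and 20-second interval are assumptions for this example, not Conviva’s actual wire format.

```python
# Each heartbeat snapshots cumulative state, so totals survive message loss.
def total_play_time_ms(heartbeats: list) -> int:
    """Recover total play time from whichever heartbeats survived."""
    if not heartbeats:
        return 0
    # Counters are cumulative, so the latest surviving heartbeat suffices.
    return max(hb["cumulative_play_ms"] for hb in heartbeats)

# Heartbeats sent every 20 seconds; suppose the middle one is lost in transit.
sent = [
    {"seq": 1, "cumulative_play_ms": 20_000},
    {"seq": 2, "cumulative_play_ms": 40_000},  # dropped on the network
    {"seq": 3, "cumulative_play_ms": 60_000},
]
received = [hb for hb in sent if hb["seq"] != 2]
```

Here `total_play_time_ms(received)` equals `total_play_time_ms(sent)`: losing a heartbeat changes nothing about the computed metric, which is exactly why interval-based state snapshots beat fire-and-forget event deltas.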
With the rich data set that we collect, we provide intuitive and elaborate insights regarding content, advertising, quality of experience, and viewer and audience metrics to make data-driven decisions in all aspects of a business.
In addition to the sensor, Conviva provides tools that help developers and technical teams integrate, validate, and go live, such as:
- Real-time debugging to immediately verify that integrations and metrics are accurate
- Metadata quality and hygiene to help guide business decisions
- AI-alerting to anticipate and address issues in real-time before they become customer service problems
- A portal that shows the impact of metrics and data quality before pushing to production—and, if something is wrong, how to fix it
What’s next for the sensor?
Publishers are exhausted by implementing all the integrations necessary in this streaming-first world, so first, we’re working on a plug-and-play sensor that can be used across all of our products and all the main use cases. For out-of-the-box video players, we have modules. For cross-platform or customized players, we have SDKs supporting the main programming languages, providing coverage for 100% of devices.
We’re also focused on automatically collecting as much metadata as possible, as well as adding new features and measurement techniques, so we’re constantly updating as new metrics become available through devices.
Another big step we’re taking now is opening up the data set to our customers so they can use it for ad sales partnerships, increase audience engagement, and boost revenue. The Trade Desk partnership is a great example of this.
While integrating with a billion sensor points is indeed a challenge, it’s a challenge that we embrace and tackle head-on every day for our customers and theirs.
See the sensor in action. Request a demo now.