By John Marshall, Product Manager at Kollective
The lights are up, cameras are rolling, and your CEO has just begun to announce the company’s next big initiative. How many people are watching? Are they having a good viewing experience? These are the two most common questions asked about a live video event, and the reality is that most event, IT networking, and corporate communications teams have a very hard time answering them.
Knowing, in real time, who has joined a given broadcast is the first critical piece of data to have. When you’re live streaming one of the most important messages in the company’s history, someone needs to be able to confirm that everyone can watch and participate.
Analytics solution requirements
To gather analytics about your live event, here’s what you first need from a technical standpoint:
- A reliable data source to send intelligence from the network edge.
- A real-time data pipeline to process and store the intelligence, which can often arrive as an avalanche of data.
- An analytics platform that properly calculates and visualizes your data so that it’s not just presented as numbers, but provides contextual, valuable insights and interpretations.
In the past, data was processed in batch jobs: reports accumulated in a queue and, over a period of time (sometimes all night long), were written to the database and made available for analytics. Modern data pipelines allow you to stream data directly into the database and make it available for analytics with extremely low overhead – think seconds, not hours. Low-latency data pipelines let you know exactly how many people are online as soon as your CEO starts speaking.
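To make the contrast concrete, here is a minimal sketch of the streaming approach: each viewer event is written the moment it arrives, so a "who is online right now" query is answerable in seconds rather than after an overnight batch job. The class and field names are illustrative, not part of any real product API.

```python
import time

class LiveViewerStore:
    """In-memory stand-in for a low-latency analytics store (hypothetical)."""
    def __init__(self):
        self.events = []

    def ingest(self, event):
        # Each event is written immediately on arrival -- no batch queue.
        event["ingested_at"] = time.time()
        self.events.append(event)

    def concurrent_viewers(self):
        """Count viewers who have joined and not yet left."""
        joined = {e["viewer_id"] for e in self.events if e["type"] == "join"}
        left = {e["viewer_id"] for e in self.events if e["type"] == "leave"}
        return len(joined - left)

store = LiveViewerStore()
store.ingest({"type": "join", "viewer_id": "alice"})
store.ingest({"type": "join", "viewer_id": "bob"})
store.ingest({"type": "leave", "viewer_id": "bob"})
print(store.concurrent_viewers())  # 1 viewer online right now
```

A real pipeline would put a message bus and a time-series database behind this interface, but the principle is the same: no waiting for a batch window before the data becomes queryable.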
Establishing reliable, real-time data collection from every individual viewer that joins a video event can be incredibly challenging and often requires integrating multiple sources. Robust analytics platforms, like Kollective Analytics, provide a nexus of data from every single user and can work with many different video applications and player technologies. That data is then collected, normalized and made available for deep analysis, correlating player experience to network performance. Data reliability, capture and cleansing are key to answering the critical questions about your video event with confidence.
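The normalization step above is worth illustrating. Different player technologies report the same facts under different names, so a pipeline maps each raw report onto one shared schema before analysis. This is a hypothetical sketch; the source names and field names are invented for illustration.

```python
def normalize(raw, source):
    """Map a raw player report onto a shared schema (illustrative fields)."""
    if source == "player_a":
        return {
            "viewer_id": raw["userId"],
            "buffer_events": raw["stalls"],
            "bitrate_kbps": raw["bitrate"],
        }
    if source == "player_b":
        return {
            "viewer_id": raw["viewer"],
            "buffer_events": raw["rebufferCount"],
            "bitrate_kbps": raw["avgBitrateKbps"],
        }
    raise ValueError(f"unknown source: {source}")

# Two players, two wire formats, one schema after normalization:
a = normalize({"userId": "u1", "stalls": 2, "bitrate": 3500}, "player_a")
b = normalize({"viewer": "u2", "rebufferCount": 0, "avgBitrateKbps": 4200}, "player_b")
print(a.keys() == b.keys())  # True -- records are now directly comparable
```

Once every record shares a schema, correlating player experience with network performance becomes a straightforward join rather than a per-source special case.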
Now that your data is available in real time, what does it tell you? You may know how many people are watching the event, but the real value lies in how audience volumes trend over time and how they relate to the quality of users’ experience.
Blending volume metrics with Quality of Experience (QoE) indicators allows you to understand how well your audience is engaging with the content. To do this, you need an analytics solution that captures player statistics such as buffering frequency as well as information about the performance of network delivery. You can mash these up to correlate attendee volume with network efficiency and end-user experience. This holistic view helps all parties involved in a large event, including IT network and internal communications teams, understand the reach and QoE for every single user who has joined the event.
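A simple way to blend the two signals is to normalize buffering reports by audience size, so a growing audience doesn't masquerade as a quality problem. The per-minute figures and the alert threshold below are made up for illustration.

```python
# Illustrative per-minute data: concurrent viewers and buffering reports.
viewers = {1: 120, 2: 480, 3: 950, 4: 940}
buffer_events = {1: 2, 2: 10, 3: 46, 4: 12}

for minute in sorted(viewers):
    # Buffer rate = buffering reports per viewer; raw counts alone would
    # rise with audience size even when quality is fine.
    rate = buffer_events[minute] / viewers[minute]
    flag = "INVESTIGATE" if rate > 0.03 else "ok"
    print(f"minute {minute}: {viewers[minute]} viewers, "
          f"buffer rate {rate:.3f} -> {flag}")
```

In this sketch only minute 3 crosses the threshold: buffering spiked faster than the audience grew, which is exactly the kind of signal that raw viewer counts or raw buffer counts alone would hide.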
Putting it all together
At Kollective, we focus on providing customers with business-critical intelligence before, during and after the live broadcast. We have data gathering agents at the edge and a powerful pipeline that combines data, in real-time, from additional inputs such as the video player itself. All that data is then delivered in an intuitive and highly configurable analytics tool. Live broadcast event teams use this tool to know precisely what is happening at any given moment of a live broadcast and react accordingly.
Live broadcasts are stressful enough. Is the stream stable? Are people able to connect? Is your message resonating? Take the guesswork out of the equation by implementing a comprehensive, integrated analytics solution that delivers precision and confidence.