Video Latency Reduction in AWS

Dec 12, 2018

What is video latency?

Suppose you are watching an award show through a streaming service. Meanwhile, your neighbor is watching on a traditional television and starts loudly celebrating that their favorite show won, leaving you to wait another thirty seconds to see the award. Or worse, you get a Twitter notification spoiling the winner 15 seconds beforehand, killing the anticipation you had built up. This is video latency – the gap between when an event is broadcast and when you see it on your screen.

A number of steps in the glass-to-glass journey affect video latency:

  • Video encoding pipeline duration
  • Ingest and packaging operations
  • Network propagation and transport protocol
  • Content delivery network (CDN)
  • Segment length
  • Player policies (buffering, playhead positioning, resilience)

With traditional adaptive bitrate streaming, video latency mainly depends on media segment length. For example, if your media segments are six seconds long, your player will already be at least six seconds behind the live edge when it requests its first segment.

This is made worse by buffering before playback actually starts, since most players wait until several segments are buffered before rendering the first decoded video frame. Much of this latency comes from player policies rather than problems in the pipeline.
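
To make that trade-off concrete, here is a minimal back-of-the-envelope model in Python. The stage durations and the three-segment player buffer are illustrative assumptions, not measurements from any particular workflow; substitute your own numbers.

```python
# Rough glass-to-glass latency budget for segment-based streaming.
# All figures below are illustrative assumptions, not measured values.

def estimate_latency(segment_seconds: float,
                     buffered_segments: int = 3,   # many players buffer ~3 segments before starting
                     encode_s: float = 1.0,        # encoding pipeline
                     package_s: float = 0.5,       # ingest + packaging
                     network_cdn_s: float = 0.5) -> float:
    """Return an estimated end-to-end latency in seconds."""
    player_buffer_s = segment_seconds * buffered_segments
    return encode_s + package_s + network_cdn_s + player_buffer_s

for seg in (1, 2, 6):
    print(f"{seg}s segments -> ~{estimate_latency(seg):.1f}s glass-to-glass")
```

With these assumed numbers, one-second segments land near the five-second figure discussed under best practices below, two-second segments in the seven-to-ten-second range, and six-second segments well beyond that.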

Minimizing video latency in live streaming

Video providers can reduce live streaming latency if they account for a couple of issues that may not be obvious at first.

The first of these is Flash and the Real-Time Messaging Protocol (RTMP). Flash-based applications using RTMP streaming used to deliver low video latency, but with Flash deprecated and web browsers steadily dropping support for the plug-in, content delivery networks have also begun retiring RTMP, forcing content providers to move to alternatives.

The second is the tension between scale, reliability, and video latency. The larger the audience and the farther video has to travel, the more latency tends to creep in. An effective way to manage this is to switch to HTML5-friendly streaming technologies such as HTTP Live Streaming (HLS), Dynamic Adaptive Streaming over HTTP (DASH or MPEG-DASH), and the Common Media Application Format (CMAF). These technologies deliver media over plain HTTP, so segments are cacheable objects that CDNs can serve efficiently at scale, as illustrated in the sketch below.
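
As a rough illustration of why HTTP-based delivery scales well, the following sketch polls a live HLS media playlist and fetches new segments with ordinary HTTP GET requests. The playlist URL is a placeholder and the playlist parsing is deliberately naive; the point is only that every request is a plain, cacheable HTTP object that any CDN can serve.

```python
import time
import requests

# Hypothetical live HLS media playlist; replace with a real endpoint.
PLAYLIST_URL = "https://example.com/live/stream_720p.m3u8"

seen = set()

while True:
    playlist = requests.get(PLAYLIST_URL, timeout=5).text
    # Segment URIs are the non-comment lines of a media playlist.
    segments = [line for line in playlist.splitlines()
                if line and not line.startswith("#")]

    for uri in segments:
        if uri in seen:
            continue
        seen.add(uri)
        seg_url = requests.compat.urljoin(PLAYLIST_URL, uri)
        resp = requests.get(seg_url, timeout=5)
        # Plain HTTP objects: a CDN can cache these like any static asset.
        print(f"fetched {uri}: {len(resp.content)} bytes, "
              f"Cache-Control={resp.headers.get('Cache-Control')}")

    # Live playlists are re-polled roughly once per target duration.
    time.sleep(2)
```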

Best practices

There are a few simple steps media providers can take to lower their video latency:

  • Measure video latency at every step in the workflow (see the measurement sketch after this list)
  • Optimize your video encoding pipeline
  • Choose the right segment duration for your requirements
  • Build the appropriate architecture
  • Optimize (or replace) your video player(s)
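
One lightweight way to measure latency per step is to record a wall-clock timestamp for the same frame at each stage and compare them. The sketch below is hedged: the stage names and timestamps are made up, and in a real workflow they would come from encoder logs, packager manifests (for example, EXT-X-PROGRAM-DATE-TIME), CDN access logs, and a player overlay.

```python
from datetime import datetime

# Hypothetical timestamps for one video frame as it moves through the chain.
stages = {
    "camera/encoder input": "2018-12-12T20:00:00.000+00:00",
    "encoder output":       "2018-12-12T20:00:01.200+00:00",
    "packager output":      "2018-12-12T20:00:02.000+00:00",
    "CDN edge delivery":    "2018-12-12T20:00:02.600+00:00",
    "player display":       "2018-12-12T20:00:08.500+00:00",
}

times = {name: datetime.fromisoformat(ts) for name, ts in stages.items()}
names = list(times)

# Per-stage contribution to the total latency.
for prev, curr in zip(names, names[1:]):
    delta = (times[curr] - times[prev]).total_seconds()
    print(f"{prev} -> {curr}: {delta:.1f}s")

total = (times[names[-1]] - times[names[0]]).total_seconds()
print(f"total glass-to-glass: {total:.1f}s")
```

With these made-up numbers the player buffer dominates the total, which is consistent with the earlier point that much of the latency comes from player policies rather than the pipeline.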

Additionally, it is important to choose the right segment duration for video packaging. For example, you can achieve roughly five seconds of latency with one-second segments, while two-second segments typically result in between seven and ten seconds of video latency unless you also optimize the player's settings. The right choice depends on your requirements: if latency of seven seconds or below is not critical, two-second segments may be the better option, because with two-second segments you can raise the GOP length from one to two seconds, which increases encoding quality at a constant bitrate (as sketched below).
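
The segment/GOP relationship is simple arithmetic: each segment should contain a whole number of GOPs so that every segment starts on an IDR frame. Here is a hedged sketch of that check; the frame rate and durations are example values only.

```python
# Check that a segment duration holds a whole number of GOPs.
# Example values only; plug in your own frame rate and durations.

def gop_frames(gop_seconds: float, fps: float) -> float:
    """Number of frames per GOP at a given frame rate."""
    return gop_seconds * fps

def gops_per_segment(segment_seconds: float, gop_seconds: float) -> float:
    return segment_seconds / gop_seconds

fps = 30.0
for segment_s, gop_s in [(1.0, 1.0), (2.0, 1.0), (2.0, 2.0)]:
    n = gops_per_segment(segment_s, gop_s)
    print(f"{segment_s}s segments, {gop_s}s GOP "
          f"({gop_frames(gop_s, fps):.0f} frames at {fps:.0f} fps): "
          f"{n:.1f} GOP(s) per segment, aligned={n.is_integer()}")
```

Longer GOPs mean fewer IDR frames per second of video, which is where the quality gain at a constant bitrate comes from.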

AWS Elemental

All of these problems can be addressed with AWS Elemental – a recently launched family of media services from Amazon Web Services, including AWS Elemental MediaLive for live encoding and AWS Elemental MediaPackage for packaging and origination. AWS Elemental and similar cloud-based media services can reduce live streaming latency by offering origin storage, shorter segment durations, configurable DVR windows, and packaging for HLS, DASH, and CMAF.
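
As one concrete, hedged example, the sketch below uses the boto3 MediaPackage client to create an HLS origin endpoint with two-second segments and a bounded startover/DVR window. The channel ID, endpoint ID, and values are placeholders, and the parameter names should be verified against the current AWS SDK documentation before use.

```python
import boto3

# Hedged sketch: create an HLS origin endpoint with short segments.
# "my-live-channel" and "my-low-latency-hls" are placeholder identifiers.
mediapackage = boto3.client("mediapackage", region_name="us-west-2")

response = mediapackage.create_origin_endpoint(
    ChannelId="my-live-channel",
    Id="my-low-latency-hls",
    StartoverWindowSeconds=300,       # bounded DVR/startover window
    HlsPackage={
        "SegmentDurationSeconds": 2,  # shorter segments -> lower latency
        "PlaylistWindowSeconds": 60,  # how much of the live window the playlist exposes
    },
)

print(response["Url"])  # playback URL handed to the CDN and players
```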

AWS Elemental lets you scale your media workflows up or down as needed while maintaining reliability, with fully managed cloud services tailored to your needs.
