Tencent MPS Blog

What is Live Streaming Latency and how to reduce it?

Tencent MPS - Dev Team

Analysis of the current state of the live streaming industry

The live streaming industry is in a stage of rapid development, marked by the following trends:

1. Audience size continues to grow: more and more people choose to watch content via live streaming platforms, and the audience keeps expanding. This is driven by the popularity of smartphones, improved network bandwidth, and the promotion of live streaming platforms.

2. Diversified content: the types of content on live streaming platforms are becoming increasingly varied, covering entertainment, sports, education, current affairs, food, travel, and more. Viewers can get a real-time, diverse content experience through live streaming platforms.

3. Importance of social interaction: live streaming platforms provide real-time social features that let viewers interact with hosts and other viewers, expressing opinions and emotions through bullet comments (danmaku), likes, and comments. This social interaction enhances the viewing experience and makes viewers more involved and engaged.

4. Commercialization and payment models: live streaming platforms have gradually introduced commercialization and payment models. Hosts can earn income through paid gifts, advertising cooperation, brand sponsorship, and so on, while viewers can buy virtual gifts to support their favorite hosts.

What is Live Streaming Latency?

Live streaming latency is the time difference between when an event actually occurs and when the viewer finally sees it. On the push side, the stream is captured, pre-processed, encoded, and pushed to the cloud for ingest. The cloud then performs media processing, followed by CDN distribution and transmission. Finally, the stream is decoded, post-processed, and played on the viewer's side. Latency mainly comes from data accumulation along this chain: the push, transmission, and downstream playback stages can each accumulate data, and each can introduce delay.

The main factors affecting the delay are as follows:

1. Upstream encoding parameters on the push side: with a software encoder such as x264, several settings affect latency, for example rc-lookahead, frame-threads, and whether B-frames are used and how many. As the encoding preset moves from veryfast through faster and fast to medium, rc-lookahead grows from roughly 10 to 40 frames; for a typical push stream at 15 FPS, that alone adds roughly 660 ms to 2300 ms of latency. This extra delay is the price paid for better image quality and compression efficiency.

2. Audio and video interleaving synchronization: when pushing with FFmpeg, waiting to interleave audio and video packets adds delay. For example, if the video timestamps are t1, t2, t3 and the audio timestamps are t0, t1, t2, the two streams are not exactly aligned, so packets are buffered and reordered, and the waiting introduces extra delay.

3. Network round-trip time (RTT): after a client sends a packet, it waits for the server's ACK; if none arrives within a 200-millisecond timeout, the client retransmits. If the next ACK is also lost, the timeout may grow to 400 milliseconds. These retransmission timeouts compound, and the overall delay can become uncontrollable.

4. Size of the GOP: when a player connects to a CDN node, the CDN usually starts sending from the most recent GOP, so the GOP setting determines how much data the player receives at startup. For example, if the CDN node's cache holds 8 seconds of data when the player connects, sending those 8 seconds to the player introduces an initial 8-second delay.

5. Jitter-resistant buffering on the playback side: network jitter produces peaks and valleys in data arrival, and data accumulates in the player. Suppose that, due to jitter, only 2 seconds of audio and video arrive within a 5-second window. The player will likely stall after playing those 2 seconds, wait for the remaining content to arrive, and then resume at a normal pace, adding roughly 3 seconds of delay. If this delay is not actively reduced, jitter events accumulate over time and the total delay keeps growing.
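Several of these contributions can be estimated with simple arithmetic. The sketch below (illustrative Python; the helper names and parameter values are assumptions for the example, not part of any MPS API) computes the encoder lookahead delay, the initial GOP-cache delay, and the stall delay caused by a jitter gap:

```python
# Illustrative estimates of the latency sources described above.

def lookahead_delay_ms(lookahead_frames: int, fps: float) -> float:
    """Encoder rc-lookahead buffers this many frames before emitting output."""
    return lookahead_frames / fps * 1000.0

def gop_cache_delay_s(cached_seconds: float) -> float:
    """A CDN that bursts its cached GOP adds that much initial delay."""
    return cached_seconds

def jitter_stall_delay_s(window_s: float, received_s: float) -> float:
    """If only `received_s` of media arrives in a `window_s` window,
    the player stalls for roughly the difference."""
    return max(window_s - received_s, 0.0)

# At 15 FPS, a 10-frame lookahead adds ~667 ms; 40 frames adds ~2667 ms.
print(round(lookahead_delay_ms(10, 15)))  # 667
print(round(lookahead_delay_ms(40, 15)))  # 2667
# An 8-second GOP cache adds 8 s of startup delay.
print(gop_cache_delay_s(8.0))             # 8.0
# Receiving 2 s of media in a 5 s window stalls playback ~3 s.
print(jitter_stall_delay_s(5.0, 2.0))     # 3.0
```

The same arithmetic explains why lowering the frame rate or raising the lookahead both stretch the encoder's contribution to end-to-end delay.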

Common protocols such as RTMP and HTTP-FLV can typically achieve a delay of 2-3 seconds, while HLS has a delay of about 6-20 seconds, depending on the GOP size, the slice duration, and the number of slices.

Why do we need to reduce live streaming latency?

The importance of low-latency live streaming is becoming more and more prominent. Low latency provides a more real-time, interactive viewing experience, which matters most in scenarios that require immediate feedback and interaction, such as:

1. Live sports events: when watching sports matches, viewers want to see the progress and results of the match in real time so they can participate and interact more fully.


2. Interactive live streams: some live content requires real-time interaction between viewers and hosts, such as Q&A sessions and giveaways. Low latency ensures that audience questions and participation are responded to in a timely manner.

3. Online education and training: for online education and training, low-latency live streaming provides better real-time interaction and a better learning experience, making communication between students and teachers smoother.

4. Virtual Reality and Augmented Reality: Low latency live streaming is especially important for virtual reality and augmented reality applications, which ensures synchronization between virtual and real scenarios and provides a more immersive experience.


Therefore, low-latency live streaming is essential for providing a better viewing experience, enhancing interactivity, and meeting specific application requirements. As technology advances and user demand for real-time experiences grows, the live streaming industry will continue striving to reduce latency and provide a better live streaming experience.

How is live streaming latency measured?

1. End-to-end playback comparison

For example, the push side captures a webpage that displays the current time; the playback side then compares the time shown in the played-back video with the same clock, which gives the end-to-end delay directly.

2. Insert custom SEI content on the push side and make a rough estimate by carrying a local timestamp

For example, the sender puts its local timestamp into an SEI message as JSON; the playback side parses the SEI and compares its own local time against the timestamp to obtain the end-to-end link delay. This method requires that the machine clocks on the two ends do not differ by too much.
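As a sketch of this second method, assume the pusher embeds a JSON payload like {"ts": <epoch ms>} in a user-data SEI message. The helper names and the JSON key below are illustrative assumptions, not an MPS or FFmpeg API; in a real pipeline the payload would be carried in an H.264/H.265 user-data SEI NAL unit:

```python
import json
import time

def make_sei_payload() -> bytes:
    """Pusher side: wrap the local wall-clock time (epoch ms) as JSON.
    In practice this byte string is embedded in a user-data SEI NAL."""
    return json.dumps({"ts": int(time.time() * 1000)}).encode("utf-8")

def estimate_latency_ms(sei_payload: bytes) -> int:
    """Player side: parse the JSON and compare with the local clock.
    Only meaningful if the two machines' clocks are closely synchronized."""
    sent_ms = json.loads(sei_payload.decode("utf-8"))["ts"]
    return int(time.time() * 1000) - sent_ms

payload = make_sei_payload()
# ... payload travels through encode -> ingest -> CDN -> decode ...
print(estimate_latency_ms(payload), "ms end-to-end (clock sync permitting)")
```

Because the estimate is the difference of two machines' clocks, any clock skew between the pusher and the player shows up directly in the result, which is why this method is only a rough estimate.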

How to reduce live streaming latency?

To reduce live streaming latency, you can consider the following methods and techniques:

1. Use a low-latency encoder configuration: choose encoder settings designed for low delay, such as an H.264 low-delay configuration (for example, disabling B-frames or using a zero-latency tuning) or H.265's low-delay mode. These settings reduce the delay introduced between encoding and decoding.

2. Optimize transport protocols: Choose a transport protocol that is suitable for low latency, such as WebRTC, SRT (Secure Reliable Transport) or QUIC (Quick UDP Internet Connections). These protocols focus on reducing transmission latency and providing more reliable transmission.

3. Use edge computing and CDNs: by deploying edge nodes and content delivery networks (CDNs) around the globe, content is brought as close as possible to the viewer, reducing transmission distance and network latency.

4. Optimize network bandwidth and quality: ensure there is sufficient network bandwidth to support high-quality live transmission. Use Quality of Service (QoS) mechanisms to prioritize live traffic and reduce network congestion and packet loss.

5. Reduce encoding and decoding latency: Encoding and decoding latency can be reduced by adjusting encoder and decoder settings, such as reducing the GOP (Group of Pictures) size, adjusting the frame rate and resolution, and so on.

6. Optimize buffer settings: Set the buffer size reasonably to balance the latency and smoothness. A smaller buffer can reduce latency, but may lead to more frequent lag; a larger buffer can provide smoother playback, but will increase latency.

7. Use a latency-oriented streaming protocol: choose a protocol designed for low delay, such as RTMP (Real-Time Messaging Protocol) or Low-Latency HLS (LL-HLS). These provide lower latency and better real-time performance than standard segmented HLS.
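Point 6 above (buffer tuning) is often paired with latency chasing: when the buffer grows past its target, the player plays slightly faster instead of letting delay accumulate, and slows slightly when the buffer drains. A minimal sketch with assumed thresholds and rates (these are illustrative defaults, not standard values):

```python
def playback_rate(buffer_s: float,
                  target_s: float = 1.0,
                  max_s: float = 3.0) -> float:
    """Pick a playback speed from the current buffer depth.
    Thresholds and rates are illustrative, not standard values."""
    if buffer_s > max_s:
        return 1.5   # far behind live: catch up aggressively
    if buffer_s > target_s:
        return 1.05  # slightly behind: creep back toward the target
    if buffer_s < target_s / 2:
        return 0.95  # buffer draining: slow down to avoid a stall
    return 1.0       # near target: play at normal speed

print(playback_rate(4.0))   # 1.5
print(playback_rate(2.0))   # 1.05
print(playback_rate(0.3))   # 0.95
print(playback_rate(0.8))   # 1.0
```

The small 1.05x/0.95x adjustments are typically imperceptible to viewers, which is why this approach can trim accumulated jitter delay without obvious audio or video artifacts.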

Tencent MPS Stream

The media delivery function of Tencent MPS Stream can provide users with reliable, secure, high-speed and low-latency video delivery services.

It not only provides high-quality, stable, low-latency media delivery for global live broadcasts based on the SRT protocol, but also delivers a global live streaming experience of under 3 seconds using the LL-HLS protocol. In addition, the Stream series provides reliable and efficient streaming media transmission, transcoding, replication, packaging, and distribution services, along with rich value-added features for global customers.

You are welcome to Contact Us for more information.