Description
Expected Residual Time (ERT) is a client-side calculated parameter within the context of 3GPP's Dynamic Adaptive Streaming over HTTP (DASH) specifications. It represents an estimate, in seconds, of how long the currently buffered media data will sustain playback before the buffer is depleted, assuming a constant consumption rate. The ERT is not a simple measurement of buffer occupancy in bytes; it is a time-based projection that factors in the playback durations of the media segments already downloaded and stored in the buffer. This calculation is performed continuously by the DASH client's adaptation engine.
The technical computation of ERT involves the client maintaining a model of its playback buffer. As media segments are downloaded, they are placed into this buffer, each segment having a known playback duration (e.g., 2 seconds, 4 seconds). The ERT is the sum of the playback durations of all complete, playable segments currently residing in the buffer, minus any media time that has already been consumed from the segment currently being played. Advanced implementations may also account for partially downloaded segments or use weighted averaging to smooth out estimates. The client uses this ERT value as a key input to its rate adaptation algorithm.
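The calculation described above can be sketched as follows. This is a minimal illustrative model, not code from the specification; the `BufferedSegment` class, the function name, and the convention that the first buffered segment is the one currently playing are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class BufferedSegment:
    duration: float  # playback duration of this segment, in seconds

def expected_residual_time(buffered, playhead_in_current):
    """Estimate ERT: the sum of the playback durations of all buffered
    segments, minus the media time already consumed from the segment
    currently being played (assumed to be buffered[0])."""
    total = sum(seg.duration for seg in buffered)
    return max(0.0, total - playhead_in_current)

# Three 4-second segments buffered, 1.5 s of the first already rendered:
# ERT = 3 * 4.0 - 1.5 = 10.5 seconds
ert = expected_residual_time([BufferedSegment(4.0)] * 3, 1.5)
```

A production client would extend this along the lines the text mentions, e.g. crediting partially downloaded segments or smoothing successive estimates.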
ERT's role is central to the Quality of Experience (QoE) management for adaptive streaming. The DASH client's primary goal is to select the most appropriate representation (bitrate/quality) for the next segment to download. It must balance quality (higher bitrate) with the risk of buffer starvation (rebuffering). A high ERT indicates a healthy buffer, allowing the client to potentially request a higher-quality segment. A low or rapidly falling ERT signals an imminent risk of buffer underflow, triggering the client to proactively switch to a lower-bitrate representation to ensure the download time of the next segment is shorter than the remaining buffer time, thus preventing playback interruptions.
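The adaptation rule sketched above, namely keeping the next segment's expected download time within the remaining buffer time, might look like this. The function, its parameters, and the fixed safety margin are hypothetical simplifications; real ABR algorithms are implementation-specific and considerably more elaborate.

```python
def pick_bitrate(ert, segment_duration, bitrates_bps, throughput_bps, margin=0.5):
    """Choose the highest representation whose next-segment download is
    expected to complete well before the buffer drains.

    The download-time budget is a fraction (`margin`) of the current ERT,
    leaving headroom for throughput estimation error (illustrative choice).
    """
    budget = margin * ert
    best = min(bitrates_bps)  # fallback: lowest quality is always allowed
    for rate in sorted(bitrates_bps):
        # Expected time to fetch one segment at this representation.
        download_time = rate * segment_duration / throughput_bps
        if download_time <= budget:
            best = rate
    return best

# 10 s of ERT, 4 s segments, 5 Mbps measured throughput:
# budget = 5 s, so the 6 Mbps track (4.8 s download) is the highest safe pick.
choice = pick_bitrate(10.0, 4.0, [1e6, 3e6, 6e6, 1e7], 5e6)
```

Note how a shrinking ERT automatically shrinks the budget and forces the selection down the bitrate ladder, which is exactly the proactive down-switch behavior the paragraph describes.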
Purpose & Motivation
ERT was introduced to address a fundamental challenge in adaptive streaming: making intelligent, forward-looking bitrate decisions based on buffer health, rather than reacting only to past or instantaneous network conditions. Simple metrics like instantaneous throughput or current buffer level in bytes can be misleading and lead to oscillating quality or unexpected rebuffering. For instance, a buffer might contain many bytes of a low-bitrate segment (long duration) or few bytes of a high-bitrate segment (short duration), and the byte count alone doesn't reveal the true playback safety margin.
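A quick worked example makes the point concrete: the same byte occupancy can correspond to very different playback safety margins depending on the bitrate of the buffered media. The numbers below are illustrative only.

```python
# Identical buffer occupancy in bytes for two different representations.
buffer_bytes = 2_000_000  # 2 MB buffered in both cases

low_rate_bytes_per_s = 1_000_000 / 8    # 1 Mbps representation
high_rate_bytes_per_s = 8_000_000 / 8   # 8 Mbps representation

playback_time_low = buffer_bytes / low_rate_bytes_per_s    # 16.0 s of video
playback_time_high = buffer_bytes / high_rate_bytes_per_s  #  2.0 s of video
```

The byte count is identical, yet one buffer sustains playback eight times longer than the other, which is why a time-based metric like ERT is the meaningful quantity for adaptation decisions.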
The creation of ERT was motivated by the need for a standardized, accurate predictor of playback continuity. It provides a common language and metric for QoE optimization algorithms, both in the client and for network-assisted streaming (e.g., in 5G Media Streaming). By estimating the time until buffer exhaustion, the client can make more stable and optimal adaptation decisions. This is especially critical in mobile environments where network bandwidth can be highly variable and unpredictable.
Solving the buffer prediction problem with ERT enables smoother video playback, higher average bitrates without increased rebuffering, and an overall improved user experience. It is a key enabler for the reliable delivery of high-quality streaming services over cellular networks, which is a primary use case for 4G and 5G. Its specification within 3GPP standards ensures interoperability and consistent performance across different device and server implementations.
Key Features
- Time-based estimate of remaining playback duration in the client buffer
- Core input to the DASH client's adaptive bitrate (ABR) logic
- Helps prevent buffer underflow (rebuffering) events
- Enables proactive quality switching based on future buffer state
- Calculated from the playback durations of buffered media segments
- Standardized metric for QoE reporting and network-assisted streaming
Evolution Across Releases
ERT was first introduced in 3GPP Release 8 within the context of Packet-Switched Streaming Service (PSS) and the initial DASH specifications. The initial definition established ERT as a client-side metric for buffer management, providing the foundational algorithm for estimating residual playback time to inform basic adaptation logic in early adaptive streaming services.
A subsequent enhancement brought Server and Network Assisted DASH (SAND) messaging to DASH operation. ERT's role expanded to that of a potential metric reported from the DASH client to network elements (such as the SAND server), enabling network-assisted QoE optimization by allowing the network to consider client buffer health when managing resources.
With the 5G Media Streaming (5GMS) framework introduced in Release 16, ERT was integrated into the 5G system as a key QoE metric for the Media Session Handler and the Application Function, making the network aware of the client's playback buffer status. This enables more advanced network-edge-assisted streaming and quality adaptation, in line with 5G's low-latency and high-reliability service goals.
Defining Specifications
| Specification | Title |
|---|---|
| TS 26.346 | Multimedia Broadcast/Multicast Service (MBMS); Protocols and codecs |
| TS 26.852 | Multimedia Broadcast/Multicast Service (MBMS); Extensions and profiling |
| TS 26.946 | Multimedia Broadcast/Multicast Service (MBMS); User service guidelines |