ESR

Erroneous Seconds Ratio

Management
Introduced in Rel-10
A key performance indicator (KPI) used in 3GPP Self-Organizing Networks (SON) and minimization of drive tests (MDT). It measures the ratio of seconds with transmission errors to total seconds, providing a granular view of radio link quality for user experience and network optimization.

Description

The Erroneous Seconds Ratio (ESR) is a standardized measurement defined in 3GPP TS 26.904 for perceptual voice and video quality assessment and, more broadly, for radio quality monitoring. It is a time-based metric that quantifies the prevalence of transmission errors affecting a service flow, typically a voice call or a video stream. An 'Erroneous Second' is defined as any one-second period during which at least one frame or block error occurs. The ESR is then calculated as the number of Erroneous Seconds divided by the total duration of the measurement period in seconds.
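The calculation above can be sketched directly from per-second error counts. This is a minimal illustration of the definition, not an implementation from any 3GPP specification; the function name and input format are assumptions.

```python
def erroneous_seconds_ratio(per_second_error_counts):
    """Compute ESR from a sequence of per-second error counts.

    Each element is the number of frame/block errors observed during one
    second of the measurement period. A second counts as 'erroneous' if
    at least one error occurred in it, regardless of how many.
    """
    total = len(per_second_error_counts)
    if total == 0:
        raise ValueError("measurement period must contain at least one second")
    erroneous = sum(1 for n in per_second_error_counts if n > 0)
    return erroneous / total

# 10-second window with errors in two of the seconds -> ESR = 2/10
print(erroneous_seconds_ratio([0, 0, 3, 0, 0, 0, 0, 1, 0, 0]))  # 0.2
```

Note that a second with three errors and a second with one error contribute equally; ESR deliberately measures the spread of errors in time, not their total count.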

In practical operation within a 3GPP network, the ESR can be measured at different points. For voice services, it is closely related to the E-model and other ITU-T quality metrics, where error patterns directly impact perceived quality (MOS score). For MDT and SON use cases, the User Equipment (UE) can be configured to log radio measurements along with application-layer quality metrics like ESR. The UE monitors its downlink (and potentially uplink) radio blocks or protocol data units (PDUs). If an error (e.g., a failed Hybrid Automatic Repeat Request (HARQ) process, a Radio Link Control (RLC) unrecoverable error, or an IP packet loss) is detected within a given second, that second is flagged as erroneous.
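The flagging step amounts to bucketing timestamped error events into one-second intervals. A simplified sketch, with an assumed event/timestamp representation (real UE implementations operate on radio frame timing, not wall-clock floats):

```python
import math

def flag_erroneous_seconds(error_timestamps, window_start, window_end):
    """Map timestamped error events (e.g., HARQ failure, RLC error,
    packet loss) onto one-second buckets within a measurement window.

    Returns the set of erroneous second indices and the window duration
    in whole seconds. Timestamps are in seconds on the same clock as
    the window boundaries.
    """
    duration = int(math.ceil(window_end - window_start))
    flagged = set()
    for t in error_timestamps:
        if window_start <= t < window_end:
            flagged.add(int(t - window_start))  # index of the second containing t
    return flagged, duration

# Three error events; two fall within the same second (index 4),
# so only two seconds out of ten are erroneous.
flagged, duration = flag_erroneous_seconds([1.2, 4.1, 4.9], 0.0, 10.0)
print(sorted(flagged), duration)  # [1, 4] 10
```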

These measurements are collected by the network's management system, such as the Network Management System (NMS) or a SON server. The granular ESR data, often tagged with location (from UE GPS or network-based positioning) and other context such as cell ID, provides a powerful tool for network engineers. By analyzing ESR maps or trends, they can identify geographic areas with poor radio conditions, cells with interference issues, or mobility paths with frequent handover failures. Unlike simpler metrics such as average bit error rate (BER), the second-by-second nature of ESR makes it sensitive to burst errors and short-duration outages that significantly degrade user experience for real-time services.
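The per-cell analysis described above can be sketched as a simple aggregation over logged records. The record format, field names, and the 5% threshold here are illustrative assumptions, not values from any 3GPP specification:

```python
from collections import defaultdict

# Hypothetical MDT-style log records: (cell_id, erroneous_seconds, total_seconds)
records = [
    ("cell-101", 2, 60),
    ("cell-101", 5, 120),
    ("cell-202", 30, 90),
    ("cell-202", 12, 60),
]

def cells_exceeding_esr(records, threshold=0.05):
    """Aggregate per-record counts into a per-cell ESR and return the
    cells whose aggregate ESR exceeds the given threshold."""
    err = defaultdict(int)
    tot = defaultdict(int)
    for cell, e, t in records:
        err[cell] += e
        tot[cell] += t
    return {c: err[c] / tot[c] for c in tot if err[c] / tot[c] > threshold}

# cell-101: 7/180 ~ 0.039 (below threshold); cell-202: 42/150 = 0.28 (flagged)
print(cells_exceeding_esr(records))  # {'cell-202': 0.28}
```

Summing erroneous and total seconds before dividing (rather than averaging per-record ratios) weights each record by its measurement duration, which avoids short logs skewing the per-cell figure.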

The ESR is a component in a larger framework of quality of experience (QoE) measurements. It works in conjunction with other KPIs like throughput, delay, and jitter to give a holistic view of service performance. Its standardization in 3GPP ensures that measurements collected by UEs from different vendors are comparable, enabling effective automated and manual network optimization across multi-vendor deployments.

Purpose & Motivation

The ESR was standardized to address the need for accurate, user-centric quality measurements in mobile networks, moving beyond simple network-centric metrics like signal strength (RSRP) or channel quality (CQI). Traditional drive tests for network optimization were expensive, sporadic, and could not capture the true user experience under all conditions. The Minimization of Drive Tests (MDT) feature, introduced around 3GPP Release 10, aimed to leverage the vast number of commercial UEs as always-on measurement probes.

ESR provides a direct, quantifiable link between the physical/radio layer impairments and the perceived service quality, especially for delay-sensitive conversational services like Voice over LTE (VoLTE). Before such standardized application-aware metrics, operators struggled to pinpoint areas where, despite adequate signal coverage, intermittent errors caused poor call quality. ESR fills this gap by measuring the error events in the time domain as experienced by the application.

Its creation was motivated by the industry's shift towards data-driven, automated network optimization (SON) and the need to guarantee quality for premium services like high-definition voice and video calling. By collecting ESR from UEs in the field, operators can build precise, dynamic maps of service quality, automatically trigger optimization algorithms (e.g., adjusting antenna tilts, power settings, or handover parameters), and proactively identify degrading network elements before they affect a large number of subscribers. This improves overall customer satisfaction and reduces operational costs.

Key Features

  • Time-based metric measuring the density of error events (Erroneous Seconds per total seconds)
  • Directly correlates radio/link layer errors with application-layer quality of experience (QoE)
  • Used as a key input for MDT (Minimization of Drive Tests) and SON (Self-Organizing Networks) functionalities
  • Can be measured and logged by the UE, providing granular, location-tagged performance data
  • Standardized definition ensures multi-vendor interoperability for measurement collection and analysis
  • Particularly relevant for assessing quality of real-time services like VoLTE and ViLTE

Evolution Across Releases

Rel-10 Initial

Introduced in TS 26.904 as part of the enhanced framework for MDT and QoE measurements. Defined the fundamental concept of an Erroneous Second and the ESR calculation. Specified its applicability for voice service quality monitoring and integration into UE measurement logging for network management and optimization.

Defining Specifications

Specification    Title
TS 26.904        3GPP TS 26.904