FER

Frame Erasure Rate / Frame Error Rate

Physical Layer
Introduced in R99
Frame Erasure Rate (FER), also called Frame Error Rate, is the measured ratio of erroneous or discarded data frames to the total number of transmitted frames. It is a critical real-world Key Performance Indicator (KPI) for assessing the quality and reliability of a radio link across all 3GPP technologies, from UMTS to 5G NR.

Description

The Frame Erasure Rate (FER), also commonly referred to as the Frame Error Rate, is a fundamental measured performance metric in 3GPP systems. It is defined as the ratio of the number of data frames received with uncorrectable errors (and therefore typically discarded, or 'erased') to the total number of frames transmitted over a given period. Unlike predictive link-quality estimates, FER is an empirical, after-the-fact measurement of actual link performance. The receiver (either the User Equipment or the base station) counts a frame as erased when cyclic redundancy check (CRC) verification fails for the received transport block or frame.
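Concretely, the computation is just a CRC pass/fail count at the receiver. The sketch below is a minimal illustration in Python, with `zlib.crc32` standing in for the actual 3GPP CRC polynomials; `frame_ok` and `frame_erasure_rate` are illustrative names, not functions defined by any specification:

```python
import zlib

def frame_ok(payload: bytes, received_crc: int) -> bool:
    # A frame counts as erased when the CRC computed over the received
    # payload does not match the CRC carried with the frame.
    return zlib.crc32(payload) == received_crc

def frame_erasure_rate(frames) -> float:
    """frames: list of (payload, received_crc) tuples observed in a period."""
    if not frames:
        return 0.0
    erased = sum(0 if frame_ok(p, c) else 1 for p, c in frames)
    return erased / len(frames)

good = (b"voice", zlib.crc32(b"voice"))
bad = (b"voice", 0xDEADBEEF)          # corrupted CRC -> erasure
print(frame_erasure_rate([good, good, good, bad]))  # 0.25
```

In a real receiver the CRC check happens after channel decoding, so FER reflects residual errors the forward error correction could not repair.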

The measurement of FER occurs at several protocol layers, most notably at the physical layer for transport blocks and at the Radio Link Control (RLC) layer for data packets. At the physical layer, a block error rate (BLER) is often measured, which is conceptually equivalent to FER for a given transport block size. The network uses FER measurements reported by the UE (e.g., Channel Quality Indicator (CQI) reports or out-of-sync indications) together with its own measurements to make critical radio resource management decisions: triggering handovers, adjusting the modulation and coding scheme (MCS) via link adaptation, and modifying power control targets.
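As a rough illustration of how a measured BLER can drive link adaptation, the sketch below steps an MCS index down when BLER exceeds a 10% operating target and up when the link has headroom. The thresholds, step sizes, index range, and the `adapt_mcs` name are all assumptions for illustration, not values taken from any 3GPP specification:

```python
def adapt_mcs(mcs: int, bler: float, target: float = 0.10,
              mcs_min: int = 0, mcs_max: int = 28) -> int:
    # Step down to a more robust MCS when the measured BLER exceeds
    # the target; step up when the BLER is well below it.
    if bler > target:
        return max(mcs_min, mcs - 1)
    if bler < target / 2:
        return min(mcs_max, mcs + 1)
    return mcs  # within the deadband: hold the current MCS

print(adapt_mcs(10, bler=0.20))  # 9  (too many errors -> more robust MCS)
print(adapt_mcs(10, bler=0.01))  # 11 (headroom -> higher throughput MCS)
```

Real schedulers use filtered BLER estimates and per-UE hysteresis, but the feedback loop has this basic shape.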

FER requirements are spread across numerous 3GPP documents covering service requirements (22-series), technical specifications (25-series for UTRA, 38-series for NR), and performance aspects. These specifications define target FER values for different services (e.g., voice, video, data) under various channel conditions; for circuit-switched voice, for example, a FER below 1% is typically targeted to maintain toll quality. FER is a direct driver of user-perceived quality: a high FER results in garbled audio, frozen video, or reduced data throughput due to retransmissions and TCP congestion control. Continuous monitoring and minimization of FER is therefore a primary goal of radio access network operation and optimization.
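A simple way to track such a target in practice is a sliding-window erasure counter. The sketch below uses a hypothetical `FerMonitor` class; the window size and the 1% voice target are illustrative defaults, not normative values:

```python
from collections import deque

class FerMonitor:
    """Track FER over the last `window` frames against a target ratio."""

    def __init__(self, window: int = 1000, target: float = 0.01):
        self.results = deque(maxlen=window)  # True = frame erased
        self.target = target

    def record(self, erased: bool) -> None:
        self.results.append(erased)

    @property
    def fer(self) -> float:
        if not self.results:
            return 0.0
        return sum(self.results) / len(self.results)

    def meets_target(self) -> bool:
        return self.fer <= self.target

mon = FerMonitor(window=100)
for _ in range(99):
    mon.record(False)
mon.record(True)
print(mon.fer, mon.meets_target())  # 0.01 True
```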

Purpose & Motivation

FER exists as a universal, tangible metric to quantify the success rate of data transmission over the inherently unreliable wireless medium. Its purpose is to provide network operators, equipment vendors, and standardization bodies with a common, measurable gauge of link quality. This allows for performance benchmarking, troubleshooting, and ensuring that defined quality of service (QoS) levels are met. It addresses the fundamental challenge of translating physical layer impairments (noise, interference, fading) into a service-impact metric that can be used for system control.

Historically, as cellular technology evolved from analog to digital (GSM), frame-based transmission necessitated an error rate metric defined per frame. With the introduction of packet-switched services in GPRS, UMTS, and beyond, the importance of FER grew, as data services are more sensitive to residual errors than voice. It provided an objective, low-layer measure of reliability that could be tied directly to higher-layer protocols (such as TCP) and to user experience. The specification of FER targets across releases ensures backward compatibility and forward-looking performance goals, driving continuous improvement in receiver design, coding techniques, and network algorithms to achieve lower FER and thus higher spectral efficiency.

Key Features

  • Empirically measured ratio of erroneous frames to total frames
  • Core KPI for radio link quality assessment across 3GPP generations
  • Used as feedback for power control and link adaptation algorithms
  • Defines QoS targets for different service types (e.g., voice, video)
  • Measured at multiple layers (Physical layer BLER, RLC layer)
  • Directly impacts user experience and effective data throughput

Evolution Across Releases

R99 Initial

FER was established as a core performance metric for the new WCDMA-based UMTS system. Specifications defined FER requirements for dedicated channels (DCH), common channels, and for the acceptance testing of User Equipment. It was integral to the new fast power control and soft handover mechanisms.
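The tie to fast power control is often illustrated by the classic "jump" outer-loop algorithm: the SIR target is raised by a fixed step on each erased frame and lowered by a proportionally smaller step on each good frame, so the long-run erasure rate converges toward the FER target. A minimal sketch, where the step size and the `olpc_step` name are illustrative assumptions rather than values from the specifications:

```python
def olpc_step(sir_target_db: float, frame_erased: bool,
              fer_target: float = 0.01, up_db: float = 0.5) -> float:
    # "Jump" outer-loop power control: the down-step is chosen so the
    # loop is in equilibrium when erasures occur at exactly fer_target.
    down_db = up_db * fer_target / (1.0 - fer_target)
    if frame_erased:
        return sir_target_db + up_db
    return sir_target_db - down_db

print(olpc_step(6.0, frame_erased=True))   # 6.5
```

The inner fast power control loop then drives transmit power to meet whatever SIR target this outer loop sets.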

Defining Specifications

Specification
3GPP TS 21.905
3GPP TS 22.105
3GPP TS 23.107
3GPP TS 23.171
3GPP TS 23.207
3GPP TS 23.271
3GPP TS 25.101
3GPP TS 25.102
3GPP TS 25.103
3GPP TS 25.104
3GPP TS 25.105
3GPP TS 25.123
3GPP TS 25.133
3GPP TS 25.141
3GPP TS 25.201
3GPP TS 25.212
3GPP TS 25.222
3GPP TS 26.935
3GPP TS 26.936
3GPP TS 26.952
3GPP TS 29.116
3GPP TR 45.903
3GPP TR 45.913
3GPP TR 45.914