Description
The Channel Decoder (CHD) is a core processing block within the physical layer receiver chain of User Equipment (UE) and base stations (e.g., eNB, gNB). Its primary function is the inverse operation of the channel encoder at the transmitter. It takes the soft- or hard-decision values produced by the demodulator (log-likelihood ratios, or LLRs, are the most common soft input) and applies decoding algorithms to reconstruct the original transport block or code block data. This process is essential because the transmitted signal is invariably corrupted by the noise, interference, and fading inherent in the wireless medium. The decoder uses the structured redundancy (parity bits) intentionally added by the channel encoder to detect and correct bit errors, striving to output an error-free version of the information bits sent by the higher layers.
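As a concrete illustration of the soft inputs mentioned above, the following sketch computes LLRs for BPSK symbols received over an AWGN channel. This is a minimal, illustrative model; the function name and sample values are invented for this example.

```python
def bpsk_llr(y, noise_var):
    """LLR = log P(b=0|y) / P(b=1|y) for BPSK (bit 0 -> +1, bit 1 -> -1)
    over an AWGN channel with variance noise_var.
    The closed form is LLR = 2*y / noise_var."""
    return 2.0 * y / noise_var

# Received samples after demodulation (nominal amplitude +/-1 plus noise)
received = [0.9, -1.2, 0.1, -0.05]
llrs = [bpsk_llr(y, noise_var=0.5) for y in received]

# The sign gives the hard decision, the magnitude the reliability:
hard_bits = [0 if l >= 0 else 1 for l in llrs]
```

Note how the last two samples yield small-magnitude LLRs: a soft-input decoder treats them as unreliable, which is precisely the information a hard-decision input discards.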
The operation of the CHD is tightly coupled with the specific channel coding scheme mandated by the 3GPP specifications for different channels and data types. In LTE (from Rel-8), Turbo coding is used for the data channels (e.g., PDSCH, PUSCH) and Tail-Biting Convolutional Coding (TBCC) for control channels (e.g., PDCCH); in 5G NR, LDPC coding replaces Turbo coding for the data channels, while Polar coding is used for control channels (e.g., PDCCH, PBCH). Each scheme requires a dedicated decoder implementation. For instance, a Turbo decoder typically employs an iterative algorithm in which two constituent decoders (often based on the MAP or Log-MAP algorithm) exchange extrinsic information to converge on the most likely transmitted sequence. A Viterbi decoder is used for TBCC, performing maximum-likelihood sequence estimation via a trellis search.
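The trellis search performed by a Viterbi decoder can be sketched for a toy rate-1/2, constraint-length-3 convolutional code (generator polynomials 7 and 5 octal). This is an illustrative hard-decision implementation, not the decoder of any particular 3GPP channel (LTE's TBCC, for comparison, uses constraint length 7 and rate 1/3):

```python
def conv_encode(bits, g=(0b111, 0b101)):
    """Rate-1/2 convolutional encoder, constraint length 3."""
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state              # newest bit in the MSB position
        for gen in g:
            out.append(bin(reg & gen).count("1") % 2)
        state = reg >> 1                    # shift-register update
    return out

def viterbi_decode(received, n_bits, g=(0b111, 0b101)):
    """Hard-decision Viterbi: maximum-likelihood search over the 4-state trellis."""
    INF = float("inf")
    metric = [0, INF, INF, INF]             # start in the all-zero state
    paths = [[] for _ in range(4)]
    for t in range(n_bits):
        r = received[2 * t:2 * t + 2]
        new_metric = [INF] * 4
        new_paths = [None] * 4
        for s in range(4):
            if metric[s] == INF:
                continue
            for b in (0, 1):
                reg = (b << 2) | s
                expected = [bin(reg & gen).count("1") % 2 for gen in g]
                dist = sum(x != y for x, y in zip(expected, r))  # Hamming metric
                ns = reg >> 1
                if metric[s] + dist < new_metric[ns]:
                    new_metric[ns] = metric[s] + dist
                    new_paths[ns] = paths[s] + [b]
        metric, paths = new_metric, new_paths
    best = min(range(4), key=lambda s: metric[s])  # survivor with lowest metric
    return paths[best]

info = [1, 0, 1, 1, 0, 0]                   # last two zeros terminate the trellis
coded = conv_encode(info)
coded[3] ^= 1                               # inject a single channel bit error
decoded = viterbi_decode(coded, len(info))  # recovers the original bits
```

Because this toy code has free distance 5, any single bit error is corrected, as the example demonstrates.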
Architecturally, the CHD sits after the demodulation and descrambling stages and before higher-layer processing modules. Its performance is characterized by metrics like Bit Error Rate (BER), Block Error Rate (BLER), and computational complexity. The choice of decoding algorithm and its implementation (e.g., the number of iterations for Turbo decoding) involves a trade-off between error correction capability, processing latency, and power consumption—critical considerations for UE design. Advanced implementations employ Hybrid Automatic Repeat Request (HARQ) combining, where the decoder combines previously received erroneous versions of a packet with retransmissions to improve decoding success. The CHD is, therefore, not a single entity but a category of decoders whose exact instantiation is determined by the physical channel being processed and the associated 3GPP technical specifications (TS).
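HARQ chase combining, in which identical retransmissions of a packet are soft-combined before decoding, reduces in the LLR domain to an element-wise sum, because the LLRs of independent observations of the same bit add. A minimal sketch (the function name and sample values are invented for illustration):

```python
def chase_combine(llr_lists):
    """Chase combining: LLRs of identical retransmissions add, since the
    noise realizations are independent, so effective SNR grows per rx."""
    return [sum(llrs) for llrs in zip(*llr_lists)]

# Two noisy receptions of the same 4-bit word (true bits 0,1,0,1).
# Each reception, decoded alone, would contain hard-decision errors:
rx1 = [0.4, 0.3, -0.2, -1.0]   # bits 1 and 2 wrong on their own
rx2 = [0.8, -1.1, 0.5, 0.2]    # bit 3 wrong on its own
combined = chase_combine([rx1, rx2])
hard = [0 if l >= 0 else 1 for l in combined]
```

After combining, all four hard decisions are correct even though neither reception was decodable alone, which is the mechanism behind the HARQ gain described above (incremental-redundancy HARQ additionally sends new parity bits rather than repeats).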
Purpose & Motivation
The Channel Decoder exists to ensure reliable digital communication over inherently unreliable wireless channels. Without it, the high bit error rates caused by noise, multipath fading, and interference would make practical data services impossible. Its purpose is to leverage information theory principles, specifically channel coding (forward error correction), to reconstruct the transmitted data with a probability of error that meets the stringent Quality of Service (QoS) requirements of voice, video, and data applications. It solves the fundamental problem of the physical layer: transforming a noisy, analog waveform back into the precise digital bits generated by the source.
Historically, the motivation for advanced channel decoders like the Turbo decoder (introduced in 3G UMTS and carried forward) was to approach the Shannon limit of channel capacity—the theoretical maximum rate of reliable communication over a noisy channel. Prior to Turbo codes, convolutional and Reed-Solomon codes were used, but they left a significant gap to the Shannon limit. The introduction of Turbo coding in 3G represented a major leap, enabling higher data rates for the same signal-to-noise ratio or maintaining rates in poorer channel conditions. For 5G NR, LDPC codes were adopted for the data channels for their high throughput and efficient parallel decoding, while Polar codes were introduced for control channels, where they achieve superior performance at the short block lengths typical of control information—properties also relevant to ultra-reliable low-latency communication (URLLC) scenarios. Thus, the evolution of CHD technology is directly driven by the need for higher spectral efficiency, lower latency, and support for diverse service requirements.
Key Features
- Implements inverse function of channel encoding to recover original information bits
- Supports multiple decoding algorithms (e.g., for Turbo, Convolutional, LDPC, Polar codes) as per 3GPP specs
- Processes soft-decision inputs (Log-Likelihood Ratios) to maximize decoding performance
- Corrects transmission errors introduced by channel noise, fading, and interference
- Enables Hybrid ARQ (HARQ) processes by decoding and combining retransmissions
- Critical for achieving target Block Error Rate (BLER) and link reliability
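The error-correction idea behind the features listed above can be demonstrated with the simplest possible scheme, a (3,1) repetition code decoded by majority vote. This is purely illustrative—3GPP channels use far stronger codes—but it shows how redundancy turns channel bit errors into correctable events:

```python
def repetition_decode(rx_bits, n=3):
    """Majority-vote decoder for an (n,1) repetition code: the simplest
    illustration of using redundancy to correct channel bit errors."""
    decoded = []
    for i in range(0, len(rx_bits), n):
        block = rx_bits[i:i + n]
        decoded.append(1 if sum(block) > n // 2 else 0)
    return decoded

tx = [1, 1, 1, 0, 0, 0]        # info bits 1, 0 encoded by threefold repetition
rx = [1, 0, 1, 0, 0, 1]        # channel flips one bit in each block
info = repetition_decode(rx)   # majority vote corrects both flips
```

Real decoders achieve the same effect far more efficiently: a repetition code triples the bandwidth for modest protection, whereas Turbo, LDPC, and Polar codes approach the Shannon limit at much lower overhead.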
Evolution Across Releases
Introduced in Rel-8 as a fundamental component of LTE. The initial architecture specified Turbo coding (with a QPP interleaver) as the primary scheme for the data channels (DL-SCH, UL-SCH) and Tail-Biting Convolutional Coding (TBCC) for control channels. The Rel-8 CHD enabled the high peak data rates and spectral efficiency targets of LTE, supporting the decoding processes defined in specifications like TS 36.212 (multiplexing and channel coding).
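The QPP (Quadratic Permutation Polynomial) interleaver mentioned above is defined by a one-line polynomial, pi(i) = (f1·i + f2·i²) mod K. A sketch, assuming the TS 36.212 parameters f1=3, f2=10 for the smallest Turbo block size K=40:

```python
def qpp_interleave(K, f1, f2):
    """QPP interleaver of the LTE Turbo code: pi(i) = (f1*i + f2*i^2) mod K.
    f1 and f2 are taken per block size K from TS 36.212 Table 5.1.3-3
    (here K=40 -> f1=3, f2=10, assumed for this sketch)."""
    return [(f1 * i + f2 * i * i) % K for i in range(K)]

perm = qpp_interleave(40, f1=3, f2=10)
# A valid QPP yields a permutation of 0..K-1, i.e. every index appears once.
```

The quadratic form was chosen for LTE because it is contention-free, allowing the parallel MAP decoder windows of a hardware Turbo decoder to access interleaved data without memory conflicts.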
Defining Specifications
| Specification | Title |
|---|---|
| TS 26.071 | Mandatory speech CODEC speech processing functions; AMR speech CODEC; General description |
| TS 26.093 | Mandatory speech codec speech processing functions; Adaptive Multi-Rate (AMR) speech codec; Source controlled rate operation |
| TS 26.171 | Speech codec speech processing functions; Adaptive Multi-Rate - Wideband (AMR-WB) speech codec; General description |
| TS 26.193 | Speech codec speech processing functions; Adaptive Multi-Rate - Wideband (AMR-WB) speech codec; Source controlled rate operation |