Description
Reference Sensitivity Degradation (RSD) is a key performance indicator and conformance test parameter defined in 3GPP specifications, notably from Release 17 onwards. It measures how much a receiver's reference sensitivity (the minimum input signal power at which a specified quality target, such as a given block error rate, can be met) is degraded by the presence of an interfering signal. The degradation is expressed in decibels (dB): an RSD of 3 dB, for example, means the receiver needs a wanted signal 3 dB stronger to achieve the same performance it would reach without the interferer. The test involves applying a wanted signal at the reference sensitivity power level, introducing a controlled interferer at a specified frequency offset and power, and measuring the resulting error rate.
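As a back-of-the-envelope illustration, the degradation is simply the difference between the two measured sensitivity levels. The sketch below shows the arithmetic; the function name and the numeric values are illustrative, not taken from any specification:

```python
def rsd_db(refsens_dbm: float, degraded_sens_dbm: float) -> float:
    """RSD: extra wanted-signal power (dB) needed under interference to
    reach the same target error rate as in the interference-free case."""
    return degraded_sens_dbm - refsens_dbm

# Baseline REFSENS of -95.3 dBm; with the interferer applied, the receiver
# needs -92.3 dBm for the same block error rate -> RSD = 3.0 dB.
print(rsd_db(-95.3, -92.3))  # 3.0
```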
Architecturally, RSD is not a network feature but a standardized test methodology and requirement applied to User Equipment (UE) and base station (gNB) receivers. It falls under the radio performance characteristics defined in the RF requirements specifications. The key components in evaluating RSD are the test equipment that generates the precise wanted and interfering signals and the device under test (DUT) whose receiver is being characterized. The 3GPP specifications meticulously define the test conditions: the type of wanted signal (e.g., a specific reference channel), the type of interferer (e.g., an LTE carrier, a 5G NR carrier, or a generic OFDM signal), the frequency separation between them, and the power level of the interferer. The resulting permissible degradation defines the receiver's robustness.
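The procedure can be pictured as a power sweep: raise the wanted signal until the DUT meets the target error rate, first without and then with the interferer. The sketch below substitutes a toy receiver model for real test equipment; the class, function names, and every numeric value are assumptions for illustration, not taken from any 3GPP test specification.

```python
import math

def lin(dbm: float) -> float:
    """Convert dBm to linear milliwatts."""
    return 10 ** (dbm / 10)

class FakeDUT:
    """Toy receiver: BLER is a hard threshold on SINR, and the interferer
    leaks past the channel filter with finite rejection (invented values)."""
    def __init__(self, noise_dbm=-98.0, rejection_db=48.0, snr_req_db=2.0):
        self.noise_dbm, self.rejection_db, self.snr_req_db = noise_dbm, rejection_db, snr_req_db
        self.wanted_dbm, self.interferer_dbm = None, None
    def apply_wanted_signal(self, dbm): self.wanted_dbm = dbm
    def apply_interferer(self, dbm): self.interferer_dbm = dbm
    def measure_bler(self) -> float:
        floor = lin(self.noise_dbm)
        if self.interferer_dbm is not None:
            floor += lin(self.interferer_dbm - self.rejection_db)  # filter leakage
        sinr = self.wanted_dbm - 10 * math.log10(floor)
        return 0.0 if sinr >= self.snr_req_db else 1.0

def measure_sensitivity(dut, target_bler=0.05, start=-110.0, stop=-70.0, step=0.1):
    """Sweep the wanted power upward; the first level meeting the BLER
    target is the (possibly degraded) sensitivity."""
    level = start
    while level <= stop:
        dut.apply_wanted_signal(level)
        if dut.measure_bler() <= target_bler:
            return level
        level += step
    raise RuntimeError("target BLER never met")

dut = FakeDUT()
baseline = measure_sensitivity(dut)           # REFSENS, interference-free
dut.apply_interferer(-50.0)                   # controlled interferer power
degraded = measure_sensitivity(dut)
print(f"RSD = {degraded - baseline:.1f} dB")  # compare against the allowed limit
```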
In the network deployment and operation context, RSD is a fundamental concept for spectrum management and coexistence. Its values, defined in conformance specs, ensure that devices from different vendors have a predictable and acceptable level of performance when deployed in real-world environments with inevitable interference. This is especially critical for spectrum sharing regimes, such as Dynamic Spectrum Sharing (DSS) between 4G and 5G, or operation in shared and unlicensed spectrum (e.g., 5G NR-U). By specifying maximum allowed RSD, 3GPP guarantees that the introduction of a new service (e.g., a 5G carrier) does not catastrophically desense an existing adjacent service (e.g., an LTE carrier). Network planners use RSD models to calculate guard bands and plan carrier frequencies to ensure all services meet their quality-of-service targets.
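As a concrete, though heavily simplified, illustration of that planning step, the sketch below models RSD as a noise rise that falls off with carrier separation and searches for the smallest offset that keeps the degradation inside a budget. The model shape and all numbers are assumptions for illustration only, not spec values.

```python
import math

def rsd_model_db(offset_mhz: float, interferer_dbm=-30.0, noise_dbm=-98.0,
                 base_rejection_db=30.0, rejection_db_per_mhz=3.0) -> float:
    """Noise-rise form of RSD: interferer leakage shrinks as the
    frequency offset grows (toy selectivity model, invented numbers)."""
    leak_dbm = interferer_dbm - (base_rejection_db + rejection_db_per_mhz * offset_mhz)
    floor = 10 ** (noise_dbm / 10) + 10 ** (leak_dbm / 10)
    return 10 * math.log10(floor) - noise_dbm  # dB rise over the clean noise floor

def min_offset_mhz(limit_db=0.5, step_mhz=0.1) -> float:
    """Smallest carrier offset whose predicted RSD stays within the budget."""
    offset = 0.0
    while rsd_model_db(offset) > limit_db:
        offset += step_mhz
    return offset

print(f"minimum offset for a 0.5 dB RSD budget: {min_offset_mhz():.1f} MHz")
```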
Purpose & Motivation
RSD was introduced to formally address the complex interference scenarios that became prevalent with the dense, heterogeneous, and shared spectrum deployments of 5G and beyond. Earlier 3GPP releases had blocking and selectivity requirements, but RSD provides a more nuanced and direct measure of performance degradation in the presence of specific, often in-channel or adjacent-channel, interferers. It solves the problem of quantifying the real-world impact of interference from colocated or adjacent radio systems.
The historical driver was the proliferation of new radio access technologies (RATs) operating in the same frequency bands. For instance, with LTE and NR needing to coexist in the same band via DSS, it was essential to define how much an NR transmission could degrade the sensitivity of an LTE receiver, and vice versa. Without standardized RSD limits, one technology could render another unusable. Similarly, for operation in the 6 GHz unlicensed band for NR-U, devices must coexist not only with other NR-U devices but also with Wi-Fi. RSD requirements ensure fair and predictable coexistence.
Furthermore, as networks evolve towards Open RAN and multi-vendor deployments, standardized receiver performance metrics like RSD are more critical than ever. They provide a clear, testable benchmark that ensures interoperability. A network operator mixing radio units from vendor A with UEs from vendor B can be confident that the system will work as planned if all components meet the 3GPP RSD requirements. It mitigates the risk of vendor-specific receiver implementations being overly susceptible to certain types of interference, which could lead to unexpected coverage holes or capacity loss in a multi-vendor environment.
Key Features
- Quantifies receiver sensitivity degradation in dB due to an interferer
- Defined for various interferer types (e.g., LTE, NR, OFDM) and frequency offsets
- Key parameter for conformance testing of UE and gNB receivers
- Essential for ensuring coexistence in shared spectrum (DSS, NR-U)
- Informs network planning for carrier spacing and guard bands
- Supports reliable operation in dense, heterogeneous network deployments
Evolution Across Releases
3GPP Release 17 formally introduced Reference Sensitivity Degradation as a standardized requirement and test case. It defined specific test configurations and acceptable degradation limits for NR UE receivers in the presence of interference from other NR carriers, LTE carriers, and other defined signals, and it established RSD's role in enabling dynamic spectrum sharing and operation in new shared bands.
Defining Specifications
| Specification | Title |
|---|---|
| TS 23.304 | Proximity based Services (ProSe) in the 5G System (5GS) |
| TR 36.770 | 3GPP TR 36.770 |