TRS

Total Radiated Sensitivity

Physical Layer
Introduced in Rel-7
A key performance metric for wireless devices, measuring the overall receiver sensitivity by accounting for the combined effect of the antenna and the RF receiver chain. It quantifies the weakest signal a device can reliably receive from any direction in three-dimensional space, critical for real-world performance.

Description

Total Radiated Sensitivity (TRS), also referred to as Total Isotropic Sensitivity (TIS), is a comprehensive Over-The-Air (OTA) performance metric defined in 3GPP specifications for evaluating the receiver performance of User Equipment (UE) and, in some contexts, base stations. Unlike traditional conducted sensitivity measurements, which test the RF receiver port in isolation, TRS measures the sensitivity of the entire device as a system, including the effects of the antenna(s), antenna efficiency, radiation pattern, and the receiver circuitry itself. The measurement is performed in an anechoic chamber using a calibrated test system that illuminates the device under test with a known, controlled RF signal from various angles of arrival.

The measurement procedure involves placing the UE on a positioning system that rotates it through spherical coordinates (azimuth and elevation). At each orientation, and for each measurement-antenna polarization, the test system transmits a reference measurement channel (e.g., carrying the Physical Downlink Shared Channel (PDSCH) for downlink sensitivity) and steps its power down while the UE's receiver attempts to decode it, with the Bit Error Rate (BER) or Block Error Rate (BLER) measured at each level. The lowest power at which the UE still meets a predefined performance criterion (e.g., a maximum allowed BLER) is the sensitivity, often called the Effective Isotropic Sensitivity (EIS), for that direction and polarization. TRS is then computed as a weighted average of these per-direction sensitivities over the entire sphere and is usually expressed in dBm. The result is a single figure of merit that captures the device's ability to receive weak signals from any direction, which is how devices operate in real environments with multipath and changing orientation.
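
The per-orientation power search can be sketched as follows. This is an illustrative sketch only: the `measure_bler` callback is a hypothetical stand-in for the chamber/test-system interface, and the simple step-down strategy, step size, and thresholds are assumptions, not the conformance-defined procedure.

```python
def eis_at_orientation(measure_bler, p_start_dbm=-60.0, step_db=0.5,
                       target_bler=0.01, floor_dbm=-130.0):
    """Step the downlink power down until the error rate exceeds the
    target; the last passing level approximates the per-direction
    sensitivity (EIS) for the current orientation and polarization.

    measure_bler: hypothetical callback, power (dBm) -> measured BLER.
    Returns the sensitivity in dBm, or None if even the start level fails.
    """
    p = p_start_dbm
    last_pass = None
    while p >= floor_dbm:
        if measure_bler(p) <= target_bler:
            last_pass = p      # UE still decodes reliably at this level
            p -= step_db       # try a weaker signal
        else:
            break              # criterion violated: sensitivity found
    return last_pass
```

Repeating this search over a grid of azimuth/elevation angles, for both measurement polarizations, yields the per-direction sensitivity surface from which TRS is computed.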

Mathematically, TRS is derived from the spherical integration of the sensitivity measured at each point on the radiation sphere. It inherently factors in antenna gain pattern imperfections and losses. A device with excellent conducted sensitivity but a poor, inefficient antenna will have a degraded TRS. This makes TRS a far more realistic indicator of real-world performance, especially for handheld devices whose orientation relative to a base station is unpredictable. 3GPP specifications define detailed test setups, calibration methods, and performance requirements for TRS across different frequency bands and technologies (LTE, NR). It is a mandatory conformance test for UE certification, ensuring a baseline level of receiver performance for all commercially deployed devices.
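
A common way to write this spherical integration (following the convention used in CTIA/3GPP OTA test methodology, with EIS in linear power units and the theta- and phi-polarized components measured separately) is:

```latex
\mathrm{TRS} = \frac{4\pi}{\displaystyle\int_{0}^{2\pi}\!\!\int_{0}^{\pi}
  \left[\frac{1}{\mathrm{EIS}_{\theta}(\theta,\varphi)}
      + \frac{1}{\mathrm{EIS}_{\varphi}(\theta,\varphi)}\right]
  \sin\theta \, d\theta \, d\varphi}
```

Because reciprocals are averaged, the result behaves like a harmonic mean: directions in which the device is most sensitive dominate the figure, mirroring how Total Radiated Power arithmetically averages EIRP over the sphere.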

Purpose & Motivation

TRS was introduced to address a significant gap in device performance evaluation: the disconnect between ideal lab-based conducted measurements and real-world user experience. Conducted sensitivity tests, which connect a cable directly to the device's RF port, bypass the antenna. This method fails to account for antenna design trade-offs, integration challenges, and the impact of the device's casing and user's hand (handgrip effect) on reception. A device could pass conducted tests but perform poorly in actual use due to a suboptimal antenna.

The creation of TRS as a standardized OTA metric was motivated by the need to guarantee minimum real-world receiver performance for end-users, ensuring reliable network connectivity and consistent quality of service. It became increasingly critical with the proliferation of compact form-factor devices (smartphones, IoT modules, wearables) where antenna design is severely constrained by size, industrial design, and the presence of multiple radios (2G/3G/4G/5G, Wi-Fi, Bluetooth, GNSS). TRS ensures that manufacturers optimize the entire receive chain, not just the RF chipset.

Furthermore, as networks deployed advanced techniques like MIMO and carrier aggregation, which rely on multiple antennas, ensuring the performance of each antenna path became vital. TRS testing, often performed per antenna port or in MIMO configurations (e.g., Total Radiated Sensitivity for MIMO), helps validate that diversity and MIMO gains are achievable in practice. It is a key tool for network operators during device acceptance testing to avoid deploying devices that would degrade network performance by requiring higher base station transmit power to compensate for poor UE reception, thereby reducing overall network capacity and coverage.

Key Features

  • Measures overall receiver sensitivity including antenna effects via Over-The-Air testing
  • Provides a single figure of merit (dBm) representing spherical integrated sensitivity
  • Essential for real-world performance validation, beyond ideal conducted tests
  • Mandatory conformance test for 3GPP UE certification across multiple releases
  • Accounts for device orientation and antenna radiation pattern imperfections
  • Critical for evaluating performance of devices with multiple antennas and MIMO capabilities
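
The spherical averaging behind the single dBm figure, TRS = 4*pi divided by the sphere integral of (1/EIS_theta + 1/EIS_phi), can be evaluated numerically. The sketch below assumes per-direction EIS samples (in dBm) are available from the chamber measurement; the function and parameter names are illustrative.

```python
import math

def trs_dbm(eis_theta, eis_phi, n_theta=36, n_phi=72):
    """Approximate TRS = 4*pi / oint (1/EIS_theta + 1/EIS_phi) dOmega
    by the midpoint rule on an n_theta x n_phi angular grid.

    eis_theta, eis_phi: callables (theta, phi) -> EIS in dBm for the
    theta- and phi-polarized measurement antennas. Returns TRS in dBm.
    """
    d_th = math.pi / n_theta
    d_ph = 2.0 * math.pi / n_phi
    integral = 0.0
    for i in range(n_theta):
        th = (i + 0.5) * d_th
        for j in range(n_phi):
            ph = (j + 0.5) * d_ph
            s_th = 10.0 ** (eis_theta(th, ph) / 10.0)  # dBm -> mW
            s_ph = 10.0 ** (eis_phi(th, ph) / 10.0)
            integral += (1.0 / s_th + 1.0 / s_ph) * math.sin(th) * d_th * d_ph
    return 10.0 * math.log10(4.0 * math.pi / integral)

# A uniform -90 dBm EIS on both polarizations combines to roughly -93 dBm,
# i.e. 3 dB better than either polarization alone.
```

The sin(theta) weight accounts for the solid angle each sample covers, so measurement points near the poles do not get disproportionate influence.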

Evolution Across Releases

Rel-7 Initial

Initially introduced for UMTS (WCDMA) UE performance characterization. Defined fundamental OTA measurement concepts and methodologies for TRS in anechoic chamber setups. Established it as a key metric for evaluating the radiated receiver performance of handheld devices.

Rel-8 (LTE)

Enhanced test methodologies and introduced requirements for multi-antenna reception scenarios, aligning with the deployment of MIMO in LTE. Started to address the measurement complexities for devices with multiple receive antennas.

Rel-15 (5G NR)

Updated and extended TRS requirements for 5G NR devices, covering new frequency ranges (FR1 and FR2). Introduced specific test models and reference channels for NR. Addressed challenges in measuring TRS for mmWave devices with beamforming capabilities, though primary beamformed OTA tests were defined separately.

Rel-16

Further refined NR TRS test procedures, including for UE with uplink MIMO and supplementary uplink (SUL). Enhanced support for testing in wider bandwidths and with carrier aggregation configurations.

Rel-17

Continued evolution to cover new NR features and bands. Addressed testing for reduced capability (RedCap) NR devices, ensuring appropriate TRS requirements for IoT-type terminals. Refinements for consistency across different device form factors.

Rel-18 (5G-Advanced)

Ongoing updates to TRS specifications to keep pace with 5G-Advanced features, potentially including enhancements for integrated sensing and communication and further refined testing for advanced antenna systems.

Defining Specifications

Specification
3GPP TS 22.889
3GPP TS 22.989
3GPP TS 25.144
3GPP TR 25.914
3GPP TS 37.144
3GPP TR 37.544
3GPP TR 37.902
3GPP TS 38.106
3GPP TS 38.161
3GPP TS 38.212
3GPP TS 38.300
3GPP TS 38.304
3GPP TS 38.321
3GPP TS 38.561
3GPP TR 38.834
3GPP TR 38.870