Description
Detection Miss Probability (DMP) is a statistical metric used to evaluate the performance of detection algorithms at the physical layer of wireless communication systems like LTE and 5G New Radio (NR). It quantifies the likelihood that a receiver (UE or base station) fails to detect the presence of a specific signal or fails to correctly identify its parameters when it is actually present. Common applications include the detection of synchronization signals (e.g., Primary Synchronization Signal (PSS)/Secondary Synchronization Signal (SSS) in NR), reference signals (e.g., Channel State Information Reference Signal (CSI-RS)), or specific control channel elements.
The calculation of DMP is inherently linked to the receiver's signal processing chain. When a signal is transmitted with a certain power and structure, the receiver samples the radio channel and applies filtering, correlation, and hypothesis testing. DMP is the probability that, despite the signal being present and above a certain signal-to-noise ratio (SNR) threshold, the receiver's detection algorithm outputs a 'miss' (i.e., fails to declare detection). A miss can result from channel impairments (fading, interference), noise, or limitations in the detection algorithm's sensitivity and threshold settings.
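The processing chain above can be sketched as a Monte Carlo estimate of DMP for a simple normalized-correlation detector in AWGN. The sequence, SNR, and threshold below are illustrative assumptions, not values taken from any 3GPP specification.

```python
# Monte Carlo estimate of Detection Miss Probability (DMP) for a
# normalized-correlation detector in AWGN. Sequence length, SNR, and
# threshold are illustrative choices, not 3GPP-specified values.
import numpy as np

rng = np.random.default_rng(0)

N = 127                      # sequence length (the NR PSS spans 127 subcarriers)
n = np.arange(N)
seq = np.exp(-1j * np.pi * 25 * n * (n + 1) / N)  # Zadoff-Chu-style sequence

snr_db = 0.0
noise_std = np.sqrt(10 ** (-snr_db / 10) / 2)     # per-dimension, unit-power signal
threshold = 0.5              # normalized correlation threshold (assumed)

trials = 20000
noise = noise_std * (rng.standard_normal((trials, N))
                     + 1j * rng.standard_normal((trials, N)))
rx = seq + noise             # the signal is present in every trial

# Normalized matched-filter statistic, one value per trial
stat = np.abs(rx @ seq.conj()) / (np.linalg.norm(seq) * np.linalg.norm(rx, axis=1))
dmp = np.mean(stat < threshold)   # fraction of trials declared 'miss'
print(f"Estimated DMP at {snr_db} dB SNR: {dmp:.4f}")
```

Sweeping `snr_db` downward in such a simulation traces the familiar detection curve: DMP rises sharply once the correlation peak is no longer separable from the noise floor.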
In system design and standardization, DMP is a crucial parameter for defining requirements. 3GPP technical specifications (such as TS 38.151 for NR UE over-the-air performance) define minimum performance requirements for UE and base station receivers, often specifying maximum allowable DMP values under defined channel conditions (e.g., Additive White Gaussian Noise (AWGN) or fading channels). For example, a specification may state that, at a given SNR, the DMP for detecting the NR PSS during initial cell search must be below 1%. These requirements ensure interoperability and a baseline level of network coverage and reliability.
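Verifying a "DMP below 1%" style requirement from a finite measurement campaign is a statistical decision. The following hypothetical check uses an exact one-sided binomial test; the 1% target and 95% confidence level are assumed for illustration and are not drawn from a specific 3GPP test specification.

```python
# Hypothetical conformance check: decide whether an observed miss count
# meets a DMP requirement (e.g., < 1%) with an exact one-sided binomial
# test. The target and confidence level are assumptions for illustration.
from math import comb

def meets_dmp_requirement(misses, trials, target=0.01, confidence=0.95):
    """Pass if we can reject 'true DMP >= target' at the given confidence.

    Computes P(X <= misses) under Binomial(trials, target); a small tail
    probability means the observed miss count would be unlikely if the
    true DMP were as high as the target.
    """
    p_tail = sum(comb(trials, k) * target**k * (1 - target)**(trials - k)
                 for k in range(misses + 1))
    return p_tail <= (1 - confidence)

print(meets_dmp_requirement(misses=2, trials=1000))   # few misses: passes
print(meets_dmp_requirement(misses=15, trials=1000))  # above 1%: fails
```

The exact test matters at low target probabilities, where the normal approximation to the binomial distribution is poor and the number of trials required to resolve the requirement grows large.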
DMP plays a vital role in higher-layer procedures. A missed detection of a synchronization signal can delay or prevent cell attachment. A missed detection of a beam-specific reference signal can lead to failed beamforming and reduced throughput. Therefore, optimizing physical layer design to minimize DMP—through robust signal design, advanced receiver algorithms (e.g., enhanced filtering, machine learning-based detection), and sufficient transmission power—is fundamental to achieving the low-latency and high-reliability targets of 5G and beyond.
Purpose & Motivation
DMP exists as a fundamental metric to quantify and standardize the reliability of low-level signal detection in noisy and variable wireless environments. Before its formalization in performance requirements, system designers relied on qualitative assessments or simpler metrics like bit error rate (BER), which did not fully capture the 'detection' event critical for initial access and control signaling. The need arose for a probability-based metric that directly relates to system availability and access latency.
It addresses the problem of ensuring consistent and predictable network performance at the edge of coverage or in challenging radio conditions. By setting standardized DMP requirements, 3GPP ensures that all compliant UEs and base stations have a minimum detection capability. This capability is essential for mobility (handover success), for cell discovery in dense or heterogeneous networks, and for advanced features such as beamforming in 5G mmWave bands, where directional signals are more susceptible to misalignment and blockage. Historically, as systems evolved from LTE to 5G NR, with its complex beam management and wider bandwidths, accurately specifying and testing DMP became even more critical to guarantee the performance gains promised by new physical layer technologies.
Key Features
- Quantifies the probability of a receiver failing to detect a known transmitted signal when it is present.
- Central to defining RF conformance test requirements for UE and base station receivers in 3GPP specifications.
- Applied to critical physical layer signals including synchronization signals (PSS/SSS), reference signals (CSI-RS, SRS), and control channel elements.
- Evaluation is performed under standardized channel models (e.g., AWGN, fading profiles like TDL, CDL) and SNR conditions.
- Directly impacts higher-layer performance metrics like cell search time, handover failure rate, and beam failure recovery time.
- Used in link-level simulations and system design to optimize signal design, receiver algorithms, and transmission parameters for robustness.
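The interplay between threshold settings and miss probability noted in the features above can be sketched as follows: the detection threshold is calibrated on noise-only (signal-absent) trials to hit a target false-alarm probability, and the resulting DMP is then measured with the signal present. All parameters here are illustrative assumptions, not standardized values.

```python
# Illustrative false-alarm / miss trade-off for a normalized-correlation
# detector: calibrate the threshold on signal-absent trials for a target
# false-alarm probability, then measure DMP with the signal present.
# Every parameter below is an assumption, not a 3GPP value.
import numpy as np

rng = np.random.default_rng(1)
N, snr_db, p_fa_target, trials = 127, -12.0, 1e-2, 20000
noise_std = np.sqrt(10 ** (-snr_db / 10) / 2)
n = np.arange(N)
seq = np.exp(1j * np.pi * n * n / N)   # unit-modulus probe sequence (assumed)

def stats(rx):
    """Normalized correlation statistic for each row of rx."""
    num = np.abs(rx @ seq.conj())
    return num / (np.linalg.norm(seq) * np.linalg.norm(rx, axis=1))

def noise(count):
    return noise_std * (rng.standard_normal((count, N))
                        + 1j * rng.standard_normal((count, N)))

# Calibrate: the threshold is the (1 - P_FA) quantile of the statistic
# when only noise is received (signal-absent hypothesis).
threshold = np.quantile(stats(noise(trials)), 1 - p_fa_target)

# Measure DMP with the signal present at the chosen threshold.
dmp = np.mean(stats(seq + noise(trials)) < threshold)
print(f"threshold={threshold:.3f}, estimated DMP={dmp:.4f}")
```

Lowering the false-alarm target pushes the threshold up and raises DMP at a fixed SNR, which is exactly the trade-off that standardized test conditions pin down.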
Evolution Across Releases
LTE-Advanced (Release 10 and later): Formalized detection performance metrics, including Detection Miss Probability, within LTE-Advanced specifications for enhanced receiver requirements and performance testing, particularly for new reference signals and in support of carrier aggregation.
Release 15: Introduced comprehensive DMP requirements for 5G NR physical layer signals as part of the first NR specification set. Defined DMP for NR synchronization signal blocks (SSBs) for initial access, and for CSI-RS used in beam management and tracking in both sub-6 GHz and mmWave bands.
Release 16: Extended DMP specifications to Integrated Access and Backhaul (IAB), defining detection requirements for discovery signals on backhaul links. Enhanced requirements for ultra-reliable low-latency communication (URLLC) scenarios, demanding lower DMP for control channels.
Defining Specifications
| Specification | Reference |
|---|---|
| TR 36.902 | 3GPP TR 36.902 |
| TS 37.544 | 3GPP TS 37.544 |
| TS 38.151 | 3GPP TS 38.151 |
| TS 38.551 | 3GPP TS 38.551 |
| TS 38.761 | 3GPP TS 38.761 |
| TS 38.762 | 3GPP TS 38.762 |
| TR 38.827 | 3GPP TR 38.827 |