LIDAR

Light Detection and Ranging

Other
Introduced in Rel-16
A remote sensing technology that uses pulsed laser light to measure distances and create precise, high-resolution 3D maps of the environment. In the 3GPP context, it is studied as a key sensor for Connected Automated Vehicles (CAVs) and Advanced Driver Assistance Systems (ADAS), providing critical data for object detection, localization, and navigation.

Description

Light Detection and Ranging (LIDAR) is an active optical sensing technology that measures distance by illuminating a target with laser light and analyzing the reflected signal. A typical LIDAR system consists of a laser emitter, a sensitive photodetector (receiver), and a precision scanning mechanism (often rotating or solid-state). It operates by emitting short, focused pulses of infrared light. The time delay between the transmitted pulse and the detection of its reflection (time-of-flight) is measured with extreme accuracy. Since the speed of light is constant, the distance to the reflecting object follows directly from the time-of-flight: d = c·t/2, where the factor of two accounts for the round trip.
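The time-of-flight relationship above can be sketched in a few lines; the 200 ns example value is illustrative, not a figure from any sensor datasheet.

```python
# Time-of-flight ranging: a minimal sketch with illustrative values.
C = 299_792_458.0  # speed of light in m/s

def tof_to_distance(round_trip_s: float) -> float:
    """Convert a measured round-trip pulse delay to a one-way distance.

    The pulse travels to the target and back, so the one-way
    distance is half the round-trip path: d = c * t / 2.
    """
    return C * round_trip_s / 2.0

# A reflection detected 200 ns after emission corresponds to ~30 m:
print(tof_to_distance(200e-9))  # ~29.98 m
```

Because light covers roughly 30 cm per nanosecond, sub-nanosecond timing resolution is what gives LIDAR its centimeter-level range accuracy.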

In automotive and CAV applications, LIDAR sensors are mounted on vehicles and perform rapid 360-degree scans of the surrounding environment. Each laser pulse that returns generates a single data point with X, Y, Z coordinates (a 'point'). Millions of these points are collected per second, forming a detailed 3D representation known as a 'point cloud.' This point cloud data is then processed by onboard computers using sophisticated perception algorithms to identify and classify objects (e.g., other vehicles, pedestrians, cyclists, curbs, traffic signs), estimate their speed and trajectory, and model the drivable path.
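The geometry behind a point cloud is a spherical-to-Cartesian conversion: each return is defined by its measured range and the beam's azimuth and elevation angles. A minimal sketch (the sensor-centric axis convention is an assumption, not taken from any spec):

```python
import math

def scan_point(range_m: float, azimuth_rad: float, elevation_rad: float):
    """Convert one LIDAR return (range + beam angles) to X, Y, Z coordinates.

    Assumes a sensor-centric frame: X forward, Y left, Z up.
    """
    horiz = range_m * math.cos(elevation_rad)  # projection onto the X-Y plane
    x = horiz * math.cos(azimuth_rad)
    y = horiz * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return (x, y, z)

# One revolution of a single beam at 0.2-degree azimuth steps yields a ring
# of 1800 points; stacking many beams at different elevation angles builds
# the full 3D point cloud.
ring = [scan_point(10.0, math.radians(step * 0.2), 0.0) for step in range(1800)]
```

Multiplying rings per revolution, beams per sensor, and revolutions per second is what pushes real sensors into the millions of points per second.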

3GPP's role, as outlined in technical reports like TR 26.928, is to standardize how this massive, high-bandwidth, and latency-sensitive LIDAR data (and data from other sensors like cameras and radar) can be shared between vehicles (V2V), with infrastructure (V2I), and with the network (V2N). This involves defining requirements for data formats, compression techniques, and Quality of Service (QoS) parameters for vehicular communication links. The goal is to enable 'collective perception' or 'sensor sharing,' where a vehicle can receive processed or raw LIDAR data from nearby entities, effectively extending its perception horizon beyond the line-of-sight of its own sensors, which is crucial for safety and high-level automation.
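Why compression and QoS matter becomes clear from a back-of-the-envelope bitrate estimate. The sketch below uses illustrative figures (16 bytes per point, a 10:1 compression ratio); none of these values come from 3GPP specifications.

```python
def point_cloud_bitrate(points_per_s: int,
                        bytes_per_point: int = 16,
                        compression_ratio: float = 1.0) -> float:
    """Rough bitrate (bit/s) needed to stream a LIDAR point cloud.

    bytes_per_point = 16 assumes three float32 coordinates plus a
    float32 intensity value; compression_ratio > 1 models a codec.
    Illustrative figures only, not values from any 3GPP document.
    """
    return points_per_s * bytes_per_point * 8 / compression_ratio

raw = point_cloud_bitrate(1_200_000)                         # ~153.6 Mbit/s
compressed = point_cloud_bitrate(1_200_000, compression_ratio=10.0)  # ~15.4 Mbit/s
```

Even at 10:1 compression, a single sensor stream remains in the tens of megabits per second, which is why sharing may favor processed object lists over raw clouds on constrained links.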

Purpose & Motivation

LIDAR technology was integrated into 3GPP studies to address the sensor and data sharing requirements for high-level vehicle automation (SAE Levels 4-5). While cameras and radar are also essential, LIDAR provides unique capabilities: it generates precise, high-resolution 3D maps regardless of lighting conditions (functioning in darkness) and provides direct, accurate range measurements. The motivation for standardizing its integration with cellular networks (C-V2X) stems from the limitations of purely onboard sensing. An individual vehicle's sensors have a limited field of view and range, creating occlusions and blind spots, especially in complex urban scenarios.

3GPP's work on LIDAR data aims to solve the problem of 'environmental perception sharing.' By using low-latency, high-reliability 5G NR sidelink (PC5) and Uu interfaces, vehicles and roadside units can exchange raw or processed LIDAR point clouds. This allows a vehicle to 'see' around corners or through other vehicles, dramatically improving situational awareness and safety. Standardizing this exchange is critical for interoperability between vehicles from different manufacturers and for creating scalable infrastructure-based sensing services. It transforms the vehicle from an isolated sensing island into a node in a cooperative intelligent transport system (C-ITS), which is a foundational step towards fully autonomous driving.
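Fusing a received point cloud into the ego vehicle's view is, at its core, a coordinate transform through a shared world frame (e.g., from GNSS). A minimal planar sketch; real systems work in 3D and must also handle time alignment and pose uncertainty, which this ignores.

```python
import math

def to_ego_frame(points, sender_pose, ego_pose):
    """Re-express a remote vehicle's 2D point cloud in the ego vehicle's frame.

    Each pose is (x, y, heading_rad) in a shared world frame.
    Hypothetical helper for illustration: 2D only, no time sync,
    no pose-error handling.
    """
    sx, sy, sh = sender_pose
    ex, ey, eh = ego_pose
    out = []
    for px, py in points:
        # Sender frame -> world frame (rotate by sender heading, then translate).
        wx = sx + px * math.cos(sh) - py * math.sin(sh)
        wy = sy + px * math.sin(sh) + py * math.cos(sh)
        # World frame -> ego frame (translate, then rotate by -ego heading).
        dx, dy = wx - ex, wy - ey
        out.append((dx * math.cos(-eh) - dy * math.sin(-eh),
                    dx * math.sin(-eh) + dy * math.cos(-eh)))
    return out

# A point 1 m ahead of a sender parked 10 m ahead of the ego vehicle
# appears 11 m ahead in the ego frame.
merged = to_ego_frame([(1.0, 0.0)], (10.0, 0.0, 0.0), (0.0, 0.0, 0.0))
```

This is how a shared detection from around a corner becomes actionable: once both parties agree on a common reference frame, remote points slot directly into the ego vehicle's perception pipeline.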

Key Features

  • Generates high-density 3D point clouds for precise environmental modeling
  • Provides accurate distance measurements via time-of-flight calculation; velocity is estimated by tracking objects across successive scans
  • Operates effectively in varied lighting conditions, including total darkness
  • High angular resolution for detailed object detection and classification
  • Key input for simultaneous localization and mapping (SLAM) algorithms
  • Data subject to 3GPP standardization for V2X sharing and compression
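Since pulsed time-of-flight measures range rather than velocity directly, perception pipelines typically derive velocity by tracking an object's point cluster across successive scans. A crude sketch of the idea (real systems use filters such as Kalman trackers over segmented, registered clouds):

```python
def centroid(points):
    """Mean position of a point cluster."""
    n = len(points)
    return tuple(sum(coord) / n for coord in zip(*points))

def estimate_velocity(points_t0, points_t1, dt_s):
    """Estimate an object's velocity from the centroid shift of its
    point cluster between two successive scans.

    Illustrative only: assumes the two clusters belong to the same
    object and ignores changes in which surfaces the beams hit.
    """
    c0, c1 = centroid(points_t0), centroid(points_t1)
    return tuple((b - a) / dt_s for a, b in zip(c0, c1))

# A cluster that shifts 1 m along X between scans 0.1 s apart
# is moving at 10 m/s.
v = estimate_velocity([(0.0, 0.0), (1.0, 0.0)], [(1.0, 0.0), (2.0, 0.0)], 0.1)
```

Frequency-modulated continuous-wave (FMCW) LIDAR variants can instead measure radial velocity per point via the Doppler shift, trading pulse timing for chirped waveforms.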

Evolution Across Releases

Rel-16 Initial

Initially studied in the context of enhanced V2X services. Technical reports (e.g., TR 26.928) began analyzing requirements for sharing sensor data like LIDAR point clouds over 5G NR. Focused on defining use cases, data characteristics (volume, update rates), and preliminary QoS needs for cooperative perception in automated driving.

Defining Specifications

  • TS 26.928: Extended Reality (XR) in 5G
  • TS 26.985: Vehicle-to-everything (V2X); Media handling and interaction
  • TS 26.998: Support of 5G glass-type Augmented Reality (AR)/Mixed Reality (MR) devices