Description
Within 3GPP specifications, Head Mounted Display (HMD) refers to a category of user equipment and the associated service requirements for delivering immersive media experiences, such as Virtual Reality (VR) and Augmented Reality (AR), collectively referred to as Extended Reality (XR), over cellular networks. It is covered across numerous technical specifications and reports (e.g., TS 26.114, TS 26.118, TR 38.835) that define codecs, transport protocols, system architectures, and performance metrics tailored for HMDs. An HMD is a wearable device with one or two high-resolution screens placed close to the user's eyes, typically equipped with sensors for head tracking.
From a network architecture perspective, supporting HMDs involves enhancements across the service layer, core network, and radio access network. Key components include the media application server (e.g., for 360-degree video), the 5G Core Network with support for Multi-access Edge Computing (MEC), and the Radio Access Network (RAN) capable of providing high throughput and ultra-reliable low-latency communication (URLLC). The media delivery often uses Dynamic Adaptive Streaming over HTTP (DASH) or similar protocols, extended with features like viewport-dependent streaming, where only the portion of a 360-degree video currently in the user's field of view is delivered at high quality to save bandwidth.
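The viewport-dependent idea above can be sketched in a few lines: split the equirectangular 360-degree frame into a tile grid and request high quality only for tiles near the current view direction. This is a minimal illustrative sketch; the grid size, field-of-view threshold, and function names are assumptions for illustration, not values from any 3GPP specification.

```python
import math

def angular_distance(yaw1, pitch1, yaw2, pitch2):
    """Great-circle angle (degrees) between two view directions."""
    y1, p1, y2, p2 = map(math.radians, (yaw1, pitch1, yaw2, pitch2))
    cos_d = (math.sin(p1) * math.sin(p2)
             + math.cos(p1) * math.cos(p2) * math.cos(y1 - y2))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_d))))

def select_tile_qualities(viewport_yaw, viewport_pitch,
                          cols=8, rows=4, fov_deg=55.0):
    """Mark each tile 'high' if its centre lies within fov_deg of the
    current viewport direction, else 'low' (hypothetical threshold)."""
    qualities = {}
    for row in range(rows):
        for col in range(cols):
            # Tile centre in an equirectangular layout.
            tile_yaw = -180.0 + (col + 0.5) * 360.0 / cols
            tile_pitch = 90.0 - (row + 0.5) * 180.0 / rows
            d = angular_distance(viewport_yaw, viewport_pitch,
                                 tile_yaw, tile_pitch)
            qualities[(row, col)] = "high" if d <= fov_deg else "low"
    return qualities

# User looking straight ahead: only front-facing tiles get high quality.
qualities = select_tile_qualities(viewport_yaw=0.0, viewport_pitch=0.0)
high_tiles = [t for t, q in qualities.items() if q == "high"]
```

A real client would additionally prefetch tiles just outside the viewport to mask head motion, which is where the low-latency orientation feedback loop described below matters.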
Operation involves tight interaction between the HMD device, the network, and the media server. The HMD's sensors continuously report head orientation. This data is sent to the application server, often via a low-latency connection facilitated by network edge processing. The server then adapts the video stream in real time, fetching and delivering high-quality tiles for the user's current viewport while delivering lower quality for the periphery. The network must guarantee the necessary bandwidth, latency (often below 20 ms motion-to-photon), and reliability to prevent cybersickness and preserve immersion. 3GPP's role is to standardize the interfaces, codec profiles (e.g., for video-based point cloud compression), QoS mechanisms, and device capabilities to make this ecosystem interoperable and scalable.
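The motion-to-photon constraint is an end-to-end budget: every stage from sensor read-out to pixels on the display must fit within roughly 20 ms. A minimal sketch of such a budget check follows; the stage names and example millisecond values are illustrative assumptions, not figures from a 3GPP specification — only the ~20 ms overall target comes from the text above.

```python
# Approximate motion-to-photon target mentioned in the text (ms).
MOTION_TO_PHOTON_BUDGET_MS = 20.0

def check_latency_budget(components_ms):
    """Sum per-stage latencies and report whether the budget holds."""
    total = sum(components_ms.values())
    return total, total <= MOTION_TO_PHOTON_BUDGET_MS

# Hypothetical split-rendering pipeline with edge processing;
# all values are illustrative.
edge_pipeline = {
    "sensor_sampling": 1.0,     # head-tracker read-out
    "uplink_pose": 2.0,         # pose report to the edge server
    "render_and_encode": 9.0,   # server-side rendering + encoding
    "downlink_video": 4.0,      # radio downlink transfer
    "decode_and_display": 3.0,  # device decode + scan-out
}
total_ms, within_budget = check_latency_budget(edge_pipeline)
# total_ms = 19.0, within_budget = True
```

The sketch makes the design pressure explicit: if the radio legs (uplink pose plus downlink video) consume more than a few milliseconds, rendering and encoding must shrink accordingly, which motivates placing the server at the network edge.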
Purpose & Motivation
The formalization of Head Mounted Display (HMD) requirements within 3GPP was driven by the rapid emergence of immersive media as a key use case for 5G and beyond networks. Prior to this focus, mobile networks were optimized for traditional video streaming to smartphones, which have different constraints regarding latency, field of view, and interaction. Early VR/AR systems were tethered to powerful PCs or used offline content, severely limiting mobility and mass-market adoption.
The limitations of pre-5G networks for HMDs included insufficient peak data rates for high-resolution 360-degree video, latency high enough to cause motion sickness, and the lack of standardized methods for viewport-adaptive streaming, leading to inefficient bandwidth usage. 3GPP began addressing these limitations in Release 14 and expanded the work significantly in later releases. The purpose is to enable a high-quality, wireless, and mobile XR experience, which requires solving unique technical challenges around ultra-high throughput, very low end-to-end latency, and power-efficient device operation.
Its creation was motivated by the vision of XR as a transformative service for entertainment, education, industry, and social interaction. Standardizing HMD support ensures that content providers, device manufacturers, and network operators have a common technical foundation. This accelerates innovation, ensures interoperability, and allows for the economies of scale needed to bring immersive wireless experiences to consumers and enterprises, fulfilling key 5G performance promises.
Key Features
- Support for viewport-adaptive streaming of 360-degree and immersive video
- Standardized media formats and codecs for 3D graphics and point clouds (e.g., V-PCC)
- Network requirements for low latency (URLLC) and high throughput (eMBB) for XR traffic
- Power efficiency considerations and models for wearable HMD devices
- Integration with 5G System features like Network Slicing and Multi-access Edge Computing (MEC)
- Defined Quality of Experience (QoE) metrics and reporting for immersive services
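To make the last feature concrete, a client-side QoE report for an immersive session might aggregate stall and viewport-switch statistics before upload. The sketch below is purely illustrative: the class and metric names are assumptions in the spirit of streaming QoE reporting, not the metric identifiers actually defined by 3GPP.

```python
from dataclasses import dataclass, field

@dataclass
class ImmersiveQoEReport:
    """Hypothetical per-session QoE aggregate for an HMD client."""
    session_id: str
    stall_events: int = 0
    stall_duration_ms: float = 0.0
    viewport_switch_latency_ms: list = field(default_factory=list)

    def record_stall(self, duration_ms: float) -> None:
        self.stall_events += 1
        self.stall_duration_ms += duration_ms

    def record_viewport_switch(self, latency_ms: float) -> None:
        # Time from head turn until high-quality tiles cover the new viewport.
        self.viewport_switch_latency_ms.append(latency_ms)

    def worst_switch_latency(self) -> float:
        return max(self.viewport_switch_latency_ms, default=0.0)

report = ImmersiveQoEReport(session_id="sess-001")
report.record_viewport_switch(120.0)
report.record_viewport_switch(85.0)
report.record_stall(40.0)
worst = report.worst_switch_latency()  # 120.0
```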
Evolution Across Releases
Release 14: Introduced the initial study items and requirements for VR services over 3GPP systems, focusing on 360-degree video streaming. This release laid the groundwork by identifying key challenges like bandwidth, latency, and QoE metrics specific to HMD-based media consumption.
Defining Specifications
| Specification | Title |
|---|---|
| TS 26.114 | IP Multimedia Subsystem (IMS); Multimedia telephony; Media handling and interaction |
| TS 26.118 | Virtual Reality (VR) profiles for streaming applications |
| TS 26.119 | Media Capabilities for Augmented Reality |
| TS 26.238 | Uplink streaming |
| TR 26.841 | 3GPP TR 26.841 |
| TR 26.854 | 3GPP TR 26.854 |
| TR 26.862 | 3GPP TR 26.862 |
| TR 26.865 | 3GPP TR 26.865 |
| TR 26.918 | Virtual Reality (VR) media services over 3GPP |
| TR 26.928 | Extended Reality (XR) in 5G |
| TR 26.929 | QoE parameters and metrics relevant to the Virtual Reality (VR) user experience |
| TR 26.955 | Video codec characteristics for 5G-based services and applications |
| TR 26.956 | 3GPP TR 26.956 |
| TR 26.962 | 3GPP TR 26.962 |
| TR 26.998 | Support of 5G glass-type Augmented Reality / Mixed Reality (AR/MR) devices |
| TR 26.999 | 3GPP TR 26.999 |
| TR 38.835 | Study on XR enhancements for NR |