What is the metaverse and why does it need 5G to succeed? The metaverse 5G relationship explained

It’s hard to avoid the term metaverse these days. But the truth is, to reach its full potential, the 5G metaverse relationship will be key. We explore how 5G can enable exciting uses like metaverse VR, metaverse XR, and help make Web 3.0 a reality.

Ecosystem Co-creation Director at Ericsson

VP Emerging Technologies

Director of Innovation Engagements at D-15 Labs at Ericsson’s Technology Office Silicon Valley

Director Strategy Execution at Ericsson Group Function Technology


Chances are you have been in a metaverse already! Minecraft, anybody? Or Fortnite? What about Pokémon Go or Roblox? The real metaverse aficionados reading this blog post will have been frequenting Decentraland, maybe even hosting their own NFT art exhibition there.

That’s a lot of new terminology in our opening paragraph. Let’s dive in and try to understand what this all means and – most importantly – how it relates to another emerging technology: 5G. Indeed, we believe that 5G is a critical enabler for the metaverse and its device and application developer ecosystem.

What is the metaverse?

The term metaverse was coined by Neal Stephenson in his 1992 novel “Snow Crash”. It remained buried deep under the snow for several decades, until Facebook announced a virtual reality (VR) powered metaverse to be the next big thing. So big that this once trillion-dollar company rebranded to Meta. The internet, device and connectivity ecosystems have been in a frenzy ever since.


Figure 1: A non-exhaustive snapshot of the Metaverse ecosystem, including device OEMs, connectivity providers, cloud infrastructure and platform providers, Metaverse platforms and content providers.

The concept of the metaverse does not belong to Meta, of course, and it means different things depending on who you ask. We could share formal definitions here, but would rather concentrate on the three important elements that each of these definitions embraces.

First, and most importantly, the metaverse embraces a social element. It is not only a virtual space where users spend time (and money) on their own or with a selected few. Rather, the metaverse is intended to resonate with the very social fabric which underpins human society. Once in the metaverse, you and/or your avatar are able to interact naturally: looking into each other’s eyes, perceiving body language and maybe even shaking hands or hugging each other.

Second, it has a strong virtual narrative. For some, the metaverse exists in a purely virtual world which can be consumed by us through VR headsets; an example here is the game Fortnite played with metaverse VR interaction using such headsets. For others, it has a strong foundation in the physical world but with digital overlays experienced through augmented reality (AR) or the more interactive mixed reality (MR). An example here is Pokémon Go played through a mobile phone or AR glasses. Either way, our experiences and ways of social interaction are significantly augmented with persistent virtual content. Access to the virtual world of the metaverse and haptic interaction therein is enabled by any of these 3D eXtended Reality (XR) devices, and in the interim via today’s 2D screens leveraging WebXR technologies.

Third, it is accelerated through novel technologies, like Web 3.0, blockchains, non-fungible tokens (NFTs), 5G, digital twins, artificial intelligence and XR devices, just to name a few. It is important to understand that the metaverse could probably exist without most of these tech ingredients, but uptake and scale would be seriously hampered. We will give a few examples further down, once we have discussed the building blocks of the metaverse in more detail.

Our authors’ avatars (from top left): Yashar Nezami, Mischa Dohler, Meral Shirazipour, and Eric Blomquist.

The metaverse and the emerging Web 3.0

Let’s deep dive into a technology which is frequently cited in the context of the metaverse: Web 3.0. The term is heavily overloaded and overused, but in essence it symbolizes the emergence of a new decentralized web leveraging blockchain technologies.

Why is that important? It is a question of ownership. Current internet applications, such as social media platforms, are centrally owned. Financial transactions, code upgrades and even decisions to shut a platform down are thus under the total control of the owning company.

Web 3.0, on the other hand, will have a decentralized ownership structure thanks to the decentralized properties of blockchains. This new modus operandi poses serious challenges in terms of operational and energy efficiencies but offers unique opportunities by making users a central part of this new internet and its economy. As will be exemplified below, it also offers interoperability between applications in the same way as IP offers interoperability between networks and devices.

Decentralized Web 3.0 equivalents to the centralized Web 2.0 are emerging quickly: Filecoin or IPFS are equivalents to Dropbox; Brave to Chrome; MetaMask to PayPal; DTube to YouTube, and so on. While Web 2.0 apps are powered by operating systems like Windows or macOS, Web 3.0 apps run on decentralized operating systems like Ethereum.

A Web 3.0 operating system is also known as the infrastructure layer. It enables distributed applications, or dApps. Ethereum is the most popular but not the only one; other such blockchains are Solana, Polygon, Tron, Cardano and EOS.

These infrastructure blockchains can run one or several value tokens. For instance, the Bitcoin blockchain would only support one value token, the Bitcoin itself. Ethereum, however, allows several tokens to run on top, each with its own value ecosystem. Examples of metaverse-related value tokens running on Ethereum are SAND, MANA, AXS and GALA, all of which allow you to buy/sell items in the virtual world at a perceived value. This layer is often referred to as the token layer.
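As a small illustration of this token layer, consider the minimal sketch below. It assumes Python with the web3.py library and an Ethereum JSON-RPC endpoint; the endpoint URL, token contract address (for example, that of MANA) and wallet address are placeholders you would need to fill in.

```python
# Minimal sketch: reading an ERC-20 "token layer" balance straight from Ethereum.
# Assumes the web3.py library; RPC_URL, TOKEN_ADDRESS and WALLET_ADDRESS are
# placeholders (addresses must be EIP-55 checksummed), not real values.
from web3 import Web3

RPC_URL = "https://example-ethereum-rpc.invalid"  # placeholder JSON-RPC endpoint
TOKEN_ADDRESS = "0x..."                           # e.g. the MANA token contract (placeholder)
WALLET_ADDRESS = "0x..."                          # the avatar owner's wallet (placeholder)

# Only the two standard ERC-20 functions we actually call.
ERC20_ABI = [
    {"name": "balanceOf", "type": "function", "stateMutability": "view",
     "inputs": [{"name": "owner", "type": "address"}],
     "outputs": [{"name": "", "type": "uint256"}]},
    {"name": "decimals", "type": "function", "stateMutability": "view",
     "inputs": [], "outputs": [{"name": "", "type": "uint8"}]},
]

w3 = Web3(Web3.HTTPProvider(RPC_URL))
token = w3.eth.contract(address=TOKEN_ADDRESS, abi=ERC20_ABI)

raw_balance = token.functions.balanceOf(WALLET_ADDRESS).call()
decimals = token.functions.decimals().call()
print(f"Token balance: {raw_balance / 10 ** decimals}")
```

Because every ERC-20 token exposes this same standard interface, the same code works for SAND, AXS or GALA simply by swapping the contract address, which is exactly the kind of interoperability the shared Ethereum “operating system” provides.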

Moving into the metaverse, these tokens allow you to buy and sell fixed as well as dynamic assets, which is why we refer to this layer as the asset layer. For instance, in the metaverse platform Decentraland you can use the MANA token to buy, sell or rent land. Or you can bring digital art which you purchased on the platform OpenSea into the land you own in Decentraland; this is possible because Decentraland and OpenSea are interoperable, since both run on the same operating system, Ethereum.

Putting the above Web 3.0 constituents together, we observe that a new, OSI-like stack is emerging which runs on top of our traditional networking technologies. For this emerging stack to execute efficiently, the underlying networking technologies must operate seamlessly – which is why the emergence of limitless connectivity via 5G is so timely!

Metaverse VR and AR devices

Before talking about 5G and networks, let’s discuss devices. Think of them as proxies or portals between the physical world and the metaverse. Today we have keyboards and touchscreens, all of which require interactions that have to be learned at some point. From a user experience point of view, however, the emerging metaverse devices yield a unique opportunity for more intuitive interactions and consumption of digital content and information.

As proxies, these devices need to translate information from the physical world into the virtual, but also back from the virtual world into the physical.

The former – sensing the physical environment – is done through an exploding ecosystem of sensors which in their entirety form the Internet of Things (IoT). In the context of the metaverse, the IoT relies on Lidar sensors, cameras, volumetric capture devices, haptic suits and gloves, neural wristbands, or even Neuralink-like devices.

The latter – the ability for us to consume the virtual metaverse content – is enabled by an exploding ecosystem of VR, AR, MR (which, together with haptics and other sensory interaction, is sometimes grouped into XR) as well as holographic projection devices.


Figure 2: The wide spectrum of different “realities”, including virtual reality, mixed reality, augmented reality and their uber-term, extended reality.

VR refers to spatially isolated, computer-generated simulations of three-dimensional environments that can be interacted with in real time through head-mounted displays (HMDs) and game controllers. VR devices have enjoyed solid growth in both enterprise and consumer segments, with popular products being Oculus Quest 2, Varjo VR-3, PlayStation VR, Valve Index and HP Reverb G2.

AR, by contrast, provides a composite view between physical and virtual worlds by superimposing a computer-generated image on a user's view of the real world. Popular AR gear is your mobile phone running apps with AR filters, as found in Instagram, Snapchat and TikTok. Purpose-made AR devices include HoloLens 2, Lenovo’s ThinkReality and Nreal. There are even companies such as Mojo Vision and InWith working on AR contact lenses – the future is nearly here!

MR provides a virtual overlay onto the physical world along with real-time interaction. We can even imagine a future where the virtual world would be able to “reprogram” the physical world through actuators, a vision laid out by Ericsson’s 6G research team in its work on reprogrammable worlds and the cyber-physical continuum.

AR and MR require spatial persistence, meaning that if a user moves in the physical world, the virtual overlay should be anchored in the real world. For instance, if an AR/MR user walks away from a physical table on which a digital vase with flowers is placed, these ought to get smaller with increasing physical distance between the user and the table.
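To make this spatial anchoring a little more concrete, here is a minimal sketch of the underlying geometry, assuming a simple pinhole-camera model; the focal length and object size are illustrative assumptions, not values from any particular AR runtime.

```python
# Under a pinhole-camera model, an anchored virtual object's projected size
# shrinks in proportion to the viewer's distance from its real-world anchor.
# The focal length (in pixels) and object height are illustrative assumptions.
def apparent_height_px(object_height_m: float, distance_m: float, focal_px: float = 1400.0) -> float:
    """Projected height in pixels of an object seen at a given distance."""
    return focal_px * object_height_m / distance_m

vase_height_m = 0.3  # a 30 cm digital vase placed on the physical table
for distance_m in (1.0, 2.0, 4.0):
    print(f"{distance_m:.0f} m away -> {apparent_height_px(vase_height_m, distance_m):.0f} px tall")
# Doubling the distance halves the projected size, which is what keeps the
# overlay looking "anchored" to the table as the user walks away.
```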

VR emerged first because it can be achieved by rendering environments in a controlled manner with limited compute power. Compute capabilities and optics have evolved, however, and AR/MR is catching up quickly. These technologies are the closest we have today to supporting the social engagement element of the metaverse.

Last, but not least, holographic projection has been gaining traction in recent years – though it is not yet clear if it will persist as a technology. Traditional VR and AR render a 3D world onto a 2D surface which is viewed by the user; more advanced stereoscopic AR gives a holographic-like experience. True holography recreates 3D worlds by utilizing phase differences in light; images are much crisper and perceived to be truly 3D, but generating consumer-grade phase-coherent light has proven difficult so far.

All of the devices mentioned above have one important requirement in common if they are to achieve device desirability in terms of comfort and weight at a reasonable cost: they require performant, reliable and secure networks. The goal is to offload as many control and compute tasks as possible from the devices onto the edge. The devices need a connection with the lowest possible latency to an edge server where, for example, the graphics are rendered in real time and then streamed to the HMD, much like in a video conference.
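Conceptually, the resulting split-rendering loop looks something like the sketch below. This is not a real Boundless XR or CloudXR API; the function names, pose format and 90Hz frame budget are illustrative assumptions that simply show where the uplink (pose) and downlink (rendered frame) traffic sit in the loop.

```python
# Conceptual split-rendering loop: the HMD uploads its pose, the edge renders and
# encodes the frame, and the frame is streamed back for display. All functions are
# placeholders; the 90 Hz frame budget is an illustrative assumption.
import time

def read_head_pose() -> dict:
    """Placeholder for the HMD's tracking data (position and orientation)."""
    return {"position": (0.0, 1.6, 0.0), "orientation": (1.0, 0.0, 0.0, 0.0)}

def edge_render(pose: dict) -> bytes:
    """Placeholder for the edge server rendering and encoding a frame for this pose."""
    return b"encoded-frame"

def display(frame: bytes) -> None:
    """Placeholder for decoding and showing the frame on the HMD."""
    return None

FRAME_BUDGET_S = 1 / 90  # roughly 11 ms per frame at 90 Hz

for _ in range(3):  # a few iterations of the per-frame loop
    start = time.perf_counter()
    pose = read_head_pose()    # uplink: a few bytes of pose data
    frame = edge_render(pose)  # downlink: the rendered, encoded frame
    display(frame)
    time.sleep(max(0.0, FRAME_BUDGET_S - (time.perf_counter() - start)))
```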

While the device ecosystem is still in its infancy, the XR ecosystem is already moving in this direction with technologies such as Boundless XR and CloudXR. These emerging multi-access edge computing (MEC) capabilities make it possible to offer much more immersive experiences: VR headsets can show content at a much higher level of detail (LOD) and AR headsets can handle much more complex real-world interactions. However, this comes at a price.

Notably, data needs to be sent back and forth between AR/VR devices and the edge cloud within milliseconds, at (almost) bounded latency and at a high data rate.  A reliable, secure and low-latency wireless connection to XR devices is thus paramount. The only wireless technology proven to achieve such limitless connectivity today is 5G. It’s time to talk 5G!

 

Networking requirements: supporting the Metaverse with 5G

In general, 5G and telecoms have come a long way. What used to be purely about connectivity has evolved over the years into a vibrant ecosystem composed of vendors, service providers, device OEMs, cloud hyperscalers and application developers, all drawn to a wireless domain that is as complex as it is exciting.

In a Discover Ericsson podcast, listen to Mischa Dohler, Chief Architect at Ericsson Silicon Valley, talk about why 5G is playing a central role in the emerging metaverse.

In the context of the metaverse, many features and requirements which go beyond pure connectivity need to be addressed. We have summarized a non-exhaustive list in Table 1, comprising ubiquitous access, accessible XR devices, edge-cloud capabilities, pertinent standards, and ease-of-use for the developer community. Let’s discuss these now in more detail.

Table 1: Metaverse features and needs, and how 5G and its ecosystem address them.

Ubiquitous access to all multi-verses that form the metaverse
  • Consistent coverage and capacity, as well as mobility support
  • Seamless vertical handovers, e.g. to/from Wi-Fi
  • Global reach, including roaming capabilities

Lightweight and accessible XR devices to experience the metaverse
  • Low latency and reliable communication to enable devices to offload more to the edge and leverage edge rendering/edge streaming
  • Access to edge compute with high throughput and low latency

Cloud and edge-cloud (MEC) capabilities
  • Low latency
  • Offloaded processing to save battery life
  • Enhanced render level of detail (LOD)

Standardized interfaces
  • Telco standards
  • Haptic, holographic and XR standards
  • Metaverse standards

Easy access to communication services for developers
  • APIs that give developers access to 5G as a "network platform"
  • Easy bind-in of APIs/SDKs into available developer platforms
  • Easy bind-in of APIs into business logic

In terms of ubiquitous access, there are numerous wireless connectivity technologies today, the most popular being Bluetooth, Wi-Fi and cellular technologies. Bluetooth lacks range, rate and reliability. Current generations of Wi-Fi offer the required rate but suffer from congestion and thus high latencies once several XR devices are connected simultaneously; Wi-Fi 7 promises to address the congestion issue but is nowhere near the range and global coverage offered by cellular technologies. Nor does Wi-Fi offer the Service Level Agreements (SLAs) that 5G’s emerging slicing concepts can provide, which is vital for many enterprise applications.

5G offers rate, range, reliability, latency and so much more. Indeed, the average DL/UL data rates provided by 5G today are 200Mbps/30Mbps per user. Depending on the choice of scheduler, radio bearer configurations and radio conditions, the achievable latencies are on the order of 10ms for frequency range 1 (FR1, i.e., below 6GHz) and 5ms for FR2 (above 24GHz, i.e., mm-wave bands). Reliability today is on the order of 99.99 percent, with five to six nines achievable over the coming years.

Does that suffice for XR?

Let’s see: if a user wearing a VR headset moves their head, the new immersive visuals need to be projected within 20ms (ideally below 10ms) to avoid motion sickness. In AR, less than 30ms is required to ensure that virtual objects appear spatially anchored in the environment for a single-user experience, and significantly less than that for a multi-user AR experience. On-device smart processing techniques, such as asynchronous time warp (ATW), which reuses old content with the new head position, help to relax these latency requirements by a factor of 1.5-3x.
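As a back-of-the-envelope illustration of these budgets, the sketch below combines the 20ms motion-to-photon target and the 1.5-3x ATW relaxation quoted above with assumed render and display times; those compute figures are illustrative assumptions only.

```python
# Rough motion-to-photon budget for edge-rendered VR. The 20 ms target and the
# ATW relaxation factors come from the text above; the render/encode and
# decode/display times are illustrative assumptions.
MOTION_TO_PHOTON_TARGET_MS = 20.0

def network_budget_ms(atw_factor: float,
                      render_encode_ms: float = 6.0,   # assumed edge render + encode
                      decode_display_ms: float = 4.0   # assumed on-device decode + display
                      ) -> float:
    """Latency left for the radio and transport network after the compute steps."""
    return MOTION_TO_PHOTON_TARGET_MS * atw_factor - render_encode_ms - decode_display_ms

for atw_factor in (1.0, 1.5, 3.0):
    print(f"ATW factor {atw_factor}: ~{network_budget_ms(atw_factor):.0f} ms left for the network round trip")
# Without ATW only ~10 ms remains, which is why the 5-10 ms radio latencies
# quoted above for 5G FR2/FR1 matter so much for edge-rendered XR.
```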

Regarding the required data rates to facilitate edge-cloud XR processing, we differentiate between three scenarios for XR – low, medium and high offload:

  • In pure VR, the optimal target is to render most if not all content in the edge-cloud. Such a high offload scenario requires download (DL) rates which are proportional to the resolution of the rendered environment. Per a recent GSMA study, the rates are 30Mbps for a 2K H.264-encoded stream and up to 800Mbps for an 8K H.266-encoded stream. The uplink (UL) rates are insignificant, i.e., well below 2Mbps, as only the HMD orientation and some other user-generated control input, for example from haptic gloves, need to be transmitted.

  • In AR, different spatially-aware tasks need to be completed by the system, thus giving the opportunity to invoke three offload scenarios that are summarized in Figure 3. The DL rates range from 20-80Mbps and the UL rates from 10-40Mbps, depending on which tasks are offloaded to the edge-cloud.
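Putting these figures together, the sketch below checks whether a given link can sustain the quoted rate targets. The requirements are taken from the numbers above (using the lower and upper ends of the AR ranges for the low and high offload cases); the “measured” rates are simply today’s average per-user 5G figures mentioned earlier and are illustrative rather than a guarantee for any specific network.

```python
# Minimal sketch: checking a link against the GSMA-derived rate targets quoted
# above. AR "low"/"high" use the lower/upper ends of the 20-80 Mbps DL and
# 10-40 Mbps UL ranges; the measured rates below are the average per-user 5G
# figures mentioned earlier and are illustrative only.
REQUIREMENTS_MBPS = {
    "VR high offload (2K, H.264)": {"dl": 30, "ul": 2},
    "VR high offload (8K, H.266)": {"dl": 800, "ul": 2},
    "AR low offload": {"dl": 20, "ul": 10},
    "AR high offload": {"dl": 80, "ul": 40},
}

def check_link(measured_dl_mbps: float, measured_ul_mbps: float) -> None:
    """Print which offload scenarios the measured DL/UL rates can sustain."""
    for scenario, req in REQUIREMENTS_MBPS.items():
        ok = measured_dl_mbps >= req["dl"] and measured_ul_mbps >= req["ul"]
        print(f"{scenario:30s} -> {'supported' if ok else 'not yet'}")

check_link(measured_dl_mbps=200, measured_ul_mbps=30)
```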


In terms of latency requirements, the GSMA study differentiates between different degrees of XR interactions: weak interactions (like broadcasts) have a generous end-to-end latency budget of 10-20s; moderate interactions (like an XR video conference) require 200ms; and strong interactions (online games or engaging sports games) are ideally delivered at less than 20ms.


Figure 3: Illustration of the low, medium and high offload scenarios between AR devices and an edge-cloud for a typical AR compute task execution (modified from source).

In terms of lightweight and accessible XR devices, this is achieved as more tasks are offloaded to the edge-cloud. Indeed, the more that is offloaded, the lower the requirements on processing capabilities and energy storage. Both help with the form factor, the experienced weight and, therefore, the overall consumer price.

First measurements indicate that low offload reduces device energy consumption threefold; mid offload fourfold; and high offload by more than sevenfold. This is a tangible reduction and directly translates into an improved user experience.

Edge-cloud support is vital to scale XR and is therefore one of the fundamental technologies to enable the metaverse, by making XR devices economically affordable, lightweight yet powerful, and connected with sufficient battery lifetime. A challenge with the edge-cloud is how much can truly be offloaded to the edge while still maintaining the application KPIs and delivering an acceptable quality of experience (QoE) to end users, regardless of whether they are consumers playing games or enterprise users designing their next sophisticated product in the metaverse. As shown in Figure 4, this means we should expect to see more edge-cloud type solutions in operator networks around the world, where content gets closer to the RAN by using a UPF with local breakout.


Figure 4: Technology constituents underpinning an end-to-end metaverse link using XR devices and mobile edge cloud to support split-rendering of high-LOD graphics. Here, OS stands for operating system; SW for software; HW for hardware; gNB for next generation node B; TN for transport network; CN for core network; UPF for user plane function; NW for network; and API for application programming interface.
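To see why this local breakout matters, consider the rough propagation-only sketch below: light in optical fibre travels at roughly two-thirds the speed of light in vacuum, so distance alone eats into a sub-20ms budget. The distances are illustrative assumptions, not measurements from any network.

```python
# Back-of-the-envelope fibre propagation delay. Signals in optical fibre travel at
# roughly 200 km per millisecond (about two-thirds of c); the distances below are
# illustrative assumptions only, and processing/queuing delays are not included.
FIBRE_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time over fibre for a given one-way distance."""
    return 2 * distance_km / FIBRE_KM_PER_MS

for label, km in [("edge site near the RAN (local breakout)", 50),
                  ("regional data centre", 500),
                  ("distant central cloud", 2000)]:
    print(f"{label:40s}: ~{round_trip_ms(km):.1f} ms round trip (propagation only)")
```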

Standards and standardized interfaces will ensure interoperability within this increasingly complex metaverse ecosystem. Having the metaverse run on a common blockchain operating system, such as Ethereum, does not suffice. Interoperability is required across blockchain families, across physical and virtual worlds, and across other important technologies underpinning the metaverse, such as haptic devices.

Indeed, the IEEE Haptic Codecs standardization group P1918.1.1 develops perceptual codecs for both kinesthetic (muscle movement) and tactile (touch) signals. Often referred to as the “MPEG of touch”, the group is developing proposals for tactile codec technology based on common hardware and software reference designs. This is important to avoid haptic vendor lock-in, which in the long term will enable haptic device deployments to scale.

Along the same lines, novel holographic and XR standards are needed to avoid long-term vendor lock-in and ensure interoperability. Several standardization initiatives deal with such standards today, including MPEG (ISO/IEC), 3GPP, ETSI ARF, VR-IF, OpenXR and Open AR Cloud. We will dedicate a future blog post to a deep dive on all the emerging standards related to the metaverse.

The most important challenge, however, is to ensure interoperability between virtual worlds. This transition from today’s multiple metaverses (a.k.a. multi-verses) to the metaverse is akin to the transition from early local area networks (LANs) to today’s internet.

In terms of easy developer access, it is important to ensure that large content developer communities around the world are able to easily integrate advanced XR capabilities into consumer and enterprise applications. This requires 5G-native APIs to be offered to the developer community, ideally embedded into the SDKs of specific platforms. These APIs will help developers improve the quality of experience of their XR applications.

One example is an API for network slicing. Developers want intuitive APIs that feel similar to those they have used in the past. Although much functionality can be offered as a third-party solution, features that access the 5G modem may require changes to the underlying operating system, which would in turn require involvement of the device platform providers. There is an opportunity for the broader ecosystem to work together to ensure the best possible APIs. To be successful, it is also important to gather early feedback from the developer community.
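To make this concrete, here is a deliberately hypothetical sketch of what such a developer-facing API could look like: the application asks the network to apply a low-latency quality-of-service profile to a device’s XR traffic. The endpoint, payload fields and profile name are illustrative assumptions, not an actual Ericsson, 3GPP or operator API.

```python
# Hypothetical developer-facing network API call for an XR session.
# The URL, payload fields, profile name and response shape are illustrative
# assumptions only, not a real operator or standards-defined API.
import requests

API_BASE = "https://api.example-operator.invalid/network/v1"  # placeholder endpoint

def request_xr_qos(device_ip: str, app_server_ip: str, duration_s: int = 3600) -> str:
    """Ask the network to apply a low-latency QoS profile to this device's XR traffic."""
    payload = {
        "device": {"ipv4Address": device_ip},
        "applicationServer": {"ipv4Address": app_server_ip},
        "qosProfile": "LOW_LATENCY_XR",  # assumed profile name
        "durationSeconds": duration_s,
    }
    response = requests.post(f"{API_BASE}/qos-sessions", json=payload, timeout=10)
    response.raise_for_status()
    return response.json()["sessionId"]  # assumed response field

session_id = request_xr_qos("198.51.100.23", "203.0.113.10")
print(f"QoS session created: {session_id}")
```

The key design point such an API would embody is that the developer expresses intent (“give this XR session low latency for an hour”) rather than configuring radio parameters directly.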

Discover photorealistic 3D conversations

Find out how Ericsson ONE is making it possible to be in two places at once.

Open challenges

While the above are exciting developments, important technological challenges still need to be solved before the metaverse can reach prime time. One issue stands out: privacy. Indeed, while many privacy challenges are being addressed in telecom standards, we have not yet solved privacy at the application level. Imagine the challenges ahead of us when it comes to privacy in the metaverse. Who will protect our children from harmful content or harmful immersive experiences? Who will ensure that the physical identity behind your avatar in the metaverse is protected at all times? Who will ensure that we won’t become the by-product of endless advertising?

Many non-technical issues also remain. For instance, the excitement around the metaverse outpaces the underpinning standards; it is not only the communications standards that are important, as we also urgently need haptic, XR, codec and metaverse interoperability standards. Another issue is the lack of metaverse content, which goes hand in hand with the need for a sustainable ecosystem consistently providing avatars, NFTs, educational content, and so on. Other issues persist around business models, regulation, and net neutrality.

Last but not least, if we believe the metaverse will reflect an augmented social fabric in a virtual world, we should sculpt it according to our ethical beliefs. We ought to subscribe to some form of social norms and a decentralized body overseeing those norms. We ought to ensure security and uphold privacy. We need to make sure that the metaverse cannot be weaponized for national or international conflicts. In simple terms, we really need to think this through and avoid building a house of cards – the stakes are simply too high.

Emerging metaverse applications and services

Independent of the challenges above, 5G coverage will pick up over the coming months, allowing XR aficionados to connect anytime and anywhere in the (physical) world to their (digital) metaverse of choice, and to embrace exciting and novel applications and services.

Advanced network services will be needed to protect metaverse XR data streams in increasingly congested future networks. That is a unique opportunity for telcos and service providers to create top-line business opportunities. For instance, telcos could charge for dedicated metaverse slices; they could charge for the ability to offer location-based services; or they could power their own metaverse ecosystem.

Once operational, the underlying 5G infrastructure will power advanced consumer and enterprise metaverse use cases. Imagine all of this happening in your metaverse in 2025: you conduct your next business meeting engaging with peers in a fully immersive environment. After the meeting, you “walk out” and order a pizza within the very same metaverse using a crypto token. You could choose to eat it virtually, but more likely than not you will get the pizza delivered to where you are physically located.

How about running a business in the metaverse? That requires the underlying metaverse infrastructure to be up and running, economic enablers such as cryptocurrency payments to be in place, XR devices to be widely adopted, and wireless networks to be fully functional so they can support metaverse visitors to your store at scale.

What about consumer applications in the metaverse? Will it be the next internet? While the entire ecosystem is still in its nascent form, what is clear is that gaming will act as the starting point for many metaverse applications. Some argue that game developers are to the metaverse today what web developers were to the internet several decades ago. Given that the global average daily in-app usage of games like Roblox and Minecraft is 10x that of Facebook and Twitter, it is difficult to argue the contrary.

Concluding remarks on the metaverse 5G relationship

It is clear that the metaverse requires highly reliable, high-throughput and bounded-latency networks that are significantly more demanding than the current best-effort services for mobile broadband. 5G is ready to deliver that, but challenges remain in network densification, spectrum availability, indoor/outdoor capacity increases, and co-existence between mobile broadband (MBB), mission-critical communications (MCC) and XR services in wide area networks.

Ericsson’s current 5G RAN portfolio is an important step towards realizing the metaverse. It is equipped with software toolboxes such as Time-Critical Communication on top of best-in-class hardware to provide an unbeatable experience for bounded-latency, highly reliable real-time services such as XR. Many of these features are available today and many more will be introduced as we progress through further 3GPP releases.

Providing cutting-edge networks, however, is not enough. All ecosystem players need to come together and strategically contribute to a coherent R&D and standardization roadmap. Without such tight cooperation, the metaverse may not happen for years to come – or ever. Ericsson is playing its role in the ecosystem by enabling the metaverse over 5G and, ultimately, 6G networks.

Imagine possible - join us in designing the future!


 

Learn more

Ericsson’s Imagine Possible event will gather leading innovators and thought leaders from technology frontrunners and enterprises as they share their vision for the future. Learn more about the event on October 18-19 in Santa Clara and online.

Read more about how immersive technologies can create next-gen customer experiences.

Learn more about why 5G is key for XR to achieve its potential.

Read how 5G and Edge Computing can enhance virtual reality.

Hear how 5G is transforming the live sports experience.
