
Four moves to drive operator growth in the AI-native era

  • Real-time AI creates new traffic patterns that require networks to evolve into an active intelligent fabric.
  • The path to the AI era runs through a stepwise evolution from 5G Standalone to 6G. Four key moves for operators include maturing 5G SA, applying autonomy safely, building a trustworthy data foundation, and placing AI across the network.

Many conversations about AI in telecom focus on how much traffic it will generate, but the real opportunity in the AI era will be behavioral.

A new traffic category is emerging alongside the one networks were optimized for. Human consumption features heavy downlink, bursty traffic, and high latency tolerance; real-time, interactive AI does not. Multimodal assistants, live enterprise guidance systems, and autonomous agents do not just consume data but interact continuously between devices and the cloud. They create traffic patterns that are more uplink-heavy, more latency-sensitive, and more session-rich than traditional mobile broadband was designed to handle.

For operators, this means "best effort" connectivity is rapidly becoming a bottleneck on enterprise productivity, user experience and revenue. The network is no longer just transporting data; it is an active enabler of distributed AI inference and learning – an intelligent fabric.

The instinct may be to wait to solve this. However, preparing for the AI era is not a rip-and-replace exercise reserved for the 2030s. It is a stepwise evolution from 5G Standalone to AI-powered 5G to 6G, and the critical window to act is now.

What AI-native means for a network

AI-native is the concept of having intrinsic, trustworthy AI capabilities as a natural part of how the network functions in design, deployment, operation and maintenance. In an AI-native network, AI is embedded across the RAN, Core, Transport and OSS/BSS. It is not bolted on or isolated as a separate AI layer.

Two points are worth stating clearly. First, AI-native is not one model running everything. Different tasks require different AI technologies. This ranges from specialized network models to small and large language models. Second, AI-native is not "edge everywhere." The guiding principle is the right AI in the right place, based on latency, data access, cost, and sustainability.

AI-native has two sides, and together they explain why it is strategically significant. AI for networks means using AI to make the network smarter and more efficient to run, through learning loops that improve energy efficiency, detect anomalies, and enable increasingly autonomous operations.

On the other hand, networks for AI means evolving connectivity from passive transport into an active enabler of distributed AI, including providing the uplink capacity, low latency and reliability that modern AI systems depend on. Both sides create measurable business value.

Four moves for the coming years

Move 1 — Use 5G Standalone as the start for AI-native evolution

AI-native is an evolution, not a reset. The mobile system will be progressively enhanced with AI functionality that optimizes existing capabilities and enables new ones. 5G Standalone provides the baseline architecture and operational models to introduce AI functionality over time.

The near-term priority is to raise 5G SA maturity so the network is ready for AI-driven traffic and new differentiated offerings. A mature SA foundation strengthens policy control, supports API exposure, and enables the service models that enterprise and developer ecosystems increasingly expect. Without it, the moves that follow are significantly harder to execute.

Move 2 — Introduce autonomy across domains, aiming for level 4

Autonomous networks use intent-based control, agentic orchestration, real-time observability, and closed-loop AI to allow networks to optimize themselves. For operators, the question isn't "How autonomous can we be?" It is "How efficient and trustworthy is our autonomous network?"

Specialized AI agents work together across network domains, including RAN, transport, core, and OSS/BSS, while staying within shared boundaries and clear policies. These agents jointly handle network optimizations, apply changes, and roll back on KPI regressions. Network digital twins provide a safe environment to test actions before execution and compare predicted and actual outcomes. Explicit guardrails keep the network safe using policy constraints, approval tiers for high-impact actions, and continuous KPI verification.
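The guarded loop described above — test in a digital twin, gate high-impact actions behind approval, apply, verify KPIs, roll back on regression — can be sketched in a few lines. This is an illustrative sketch with invented names (`Action`, `Guardrails`, `closed_loop_step`), not an actual product interface:

```python
from dataclasses import dataclass


@dataclass
class Action:
    """A candidate network change proposed by an AI agent."""
    name: str
    impact: str          # "low" or "high"; high-impact needs approval
    apply_fn: callable
    rollback_fn: callable


@dataclass
class Guardrails:
    """Policy constraints: each KPI must stay at or above its floor."""
    kpi_floor: dict      # KPI name -> minimum acceptable value

    def passes(self, kpis: dict) -> bool:
        return all(kpis.get(k, 0.0) >= floor
                   for k, floor in self.kpi_floor.items())


def closed_loop_step(action, twin_predict, measure_kpis, guardrails, approve):
    """One guarded optimization cycle: simulate, approve, apply, verify."""
    # 1. Test the action against a digital twin before touching the live network.
    if not guardrails.passes(twin_predict(action)):
        return "rejected: predicted KPI regression"
    # 2. High-impact actions go through an approval tier.
    if action.impact == "high" and not approve(action):
        return "rejected: approval denied"
    # 3. Apply, then verify actual KPIs against the same guardrails.
    action.apply_fn()
    if not guardrails.passes(measure_kpis()):
        action.rollback_fn()   # continuous KPI verification triggers rollback
        return "rolled back: actual KPI regression"
    return "applied"
```

The key design point is that the same guardrails evaluate both predicted and actual outcomes, so a mismatch between digital twin and reality is caught and reversed automatically.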

Move 3 — Build a data foundation that AI can use

Many AI initiatives fail because the underlying data foundation is not strong enough. With 5G, data is collected use-case by use-case, which creates fragmented pipelines that don't scale. AI-native systems need data that is consistent, real-time, and trustworthy enough to be reused across the network.

Three things matter most:

  • Shared meaning. AI agents need to interpret data the same way across different network domains. Without that, they make wrong assumptions and wrong decisions.
  • Full visibility. When hundreds of agents are acting across the network, you need to see in real time exactly what happened, why, and what effect it had.
  • Governance and security built in. Policy checks, privacy controls, and clear decision trails need to be part of every data pipeline from the start. If data moves across organizational boundaries, then security and compliance have to move with it.
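The three requirements above can be made concrete in a small sketch: a shared schema gives records the same meaning in every domain, a policy hook enforces governance, and an audit trail gives full visibility. All names here (`SHARED_SCHEMA`, `publish`) are hypothetical, chosen only to illustrate the pattern:

```python
import time

# Shared meaning: one schema that every domain interprets identically.
SHARED_SCHEMA = {"cell_id": str, "kpi": str, "value": float}


def validate(record: dict) -> bool:
    """Reject records that don't carry the shared schema's fields and types."""
    return (set(record) >= set(SHARED_SCHEMA) and
            all(isinstance(record[k], t) for k, t in SHARED_SCHEMA.items()))


def publish(record: dict, policy, audit_log: list) -> bool:
    """Governance and visibility built into the pipeline itself.

    Every decision — accept or reject, and why — lands in the audit log,
    so downstream consumers can see what happened and what effect it had.
    """
    if not validate(record):
        audit_log.append(("rejected", "schema", record))
        return False
    if not policy(record):                  # e.g. a privacy or residency rule
        audit_log.append(("rejected", "policy", record))
        return False
    audit_log.append(("published", time.time(), record))
    return True
```

The point of the sketch is placement: validation, policy, and logging live inside the publish path, not in a separate audit system bolted on afterwards.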

Move 4 — Place AI across the network

Running AI workloads and network workloads on the same compute resources can introduce contention and latency risk. An AI-native architecture addresses this directly. Distributed inference at the RAN node handles the tasks that demand microsecond-level response. Regional or centralized AI clusters take on the compute-intensive workloads that do not require local processing.

This division balances performance, total cost of ownership, and energy efficiency, while avoiding the trap of defaulting to either "everything at the far edge" or "everything in the cloud."
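The "right AI in the right place" principle amounts to a placement policy over latency, data locality, and compute cost. A minimal sketch, with illustrative thresholds invented for this example rather than taken from any real deployment:

```python
def place_workload(latency_budget_ms: float,
                   needs_local_data: bool,
                   compute_heavy: bool) -> str:
    """Choose where an AI workload runs, trading latency against cost.

    Thresholds are illustrative: tasks with sub-millisecond budgets
    (e.g. RAN scheduling) must run at the RAN node; compute-heavy but
    latency-tolerant tasks belong in centralized clusters; the rest
    fit regional sites.
    """
    if latency_budget_ms < 1.0 or needs_local_data:
        return "ran-node"         # distributed inference at the far edge
    if compute_heavy and latency_budget_ms >= 50.0:
        return "central-cluster"  # cheapest home for heavy workloads
    return "regional-site"        # balance of latency, TCO and energy
```

Even this toy version shows why neither "everything at the far edge" nor "everything in the cloud" can be the default: the answer depends on the workload's constraints, not on a single architectural preference.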

The four moves are the on-ramp to 6G

6G will be AI-native by design. Work on the first 6G standard is already underway, with pre-commercial trials expected before 2030 and first commercial services following one to two years later as the ecosystem matures. The upgrade to 6G will enhance a range of capabilities, including AI, which will be integrated into the air interface, scheduling, multi-band system design, and autonomous operations from the start.

But the most important preparation for 6G is not waiting for new standards. Operators that have matured their 5G Standalone foundation, built safe and auditable autonomy, established a unified data foundation, and learned to place AI intelligently will be best positioned to adopt 6G capabilities quickly and cost-efficiently. The four moves above are the practical response to AI-driven demand today — and the clearest on-ramp to the AI-native network of the 2030s.

Where the revenue shows up first

AI-native architecture helps operators reduce operating costs and energy consumption, and it improves network performance and user experience. But the business question is straightforward. Where does new revenue come from?

Three near-term opportunities stand out.

Network exposure APIs allow operators to make connectivity capabilities – such as location, quality of service, routing, and more – programmable and purchasable by developers and enterprises. AI can help optimize the performance and accessibility of these APIs, and may enable more advanced dynamic monetization schemes.
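From a developer's point of view, "programmable and purchasable" means an HTTP call. The sketch below shows roughly what requesting a quality-of-service session could look like; the endpoint path, payload fields, and profile name are hypothetical, standing in for whatever an operator's own or a CAMARA-style API definition specifies:

```python
import json
import urllib.request


def request_qos_session(api_base: str, token: str, device_ip: str,
                        profile: str, duration_s: int):
    """Build a request purchasing a QoS session via a network exposure API.

    Everything here is illustrative: the '/qos-sessions' path and the
    payload shape are invented for this sketch, not a real operator API.
    """
    body = json.dumps({
        "device": {"ipv4Address": device_ip},
        "qosProfile": profile,      # e.g. a guaranteed low-latency uplink tier
        "duration": duration_s,
    }).encode()
    return urllib.request.Request(
        f"{api_base}/qos-sessions",  # hypothetical endpoint
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    # A caller would submit this with urllib.request.urlopen(req).
```

The commercial logic is visible even in the toy: the enterprise pays per session for a differentiated network behavior, and the operator can price it dynamically.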

Differentiated connectivity creates premium tiers for AI applications that require guaranteed uplink performance, low latency, and reliability beyond standard mobile broadband.

And enterprise edge offers target industries where data sovereignty, processing speed, and compliance create willingness to pay for managed, high-performance compute close to the source. Together, these plays turn network capabilities into a platform, and that is the commercial logic of AI-native investment.

The AI-native journey starts now

Becoming AI-native and building the intelligent fabric is a stepwise journey. The first step is to build on 5G Standalone with upgrades to AI-powered 5G, applying autonomy safely, placing AI intelligently and treating data and governance as first-class assets. The four moves above are not sequential. They are mutually reinforcing, and progress on each accelerates the others.

For industry leaders, the priority is to assess readiness across architecture, operations, and data foundations, then pick a small number of high-impact use cases, deliver measurable results, and scale from there. The AI era will reward networks that are not just fast, but trustworthy, adaptive, and purposefully designed for what AI demands.

Read more

Read Erik Ekudden’s first post in this series: AI’s future will be defined by the intelligent fabric

Telecom AI - Networks for AI and AI for networks

Physical AI needs a nervous system

How AI-powered devices will drive the shift to uplink-heavy networks

The Ericsson Blog
