Offloading for the future: current use cases and scenarios
- Moving compute-heavy tasks away from a resource-constrained device, such as a drone, is called computational offloading.
- Computational offloading provides the flexibility for tasks to be run where they will perform most efficiently.
- Different offloading configurations are possible. Offloading can be tuned to meet the use case requirements, especially in terms of the degree of offloading, the architecture, and the scheduling.
How does an autonomous airborne delivery drone navigate through a city, avoiding both stationary obstacles and unpredictable hazards? To perform such functions without direct human control requires a great deal of complex, power-hungry computation, including image recognition, path planning, and environment mapping. Is it possible that such an array of tasks could be performed by a vehicle designed to be both cheap and lightweight? Compared with a typical desktop computer, the defining constraint on such a vehicle is the weight that must be lifted into the air. The greater the processing capability, the heavier the drone, and the lower the payload that can be carried. And this does not take into account the array of sensors, particularly cameras or LiDAR (light detection and ranging), that would be needed to gather information for processing. So given that drones have limited battery and payload, how are they capable of performing computational work that would normally require the resources of a data center in the cloud?
The answer to that question concerns both network connectivity and the Internet of Things (IoT). Stepping back from the drone, it’s possible to observe that there are a number of cameras attached to buildings and poles. Looking up, a 5G antenna can be seen emerging from a small building in the distance. Whereas at one time a drone like this would need to keep all sensors and processing onboard, with a consequent burden in weight, financial cost, and power drain, now the resources available to the drone extend into the network, reducing the processing load on the device itself. The cameras capture images of the area and send them through the network to a local edge data center, located physically close to the drone for optimal performance, where the images are processed to build up a model of the environment and the objects passing through it. The drone only needs to send its own positional and directional data, obtained using conventional GPS equipment, which is combined with a map built up from the external cameras. This allows navigation software running in the edge to compute a safe path toward the delivery destination. That information is then sent back to the drone within a bounded time period to be translated into control instructions.
Moving compute-heavy tasks away from a resource-constrained vehicle like a drone allows these tasks to run where they will perform most efficiently, moving between edge locations as the drone moves, according to such factors as resource availability and network quality. It also gives a service provider the flexibility to define simple service level agreements (SLAs) according to required performance metrics, as a way of informing the offloading framework when a change of deployment is needed. Because tasks run remotely from the drone, functionality can be adapted on the fly without recalling the device. But all this is only possible if the network responds fast enough and can carry enough data to support the increased traffic. This is why computational offloading in real-world services, rather than in tightly constrained environments such as factories or research laboratories, has only started to become a reality with the introduction of 5G and the upcoming 6G.
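The SLA-driven decision described above can be pictured with a minimal sketch. Assuming a hypothetical framework that periodically probes the link to the current edge site, the check reduces to comparing measured metrics against the SLA thresholds; the names, thresholds, and metrics below are illustrative, not from any specific product:

```python
from dataclasses import dataclass

@dataclass
class Sla:
    max_latency_ms: float       # round-trip deadline for control responses
    min_throughput_mbps: float  # bandwidth needed for the sensor streams

def needs_redeployment(sla: Sla, measured_latency_ms: float,
                       measured_throughput_mbps: float) -> bool:
    """Return True when the current edge site no longer meets the SLA,
    signalling the offloading framework to move tasks elsewhere."""
    return (measured_latency_ms > sla.max_latency_ms
            or measured_throughput_mbps < sla.min_throughput_mbps)

sla = Sla(max_latency_ms=50.0, min_throughput_mbps=100.0)
needs_redeployment(sla, 72.0, 140.0)   # latency breach, so True
needs_redeployment(sla, 18.0, 140.0)   # SLA met, so False
```

In a real framework, such a check would feed into placement logic that selects the next edge site, but the principle is the same: the SLA turns performance requirements into a machine-readable trigger for redeployment.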
It is anticipated that as network capability increases, more and more compute and sensory tasks will be migrated away from the vehicle, turning delivery drones into service-driven commodity items rather than specialist custom vehicles. The software-driven application extends beyond the vehicle into the edge and cloud, with the network at the core.
Offloading use cases
Offloading is clearly beneficial for autonomous airborne delivery drones, but it also applies to a wide variety of other use cases.
Some of the most promising use case categories include mine inspection through autonomous vehicles, mobile robots in factory facilities, and service robots in public environments - as shown in the first figure above. In these use cases, vehicles currently operate on non-public roads or in limited areas such as factory facilities. They usually move quite slowly, so they avoid the critical safety challenges faced by autonomous vehicles on public roads. The main idea is to offload much of the heavy computing, which makes it possible to reduce hardware cost and size, increase battery life, and simplify maintenance. For driver-assistance services in passenger vehicles - also illustrated in the first figure above - offloading is only possible for infotainment applications due to safety requirements. Autonomous farming machinery follows the same pattern as mobile robots: offloading much of the heavy computing means that machines such as tractors can be manufactured more cheaply and used for longer periods, since the software is updated on the edge or cloud rather than on the tractor itself.
The second family of use cases are those using extended reality (XR) technology. We have already discussed how device mobility is a crucial driving factor for offloading in the case of drones and similar self-navigating vehicles, but other categories of applications can also take advantage of moving computation away from a lightweight user device. A prominent example is XR, in which the user device consists of a pair of glasses on which images are overlaid. This is generally a very compute-intensive activity, and given that glasses need to be as light as possible to be comfortable, any onboard compute resource, such as storage, processing capacity, or a high-capacity battery, would cause discomfort to the wearer.
In the case of an XR device, onboard sensors would consist of a camera and potentially GPS and a gyroscope to measure orientation. The principal onboard tasks would consist of video streaming and decoding, with any image rendering and game engine processing occurring in the network. Given how sensitive XR applications are to latency, and the strict bounds needed to provide an acceptable user Quality of Experience (QoE), the most logical offloading site is a nearby edge location.
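The latency reasoning behind choosing an edge site can be sketched as a simple selection rule. The site names, latency figures, and latency budget below are hypothetical placeholders; a real XR system would measure round-trip times continuously rather than use a static table:

```python
# Hypothetical round-trip latency estimates (ms) to candidate rendering sites.
SITES = {"edge": 12.0, "regional_cloud": 45.0, "central_cloud": 110.0}

def choose_render_site(budget_ms, sites=SITES):
    """Select the lowest-latency site that fits the rendering latency
    budget; fall back to on-device rendering if none qualifies."""
    viable = {name: lat for name, lat in sites.items() if lat <= budget_ms}
    if not viable:
        return "device"
    return min(viable, key=viable.get)

choose_render_site(20.0)   # edge fits a 20 ms budget
choose_render_site(5.0)    # nothing fits, so render on-device
```

With an illustrative 20 ms budget only the edge qualifies, which matches the intuition above: for QoE-critical XR workloads, the nearby edge is the natural offloading site.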
Offloading is a flexible mechanism, so its implementation may vary depending on the use case and situation. Above, the offloading concept has been illustrated with two use cases: the autonomous airborne delivery drone and XR.
As one ponders the huge variety of use cases, it would be logical to conclude that different offloading configurations should be possible, and that is indeed correct! Offloading can be tuned to meet the use case requirements, especially in terms of the degree of offloading, architecture, and scheduling.
The degree of offloading represents the portion of processes in the application that are offloaded to another machine. The options are no offloading, partial offloading, and full offloading. Architecture refers to the nature and location of the available machines to which processes can be offloaded: edge, cloud, or a combination of both. Scheduling refers to the decision of when to offload. It can be done statically, which means the system executes a predetermined configuration based on offline performance predictions and profiling. Scheduling can also be performed dynamically, where the system observes conditions online and decides on the fly which offloading configuration to run. This requires that the application has access to services that can inspect the changing environment in which both offloaded and non-offloaded processes are running, and move processes to locations where they will run more optimally.
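A dynamic scheduler choosing among the three degrees of offloading can be sketched as follows. This is a minimal illustration assuming the scheduler observes link latency and throughput online; the threshold values are invented for the example and would in practice come from profiling:

```python
from enum import Enum

class Degree(Enum):
    NONE = "no offloading"
    PARTIAL = "partial offloading"
    FULL = "full offloading"

def schedule(latency_ms, throughput_mbps,
             full_lat=30.0, full_tp=200.0,       # link quality needed for full offloading
             partial_lat=80.0, partial_tp=50.0): # weaker bar for partial offloading
    """Dynamic scheduling: pick the highest degree of offloading that
    the currently observed link quality can support."""
    if latency_ms <= full_lat and throughput_mbps >= full_tp:
        return Degree.FULL
    if latency_ms <= partial_lat and throughput_mbps >= partial_tp:
        return Degree.PARTIAL
    return Degree.NONE    # keep everything on the device

schedule(20.0, 300.0)    # strong link: full offloading
schedule(60.0, 100.0)    # moderate link: partial offloading
schedule(120.0, 10.0)    # weak link: no offloading
```

A static scheduler would instead fix one of these outcomes at deployment time from offline profiling; the dynamic version simply re-evaluates the same decision as conditions change.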
The importance of offloading in the various technologies present in our daily lives is clear. In addition, offloading can enhance the user experience and reduce device costs, since simpler hardware will suffice on the user side. It is important to remember that to make all this possible, a stable communication link providing low latency and high throughput is needed. This means that the communication requirements of a particular use case must be satisfied; otherwise, the tasks should remain running on the device. Therefore, to achieve the full potential of offloading, advances in communication technology, such as 6G, will be fundamental.
At the Ericsson Blog, we provide insight to make complex ideas on technology, innovation and business simple.