Telehaptic Drone Control (Feel the Force)

As part of our ongoing explorations of remote-controlling things, we recently experimented with remote-controlled drones and haptic augmented reality. It turned into a nice prototype as well as a showcase for something that future network technologies like 5G will enable. We demonstrated the whole thing at Mobile World Congress 2016, even live on stage with our CEO during his MWC16 keynote.


Drones, from cheap plastic toys to advanced carbon fiber rigs carrying high-quality cameras, are interesting because they are quickly becoming commonplace gadgets. The pro ones are already standard equipment in video production, news reporting and real estate photography, and they are even used for visual inspections of hard-to-reach infrastructure such as power lines or mobile base stations. Parcel delivery is likely the next frontier. A bit further into the future, it is not unlikely that drones become capable of performing many more kinds of tasks, either autonomously or remotely operated by a person. Manoeuvrability has to improve first, though; it just isn't precise enough yet. (Not much of an issue if you're making a film, but critical if you want a drone to do painting, knitting or mounting a bolt.)

In a previous project, where we remote-controlled an excavator, we discussed how haptic feedback could be used to guide the control. So we gathered a team from the Device Technologies and Service & Media Networks research groups at Ericsson Research and built a prototype system for remote precision control of drones, assisted by haptic augmented reality.

Augmented reality is mostly talked about as something visual, but here we are talking about digital elements that are not seen but instead sensed physically. In this project we both controlled the drone and produced the sense of touch through a spatial haptic control device: a special joystick with powerful and precise motors for 3-axis force feedback. This enabled us to feel the movements of the remote-controlled drone as well as its interactions with real and virtual environments. For example, in order to limit the drone's access to certain areas, we could place a virtual wall or other objects where the drone should not be able to go. When the real drone hit a virtual object, the physical characteristics of the impact were generated and played back through the joystick, and it felt like a mechanically transmitted impact with a physical object. The drone itself behaved accordingly, bouncing back in mid-air as if hitting an invisible force field.
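A common way to render this kind of contact is a penalty-based spring-damper force: while the drone penetrates the virtual wall, a force proportional to the penetration depth pushes it back out, and the same force is played back on the joystick. Here is a minimal sketch of that idea; the wall position, stiffness and damping values are illustrative assumptions, not parameters from the actual prototype.

```python
import numpy as np

# Illustrative constants (assumptions, not project values)
WALL_X = 2.0        # wall plane at x = 2 m; the drone may not pass beyond it
STIFFNESS = 400.0   # N/m, spring constant of the virtual wall
DAMPING = 5.0       # N*s/m, damping to keep the contact stable

def wall_force(position, velocity):
    """Reaction force (N) for a drone at `position` moving with `velocity`.
    Zero while the drone is clear of the wall; a spring-damper push-back
    once it penetrates. The same force vector would be sent both to the
    drone's flight controller and to the joystick motors."""
    penetration = position[0] - WALL_X
    if penetration <= 0.0:
        return np.zeros(3)
    # Spring pushes the drone back out; damper opposes the approach velocity.
    fx = -STIFFNESS * penetration - DAMPING * velocity[0]
    return np.array([fx, 0.0, 0.0])
```

Because the same force acts on both the joystick and the drone's setpoint, the pilot feels the impact at the same moment the drone bounces back.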

We can also give the virtual objects simulated characteristics, much the same way as in a computer game: softness, hardness, vibration, magnetism, viscosity, different material textures, bounciness, stickiness and so on.
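Two of these effects are easy to sketch: viscosity can be modelled as a force opposing velocity inside a volume, and a surface texture as a periodic lateral force as the contact point slides along the surface. The coefficients below are made-up illustrations, not values from the prototype.

```python
import math

def viscous_force(velocity, viscosity=3.0):
    """Inside a viscous volume, oppose motion proportionally to speed
    (viscosity in N*s/m, an assumed coefficient)."""
    return [-viscosity * v for v in velocity]

def texture_force(surface_pos, amplitude=0.5, wavelength=0.01):
    """A simple grating texture: a sinusoidal lateral force (N) as the
    contact point slides along the surface (positions in metres)."""
    return amplitude * math.sin(2 * math.pi * surface_pos / wavelength)
```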

The really interesting part is that the virtual elements can act as guiding structures for steering. In fact, haptic augmented reality can enable manoeuvres that would be virtually impossible when flying freehand. It is, for example, extremely difficult to manually keep a drone in an exact fixed position relative to another moving object. To simultaneously control another device attached to the drone, such as a camera gimbal or a robot arm, is essentially impossible for a single person.

Something like that could be made possible by using virtual forces in a haptic augmented reality layer. Let's say the job is to inspect a wind turbine blade. Place a virtual magnet on the blade, simply fly the drone towards it and snap on. Now the pilot can let go of the drone and focus on controlling a camera instead, while the drone hovers in the air at whatever position the virtual magnet holds. What makes this better than the current techniques of combining an altimeter, downward-facing cameras and GPS is that the virtual magnet can be attached to a real object that is moving. The magnet could also be shaped, for example as a rail along which the drone can travel.
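The virtual magnet boils down to a spring force that pulls the drone toward an anchor point, where the anchor can itself be tracked on a moving object. A minimal sketch, with an assumed capture radius and gain:

```python
import numpy as np

# Illustrative parameters (assumptions, not project values)
CAPTURE_RADIUS = 0.5   # m: start attracting once inside this distance
GAIN = 8.0             # N/m: spring pull toward the anchor

def magnet_force(drone_pos, anchor_pos):
    """Spring force (N) pulling the drone toward the anchor point.
    The anchor may move with the tracked object (e.g. a point on a
    turbine blade); outside the capture radius there is no pull."""
    offset = np.asarray(anchor_pos, dtype=float) - np.asarray(drone_pos, dtype=float)
    dist = np.linalg.norm(offset)
    if dist > CAPTURE_RADIUS or dist == 0.0:
        return np.zeros(3)
    return GAIN * offset
```

Re-evaluating this force each control cycle with the anchor's latest tracked position is what lets the drone follow a moving object without any pilot input.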


These augmented elements provide a kind of dynamic, in-situ semi-automatic mode. The pilot can easily snap onto a magnet for situated semi-automation, or pull off to regain manual control. This is akin to a camera lens where you can just grab the focus ring and turn it manually to override the autofocus.
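One way to realise this snap-on/pull-off behaviour is a breakaway threshold: the magnet holds the drone until the force the pilot applies through the joystick exceeds the threshold, at which point manual control resumes. A sketch under that assumption (the threshold value and mode names are hypothetical):

```python
BREAKAWAY_FORCE = 4.0  # N: assumed pilot force needed to pull off the magnet

def control_mode(attached, pilot_force, within_capture):
    """Return (mode, new_attached): "auto" while snapped onto the magnet,
    "manual" otherwise. Pulling harder than BREAKAWAY_FORCE detaches;
    entering the capture radius while detached snaps on."""
    if attached and pilot_force > BREAKAWAY_FORCE:
        return "manual", False      # pilot pulls off, like grabbing the focus ring
    if not attached and within_capture:
        return "auto", True         # drone snaps onto the magnet
    return ("auto" if attached else "manual"), attached
```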

Haptics is actually a very interesting interaction design paradigm for augmented reality, as it taps into our fundamental experience of the physical world. The interaction is therefore easily understood, with hardly any cognitive effort at all. It's also fun!


The prototype system was designed and developed in-house at Ericsson Research, leveraging a number of available open source components. The remote control and haptic feedback algorithms were developed using the Robot Operating System (ROS) and CHAI3D. The drones were Crazyflie 2.0 quadcopters from Bitcraze, fitted with induction modules for charging. For positioning and tracking we used an Oqus 5+ motion capture system from Qualisys, who was also an outstanding technology partner for this project. Without them the project would never have happened.
Our haptic devices, aka joysticks, came from the open source project WoodenHaptics, developed by researchers at KTH and Stanford University.

We also developed software for fully automated, push-button take-off and landing. Landing was even triggered when a drone's battery needed charging, and included return trajectories with dynamic obstacle avoidance and precise touchdown on an induction charger. The chargers were Panasonic QE-TM101 units with auto-adjusting induction elements, which allowed for slightly off-target landings.
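The battery-triggered return-and-land behaviour can be pictured as a small state machine. This is a hypothetical sketch of that logic, not the project's actual implementation; the states and the battery threshold are assumptions.

```python
LOW_BATTERY = 0.2  # assumed fraction of charge that triggers a return

def next_state(state, battery, at_charger):
    """Advance the drone's mission state one step.
    flying -> returning (battery low) -> landing (over the charger)
    -> charging. Trajectory planning and obstacle avoidance would run
    inside the "returning" state."""
    if state == "flying" and battery < LOW_BATTERY:
        return "returning"
    if state == "returning" and at_charger:
        return "landing"
    if state == "landing":
        return "charging"
    return state
```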


A case for 5G

The whole system and prototype is a lab-only showcase. It doesn't work in the real world, since the technology simply doesn't exist. Yet. For us this becomes a good challenge for developing future technology: the combination of simultaneous haptic remote control, augmented reality including positioning and tracking, and streaming of ultra-high-definition multimedia adds up to some pretty hefty requirements on the network. In addition, the drones are battery driven and need their batteries for their motors, so it's not optimal if they also have to run heavy on-board computing. Hence it makes sense to run the entire operating software in the cloud, which we did in the prototype. This makes crazy-quick and reliable connectivity even more crucial.

Advanced long-distance remote control of drones using haptic augmented reality requires a super-performance network with integrated cloud services, extremely low latency and strong security. Someone somehow has to solve three-dimensional positioning, and then of course there are regulatory aspects to this. Nevertheless, future mobile network technology, starting with 5G, will enable drones, airborne or other kinds, to become a global platform for remote operations. Drones with 5G would make it possible to replace an LED in a streetlight in China, pick cloudberries on a marsh in Sweden, replace a sensor on a fish farm in Norway or clean solar panels in Portugal. The same day. From home.

Yes, lots of future work will be automated, and more or less autonomous machines will be used for an increasing number of tasks. But people will still matter, because there will still be plenty of tasks requiring human skills and judgement. And whether those tasks are done manually, in collaboration with a robot or with other people, the possibility to perform practical tasks, humanly and skilfully, without the need to be physically present may have the power to transform many industries. Perhaps even global economies.

For more, the BBC filmed and explained the demo quite nicely here (UK only), as did CNet and Ny Teknik (Swedish only), or you can watch Cristian below.



The Ericsson Blog
