Quadcopters and Contextual Communication
“It would be cool to share a live video stream from a remote controlled quadcopter into a web application using WebRTC.”
This idea resulted in a summer internship at Ericsson Research Luleå on the topic of contextual communication, in which we investigated and implemented a prototype application.
The goal of this summer job was to develop a prototype application capable of streaming a live video feed from a quadcopter (Parrot AR.Drone 2.0), controlled by a handheld device via WiFi, to a collaborative web application.
We already know that video traffic will dominate, accounting for at least 50 percent of all traffic on the network, and that it will be driven by three types of video: consumption, creation and communication. Consumption of video has developed beyond television and YouTube. We have seen an increase in user-generated content for both consumers and businesses, and video collaboration is quickly being adopted across all platforms.
Okay, interesting, but what is the use case? Well, imagine a situation where the live video feed from a flying drone or quadcopter can instantly be shared with people (experts) via a collaborative web application for problem solving (remote locations, hazardous areas, etc.).
Figure 1: System Overview
First, we decided to use the Parrot AR.Drone 2.0 quadcopter because it has an open API platform with shared source code, which was useful when investigating ways to direct the video feed from the quadcopter to devices other than the controlling device.
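To give a flavor of what that investigation involves: the AR.Drone 2.0 serves its camera as an H.264 stream over TCP on port 5555, with each frame wrapped in a Parrot Video Encapsulation (PaVE) header described in the AR.Drone SDK documentation. A minimal sketch of unwrapping the start of that header might look like the following (field offsets taken from the SDK docs; treat the details as an illustration rather than a complete parser):

```python
import struct

def parse_pave(data: bytes):
    """Parse the start of a PaVE (Parrot Video Encapsulation) header.

    The AR.Drone 2.0 prepends a PaVE header to every H.264 frame it
    sends on TCP port 5555. The first fields are little-endian:
    4-byte "PaVE" signature, uint8 version, uint8 video codec,
    uint16 header size, uint32 payload size.
    Returns (header_size, payload_size) for one framed video frame.
    """
    signature, version, codec, header_size, payload_size = struct.unpack_from(
        "<4sBBHI", data, 0)
    if signature != b"PaVE":
        raise ValueError("not a PaVE frame")
    return header_size, payload_size
```

In a receive loop one would read `header_size` bytes, parse them as above, then read `payload_size` bytes of raw H.264 and hand them to a decoder or, as in our prototype, to a forwarding pipeline.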
Secondly, we wanted to share the live video feed in a web application framework called Spaces. Spaces is a framework, developed at Ericsson Research, for building a new class of web applications for personal communication and collaboration, designed around a shared room metaphor. A Space is like a shared room that can be dynamically configured and contain the tools, data and users relevant to the current context.
Thirdly, we used OpenWebRTC, running on an Ubuntu machine, to forward the live video feed from the quadcopter to a shared Space. Finally, we used a Widget available in the shared Space that activates the live stream. A Widget, in the Spaces context, is a web application provided with an API that makes it possible to offer a shared experience to all users in a Space.
Giving the participants partial or full control of the quadcopter is a very doable addition to this prototype, since Spaces has excellent support for a collaborative experience. The Parrot open API platform also provides excellent documentation of the commands for controlling the quadcopter.
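Concretely, the AR.Drone is controlled with plain-text "AT" commands sent over UDP to port 5556, so giving a Space participant control largely amounts to relaying such strings. A minimal sketch, with the constants and float encoding taken from the SDK documentation (the one subtlety is that `AT*PCMD` encodes each float argument as the int32 sharing its bit pattern):

```python
import socket
import struct

DRONE_IP = "192.168.1.1"   # default address on the drone's own WiFi network
AT_PORT = 5556             # UDP port for AT commands, per the SDK docs

def f2i(f: float) -> int:
    """AT*PCMD sends floats as the signed int32 with the same bit pattern."""
    return struct.unpack("<i", struct.pack("<f", f))[0]

def at_ref(seq: int, takeoff: bool) -> str:
    """Takeoff/land command; 0x11540000 is the constant base, bit 9 = takeoff."""
    value = 0x11540000 | ((1 << 9) if takeoff else 0)
    return f"AT*REF={seq},{value}\r"

def at_pcmd(seq: int, roll: float, pitch: float, gaz: float, yaw: float) -> str:
    """Progressive movement command; flag 1 enables the roll/pitch arguments."""
    args = ",".join(str(f2i(v)) for v in (roll, pitch, gaz, yaw))
    return f"AT*PCMD={seq},1,{args}\r"

# Sending is a single UDP datagram per command, e.g.:
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# sock.sendto(at_ref(1, takeoff=True).encode(), (DRONE_IP, AT_PORT))
```

Sequence numbers must increase monotonically, and the drone expects commands at a steady rate, so a real controller would run a small send loop rather than fire one-off datagrams.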
Since the OpenWebRTC framework was developed for embedded devices, one could argue for running this prototype on the quadcopter itself, together with a 3G/4G modem. The quadcopter has support for a USB memory stick for recording the video stream, and that USB port could be an alternative host for a 3G/4G modem. The question that needs more investigation is whether the quadcopter's onboard computer is powerful enough, or whether the quadcopter could carry a Raspberry Pi in addition to the modem.
In the end, this summer student project allows us to build an understanding of, and continue the dialogue about, the technical challenges and requirements involved, including latency, video quality and so on. In that conversation, we can discuss the possibility of viewing video itself as a rich sensor: if you think about video as a sensor, or as an information flow, there is a lot of information in a video feed.