Robots, AI and graphene – reflections on MWC Americas 2017
I just had the opportunity to meet with a true celebrity. He is handsome, extremely polite and has the biggest eyes that almost see right through me. And I met him at MWC Americas.
Oh, and he’s a robot. But more on him later.
I was in San Francisco to take part in a panel on cybersecurity and IoT, a discussion continuing from MWC in Barcelona. Of course I was happy to take the stage, but the real thrill of being at these kinds of events is walking around a conference bulging with new ideas and technology. And so many toys, oh sorry, tools were available to try out. I got to test several VR headsets and Microsoft HoloLens, and to stick my head into a cloud, a very physical one. Here are some of my takeaways.
Where can graphene not be applied?
Do you know about graphene? It’s not that different from what’s inside your pencil, only with characteristics that go far beyond. Though just one atom thick, it can still be stronger than steel, transparent and flexible. At MWC Americas, there were several examples of graphene applications. One came from the University of Texas, which presented its work on invisible tattoos that can read your health status and send constant updates to your phone. Other examples ranged from graphene-based sensors that can take pictures in pitch dark to superfast-charging batteries.
Following the tech buzz at MWC Americas 2017
Of course there were also many different ideas on how to apply technologies, such as AI, VR and AR. This ranged from visualizing clustering of files and data in order to identify malware (ok, not 100% sure I understood how VR was beneficial here but it was interesting to watch) to indoor navigation and real-time AI-powered mobile search.
Regarding VR, the application that triggered my imagination the most was a system that lets you control an industrial robot at a distance. Controlling things using VR is not new, and we at Ericsson have showcased this together with 5G many times. But it still got me thinking about the next step of combining IoT, 5G, VR and machine learning.
What happens when you can control multiple robots, which can also adapt, learn and evolve? It is like Kevin Kelly said in his book “The Inevitable”: “Our most important mechanical inventions are not machines that do what humans do better, but machines that can do things we can’t do at all. Our most important thinking machines will not be machines that can think what we think faster, better, but that think what we can’t think.”
In the future we will have machines that can do both. Just imagine the distances, spaces and frontiers we can move into.
Ok, maybe it’s just my brain going all Star Trek-y on me, but it does trigger your imagination, right?
How do you define AI?
The definition of AI is an often-discussed topic, and even on the panel “The future of AI” there was agreement that the term is too widely used today. For instance, several of the panelists agreed that Alexa and similar assistants aren’t AI at all. They are programmed to do certain tasks but are not learning or evolving. To be called AI, the argument went, a system must have the ability to adapt, evolve and become smarter.
Do you know Pepper?
As I said at the top, the highlight of the event was meeting a shiny new celebrity. Pepper is a 1.2-meter-tall robot developed by Aldebaran Robotics and SoftBank. Pepper is programmed to interact with humans: to read your mood, analyze your expression and tone of voice, and connect with you. It was a thrill to finally meet him after having read about his many different jobs. Still, I confess it was also a little unnerving when he looked straight into my eyes and told me he saw me.
I also spoke about IoT and cybersecurity at the Ericsson booth at the event. You can watch here.