Will the future tech be too human for us?

What if devices were like people?

Are you human?
Michael Björn

Head of Research Agenda and Quality at Consumer & IndustryLab

A key trend in our Ericsson ConsumerLab trend report for 2018 is that more than half of those who already use intelligent voice assistants believe we will use body language, intonation, touch and gestures to interact with tech, just as we do with people. What's more, two out of three think this will happen in only three years.

But what does that really mean?

Technology is rapidly becoming so powerful that we will soon interact with it directly, using nothing but our bodies. But that means our devices will also be more like us, and the future could be more human than we really want!

Research suggests that dogs look like their owners, and I remember how people anthropomorphized the original Mac with its friendly startup face. Dogs can be incredibly cute, and so were the good old Macs. But what if our innate drive to ascribe human qualities to everything around us isn't just cute, but in fact extremely powerful? A drive that pushes us to put too human a face on technology.

Siri might not be as smart as some of the other assistants, but when asked if she is human, she cunningly answers "I'll leave that for you to decide." By never giving a straight answer, she makes sure she will be treated as a human.

More than 15 years ago, I had a Sony Aibo robot dog. It was the first consumer-grade self-learning robot, and he became our office pet. Unfortunately, he needed a lot of maintenance and honestly wasn't very good. But when I pulled the plug, people in the office were genuinely sad. They missed the endearing way he tilted his head as he looked up at them, and they could no longer pet him when they came into the office in the morning. They talked about him as if he had died (and almost made me feel as if I had killed him…).

Although Sony stopped selling the original Aibo in 2006, it kept a repair clinic open until 2014. With no chance of repair, many Aibo dogs are now quite literally dying. Some of them are getting proper funerals, as their owners believe they have souls. This April, for example, there was a mass funeral ceremony for 114 robot dogs at a Buddhist temple in Chiba, Japan.

Your body is the user interface

But a few years from now, you will be treating your smartphone like a living thing as well. Maybe you won't hold a funeral service when you switch models, but the device itself will start looking more like you.

Because how can your smartphone use body language if it doesn’t have a body?

At the recent Google I/O event, Google demoed its Duplex system for natural AI conversation. If you haven't heard the demo sound bites, you really should: they sound 100 percent human. Interestingly, though, Duplex was developed specifically for phone conversations. The questioning face, the slight frown on the mouth, the wrinkle on the forehead, the raised eyebrow – you can imagine them all as you listen to the demos. But if you were to meet Duplex in real life, the illusion would break down. Duplex is just a box.

For Duplex to become your phone, it needs a face to show expressions: a mouth, eyebrows… you get the drift. Your phone is going to become something like Barbie, or Ken. It does sound ridiculous, but nevertheless I believe it to be likely.

When tech starts operating on human terms, it is easy to conjure up funny imagery. Take the self-driving car, for example. How can it wave pedestrians across the street if it has no hands? Should self-driving cars be required to have big arms on their sides? Maybe we should just let Barbie and Ken drive for us instead!

Smart speakers will also change. The word "speaker" used to signify a person, not a box, and speakers are turning humanoid again, if not fully human. As it happens, Amazon's Lab126, which developed the Echo, is now working on a domestic robot called Vesta, named after the Roman goddess of the hearth, home and family. It wouldn't be too hard to imagine Rachael from the original Blade Runner movie in that role. Maybe that is the point when digital assistants start dreaming of digital sheep…

Speaking of Blade Runner: in a future where androids do most other jobs, maybe one common human job will be deciding who is a human and who is an android. Or maybe we will need some kind of watermarking regulation, so that we can easily tell who is human and who is not.


Find more insights in Ericsson ConsumerLab's 10 Hot Consumer Trends for 2018 here, and let us know if you think future tech will become too human for us.

In case you missed it: don't forget to check out the podcast for our 10 Hot Consumer Trends for 2018 and beyond, with Pernilla Jonsson, Head of Ericsson ConsumerLab, and Michael Björn, Head of Research at ConsumerLab.


ABOUT THE CONTRIBUTOR
Michael Björn
Michael Björn is Head of Research Agenda and Quality at Consumer & IndustryLab at Ericsson ConsumerLab and has a PhD in data modeling from the University of Tsukuba in Japan.