Awareables are not necessarily wearables. Instead, they are what I would call next generation devices that are aware of you, whether you wear them or not. How would you feel about a device that knows you better than you know the device?
I, for one, have already felt a bit spooked. Here’s my story:
When I first got my latest iPhone I just marveled at how sleek and nicely built it was. I even quite liked the strange notch at the top of the screen, for the simple reason that it makes the phone stand out from other phones now that they are all basically plain slates of glass. I set up the face ID function and struggled a bit with not having a home button, but after just a few days it was all second nature.
It wasn’t until a couple of weeks later that something odd happened. There had been a bit of a disagreement at home, and well, my eyes were a bit red and my face a bit puffier than usual. At first, the phone wouldn’t unlock for me.
Suddenly, I had this strange feeling that the phone didn’t want me to use it in the state I was in. I felt very self-conscious, even slightly offended, as if someone had disturbed me in a private moment. I wiped my eyes and tried to present a composed face and “look normal”, as if I were taking a photo for an ID card or passport.
The phone eventually unlocked, but I could not shake the feeling that the phone was watching me and making a judgment.
From that point onwards, I always seem to feel just a little self-conscious when unlocking my phone. I have even gone as far as asking Siri “Do I look happy today?” Siri answered ambiguously: “I’m guessing pretty good. For a human.” Normally, I would dismiss such an answer, but now I can’t help but think Siri is holding something back.
At some point, I came across a couple of Animoji karaoke videos on YouTube and thought they were fun. So I decided to try it myself. After that, I also started pestering friends with Animoji messages. It was a lot of fun, and the device really does capture all your facial expressions in real time.
But then it struck me. The iPhone X is already aware of my mood – it can make happy, surprised, perplexed and sad faces exactly mimicking my own. But presently, all it does with this amazing technology is turn me into a singing turd. Not a big step in my case.
But the next phone could do much more than that. Imagine, for example, that you come across a very surprising post on your social network. The app would note your surprise and immediately fact-check the post for you. That’s something we explore in two of our 10 Hot Consumer Trends of 2018.
Or, let’s say you are watching a YouTube clip of a band playing. Just as you are getting tired of it and are about to stop the clip, the band goes into an uber-cool syncopated section and your ears perk up. The next clip YouTube suggests then has a similar tempo shift, although it is by a different band.
Or, let us say you are shopping for T-shirts on Amazon. Now that the app is aware of you, it can use your spontaneous reactions to learn exactly what you want.
Or – imagine that your phone blocks an incoming call because it is aware that you are in the bathroom…
The applications are endless.
Awareables are not only going to be phones. A similar development is already happening in VR, where eye tracking is being employed to increase the perceived picture resolution by enabling foveated rendering. In other words, by tracking where you look, only the part of the scene you are actually looking at needs to be rendered in full detail, while everything around it can be drawn at lower resolution.
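This kind of gaze-dependent rendering can be sketched in a few lines. The tile coordinates, distance thresholds and quality levels below are invented for illustration; a real headset would do this at the GPU level, but the idea is the same: spend detail where the eyes are.

```python
# Illustrative sketch (not a real VR SDK): choose a relative render quality
# for each screen tile based on how far it is from the user's gaze point.
# Radii and quality levels are assumptions made up for this example.

def quality_for_tile(tile_center, gaze, full_radius=0.1, mid_radius=0.3):
    """Return a relative render resolution (1.0 = full detail) for one tile.

    Both points are normalized screen coordinates in [0, 1].
    """
    dx = tile_center[0] - gaze[0]
    dy = tile_center[1] - gaze[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist < full_radius:   # foveal region: render at full resolution
        return 1.0
    if dist < mid_radius:    # near periphery: half resolution is enough
        return 0.5
    return 0.25              # far periphery: quarter resolution

# Example: the user is looking at the centre of the screen.
gaze = (0.5, 0.5)
print(quality_for_tile((0.5, 0.52), gaze))  # tile right at the gaze point
print(quality_for_tile((0.9, 0.1), gaze))   # tile at the edge of the screen
```

The payoff is that the periphery, which the eye resolves poorly anyway, costs a fraction of the rendering work, so the foveal region can be drawn sharper than the headset could otherwise afford.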
But when VR headsets become aware of what you are looking at, there are tons of other applications. In social VR you could make eye contact, real eye contact, with other people; and in games you could make eye contact with game characters. Aiming and throwing could be much improved, and VR-based retailers could personalize product selections by knowing which parts of a product design actually draw your attention.
And then, we haven’t even mentioned all the benefits to advertisers.
When devices become aware of us, it will also be much easier to share them with others. Take smart speakers, for example: not only would they be aware of different family members and handle their requests differently, they would also be aware of guests and able to treat them individually, at least if the guests already use the same kind of speaker at home. This awareness-based multi-user interface will eventually be extended to public devices that interact differently with each and every person based on who they are, such as an information desk in a shopping mall.
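The dispatch logic behind such an awareness-based multi-user interface is simple to sketch: the same request gets a different answer depending on who the device recognizes. The profiles and responses below are invented for illustration; a real speaker would identify the speaker by voice and fetch their preferences from an account.

```python
# Illustrative sketch: one device, many users. A recognized household member
# gets their own preferences; an unrecognized speaker falls back to a guest
# profile. All names and preference fields are assumptions for the example.

PROFILES = {
    "alice": {"news_source": "business headlines"},
    "bob":   {"news_source": "sports headlines"},
}

GUEST_PROFILE = {"news_source": "top headlines"}

def handle_request(user_id, request):
    """Answer a request using the recognized speaker's own preferences."""
    profile = PROFILES.get(user_id, GUEST_PROFILE)  # guests get defaults
    if request == "news":
        return f"Here are your {profile['news_source']}."
    return "Sorry, I don't know that one."

print(handle_request("alice", "news"))    # household member: personalized
print(handle_request("visitor", "news"))  # unknown speaker: guest defaults
```

The interesting design question is the fallback path: a device that is aware of who it does not know can still serve that person gracefully instead of refusing them.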
But before we pronounce the sky as the limit for the coming era where devices know us better than we know them, there are also some serious problems.
The most obvious issue with awareables like these is of course the increased privacy dilemma. At Ericsson ConsumerLab, we have looked into privacy several times, and we will certainly be coming back to it!
A more subtle issue is how humans react when it becomes difficult to distinguish between people and machines. Although more than half of current users of intelligent voice assistants believe we will soon interact with devices using body language, intonation, touch and gestures, just as we do with people, that doesn’t mean we will treat devices as humans.
Quite to the contrary.
As we point out in our most recent Ericsson ConsumerLab trend report, machines that mimic human communication can make people feel surprisingly awkward. In fact, 50 percent of respondents said that not being able to tell the difference between human and machine would spook them. Even more interestingly, 40 percent also stated that it would be spooky if their smartphone could see when they are happy, sad or bored and respond accordingly.
Personally, I am even more spooked by the fact that my phone already knows my mood and does not act on that knowledge!
Awareables are definitely coming. The question is, how will we respond to them? In our research, as many as one in three would already like to wear glasses that make it impossible for the facial recognition software in their smartphone or social network to recognize them. Will the benefits outweigh the issues?
Go deeper inside Ericsson ConsumerLab’s 10 Hot Consumer Trends for 2018 here, and let us know if you feel comfortable with technology that is more aware of you than you are of it. Also, don’t miss out on Michael’s podcast with Bloomberg Markets’ Carol Massar and Cory Johnson on how artificial intelligence will impact our daily lives.
In case you missed it: Don’t forget to check out the podcast for our 10 Hot Consumer Trends for 2018 and beyond, with Pernilla Jonsson, Head of Ericsson ConsumerLab and Michael Björn, Head of Research at ConsumerLab.