What if the tech singularity is a reality explosion?
Many AI researchers believe we are heading towards a tech singularity in the form of an intelligence explosion: the point at which AI will independently (and rapidly) self-improve. Beyond that point, we cannot predict anything, since this superintelligent AI will surpass humanity.
But what if the tech singularity is something totally different? What if it is a reality explosion: a point beyond which we cannot predict anything, since reality can be anything imaginable? To put it another way, what if we could no longer tell the difference between a physical item and a digital one?
According to our latest consumer trends report, many early adopters of AR and VR think this will happen. We call this experiential effect Merged Reality. Consumers predict that the first merged reality experience will be found in gaming; more than 7 in 10 respondents in our report believe VR game worlds will look indistinguishable from physical reality by 2030. After spending over 2,000 hours in VR myself, I am inclined to agree.
AR and VR are popular technologies that currently make it possible to mix digital experiences into physical reality along the so-called reality-virtuality continuum. But for such realities to merge beyond meaningful separation, other technologies will also be needed. Let us tread the path where the lines between the physical and virtual worlds begin to blur.
On the most basic level, physical and digital experiences are already merging for all of us. Most everyday activities, from talking with people to shopping, are becoming a spaghetti-like tangle of online and offline activities. Ten years ago, we still divided the world into two – our physical existence, and its digital shadow. We used to call these halves ‘offline’ and ‘online’, but these words no longer carry much weight, and are falling into disuse. As long as we manage to buy that shirt or talk to that person, offline or online no longer matters.
Today, we could still make those distinctions if we wanted. In the future, however, we might not be able to separate them at all. I have already had several such experiences, and I would like to share them with you.
1. Immersive body motion
The first experience I had was playing A Chair In A Room: Greenwater in room-scale VR (meaning that when you move around physically in a room, your movements are reflected in the virtual reality world you are exposed to).
It turns out that what causes the extreme immersion in this game is its ability to dynamically resize the virtual world to fit the actual physical space you are moving around in. You physically move around, pick up items and interact with the world around you over the course of several hours of psychological horror. In the story, you start out as a patient in some sort of mental ward and then find clues about your own past.
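The resizing trick the game uses can be sketched in a few lines. This is a hypothetical illustration, not the game’s actual code: assume the playable level and the physical room are simple rectangles on the ground plane, and pick a uniform scale factor so that proportions are preserved.

```python
# Illustrative sketch of room-scale world resizing. All names and
# numbers are assumptions for the example, not taken from the game.

def fit_world_to_room(world_size, room_size):
    """Return a uniform scale factor that fits a (width, depth) virtual
    level into a (width, depth) physical play area without distorting
    its proportions."""
    sx = room_size[0] / world_size[0]
    sz = room_size[1] / world_size[1]
    return min(sx, sz)  # uniform scale: limited by the tighter axis

def scale_point(point, scale):
    """Apply the scale to a virtual-world ground-plane point (x, z)."""
    return (point[0] * scale, point[1] * scale)

# A 4 m x 6 m virtual room squeezed into a 2 m x 2 m play space.
scale = fit_world_to_room((4.0, 6.0), (2.0, 2.0))
```

A uniform scale is limited by the tighter axis; scaling each axis independently would fill the room better, but it would distort distances and break the body’s sense of scale that makes the immersion work.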
Back then, in the autumn of 2016, I had not experienced anything similar before, and I soon got so immersed in the experience that I literally could not separate it from reality. As a result, I got extremely scared; incomparably more scared than I have ever been watching a horror movie. I have tried a few more horror games in VR since then, but it is almost as if I am traumatized beyond the point of logical reason: I go in repeating ‘this is not true, this is not happening’ under my breath, but soon my pulse is so loud and my legs are so shaky that I cannot continue. I can tell my intellect that it’s just a digital game, but my instinct won’t listen; at some level my mind is convinced that VR is reality, and I cannot shake that off.
2. The importance of spin and impact
My next experience of truly merged reality was with VR table tennis. As a teenager, I played ball sports every day, and among other things I was in one of the national table tennis leagues. It all came back to me very quickly when I fired up the table tennis game Eleven in my VR headset. Again, it was soon clear that my body fully accepted the digital ball and table as physical reality; I was simply playing table tennis.
This time round it wasn’t so much innovation in movement that made the difference, but rather the lucky coincidence that the hand controller felt like a table tennis racket, with the position sensor on the top of the controller roughly creating the same balance that the paddle area does in a table tennis racket. In addition, the vibration feedback in the controller managed to convincingly recreate the feeling of a table tennis ball on impact.
A year later, I was in Japan advising students on thesis work, and one evening a table tennis tournament was arranged. As it happened, several of the students were pretty good at table tennis. But even though I hadn’t physically held a table tennis racket for 35 years, I managed to come second in the tournament, thanks to all the hours I had played the game in VR. There is no such thing as ‘muscle memory’; it all really happens in the brain, and my brain had firmly made the connection between my VR practice and the real thing. The physical modelling of the spin and speed of the ball, as well as the impact of the racket, was good enough.
3. Modelling the ludicrous
However, just because something is based on physics modelling, it doesn’t necessarily feel real. Take, for example, the game Gorn, which describes itself as a ‘ludicrously violent VR gladiator simulator’. That says it all, really.
Nevertheless, in Gorn those ludicrous fights are modelled physically, meaning that when you hit someone, the impact is not animated, but instead calculated using parameters such as speed and impact area. Hence, the fight is not scripted like in a traditional game where a limited number of animations are played again and again, based on what you do.
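The difference between a scripted hit and a physics-driven one can be sketched as a toy model. Everything below is an illustrative assumption, not Gorn’s actual code: damage is derived from the kinetic energy of the swing spread over the contact area, so the same swing hits much harder with a blade edge than with a broad surface.

```python
# Toy physics-driven impact model. Formula and constants are
# illustrative assumptions, not taken from any real game.

def impact_damage(mass_kg, speed_m_s, contact_area_m2, toughness=1000.0):
    """Derive damage from swing parameters instead of playing a canned
    animation: kinetic energy concentrated on a small contact area
    (a blade edge) does more damage than the same energy spread out."""
    energy = 0.5 * mass_kg * speed_m_s ** 2     # kinetic energy (J)
    pressure = energy / contact_area_m2         # energy density (J/m^2)
    return pressure / toughness                 # arbitrary damage units

# The same swing with different improvised weapons:
blade  = impact_damage(1.0, 5.0, 0.0005)  # narrow sword edge
helmet = impact_damage(2.0, 5.0, 0.02)    # broad, dropped helmet
```

Because the result falls out of the parameters rather than a lookup into a fixed animation set, anything the player picks up – a helmet, a severed arm – works as a weapon automatically.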
This leaves an opening for creativity on your part. You can use items not actually intended by the game creators as weapons in the fight. Let us say, for example, that an opponent drops his helmet. You might then pick it up and hit him on the head with it. If the physics modelling is any good, that should work, and in Gorn it certainly does (but given the game’s ludicrousness, ripping off the opponent’s arm and hitting him with that also works).
Interestingly, physics modelling like this exposes some surprising limitations of current-generation VR. One of the key features of Gorn is that weapons are elastic. If you pick up a spear and press it on to the side of an opponent, it will bend as if it were made of rubber. The underlying issue is that the spear is not big or heavy enough to move the opponent, and if it simply stopped at the opponent’s side while your hands continued, the physics modelling would break down. It would also be meaningless to have the weapon pass through the opponent without doing harm. So in Gorn, the weapons bend! Ludicrous…but mathematically sound.
4. Free motion physics
Another game with full physics driven melees that has taken the VR world by storm is Blade and Sorcery. However, this is a game that tries to be realistic and avoids rubbery weapons. Instead, it blocks any motion that is physically impossible. If you hit a wall with a hammer it will stop, and if you continue to push, your wrist will digitally bend even though your physical wrist remains straight.
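The approach of blocking impossible motion can be sketched as a simple constraint. For illustration only (none of this is from the actual game), assume a flat wall and a single axis of movement: the rendered hand is clamped to the surface while the tracked hand keeps going, and the divergence between the two is exactly what your physical wrist never feels.

```python
# Illustrative sketch of clamping impossible motion at a surface.
# One axis, one wall plane; names are assumptions for the example.

def clamp_to_wall(tracked_x, wall_x):
    """The tracked (physical) hand can pass the wall plane at x = wall_x,
    but the rendered (virtual) hand stops at the surface. Returns the
    rendered position and how far the two hands have diverged."""
    virtual_x = min(tracked_x, wall_x)   # virtual hand never passes the wall
    divergence = tracked_x - virtual_x   # mismatch the player cannot feel
    return virtual_x, divergence
```

A game can use the divergence to drive the ‘digitally bent wrist’ effect: the bigger the mismatch, the more the virtual hand has to contort to stay plausible.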
The game also includes free movement which in combination with physics modelling adds a whole new level of immersion. You can use the physics in new creative ways, for example, to utilize an axe as a hook to pull yourself up on a ledge. Again, this is not something that needs to have been scripted in advance by the programmers – or even imagined possible by them – in order to work, as long as the physics modelling is good enough. And in Blade and Sorcery, it certainly is. The game even includes the modelling of weight; weight of weapons, of enemies and to a certain degree, of your own body.
When I first tried Blade and Sorcery, I was totally amazed, and the freedom felt fantastic. But after a while, the VR limitations became painfully obvious. If you run into a wall, the digital world around you stops. So even if you continue moving physically, or if you spam the ‘walk’ stick on your hand controller, nothing happens. Likewise, if you take a spear and press it into that same wall, the spear doesn’t move, even though you can digitally move forward until your digital face hits the wall. In both cases, physics modelling is simply halted, and immersion is broken.
5. VR push-back
The game where all the previous advances come together is Boneworks, a full-featured, multi-hour adventure built on physics modelling. It also adds a unique feature: the modelling of push-back. By this I mean that you are pushed backwards in the digital space if you push too hard forward on something that won’t budge. In Boneworks, push-back physics are applied not only to movement, but also to the weight of objects, creating a new level of realism and immersion.
When Boneworks was released late in 2019, it spawned frenetic activity in the VR community online, with people exchanging tips about how to find shortcuts by climbing, how to use heavy items to block doors that were supposed to close once you passed through them, etc. In short, everyone tried to play the game in ways that had not been foreseen by the creators – and so did I. But although I pride myself on my high resistance to VR nausea, like others I also noticed that the game made me sick.
The issue, I believe, is a consequence of the push-back modelling. In order to keep the physics intact, there are multiple types of micro push-back all the time; when I lift an item that is too heavy to lift at the speed I am lifting it, when I am climbing and try to haul my body upwards in an unrealistic way, when I interact with enemies in ways that affect the physics, etc. The sum of all that constant push-back seems to create motion sickness.
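As a toy illustration of the push-back idea (an assumption for the example, not Boneworks’ actual implementation): if the hand is rigidly attached to the body and would penetrate an immovable obstacle, the correction is applied to the body rather than to the hand.

```python
# Illustrative push-back correction on one axis. All names are
# assumptions for the example, not from any real engine.

def resolve_pushback(player_x, hand_offset, obstacle_x):
    """If the hand (body position + rigid offset) would penetrate an
    immovable obstacle, move the *body* backwards by the penetration
    depth instead of letting the hand stop or pass through."""
    hand_x = player_x + hand_offset
    penetration = hand_x - obstacle_x
    if penetration > 0:
        player_x -= penetration  # the player, not the obstacle, yields
    return player_x
```

Applied every frame across lifting, climbing and combat, many small corrections like this move the camera in ways the inner ear never registers, which fits the motion sickness described above.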
Nevertheless, I still love the game, and after a few minutes of breathing in order to overcome my nausea, I always dive back in again.
Boneworks has been hailed by many as the first true next-generation VR game. However, as far as I am concerned, it is rather an indication of how far away we are from the next generation of truly immersive technology. We will need devices that can handle muscle impulses and neural signals in order to solve these challenges. Higher resolution displays, lighter VR headsets and haptic gloves just won’t cut it.
6. Mobile reality
So what if this level of realism comes to AR, and what if it goes mobile?
It is relatively easy to see how playing table tennis would translate into AR. While your friends are physically present, you digitally conjure up a table, rackets and balls and play in thin air, as it were, although the experience is just like the real thing. Seeing how shooters translate into AR is also easy: we already have The Walking Dead for mobile phones, and over time such experiences will evolve to become much more realistic.
I would love to see AR physics move to the point where my friends aren’t physically there, but I could shake hands with them and that handshake would feel real. Even more intriguing is how games like Minecraft Earth could eventually merge with physical reality. The modelling of object weight already happening in VR would translate nicely into a city-building AR game. For that to work, though, we really need a totally different kind of AR than we have today: one that incorporates all our senses. As it turns out, a majority of current AR/VR users believe we will have such technology by 2030. Collectively, we call those technologies the internet of senses, and that is also the main theme of our 2030 consumer trend report. Of those who want an internet of senses, as many as 40 percent see immersive entertainment as a main driver for this change, so thinking about games might indeed be the best way to understand whether we are heading toward a reality explosion or not.
The reality explosion
Imagine a city center where some buildings are digital and others are physical. Imagine also that you neither know nor care which is which, since you can’t tell the difference anyway.
I find it hard to believe that cities will look like that in just ten years – but I have been wrong before! What if we suddenly see AR/VR and 5G connect with advances in human brain interfaces, electromyography, light field rendering and digital twins to create a runaway combinatorial effect? A lot could happen in a decade.
Obviously, AI would be a big part of a reality explosion as well. Already today, AI is used extensively for graphics processing, and the results look very realistic indeed. Deep fakes are a telling example of this, and they are becoming possible to construct in near real time. AI might even be better at constructing digital realities than at reconstructing – or surpassing – the human mind. In fact, AI might cause a reality explosion long before it is powerful enough to cause an intelligence explosion!
If you are comfortable with the idea that the big bang was the initial singularity, the one that created reality itself, then maybe it would only be fitting that the tech singularity dismantles that reality. In that scenario, the reality-virtuality continuum I mentioned initially would fold back onto itself, and all realities would exist at any single point in space.
Read our 10 Hot Consumer Trends 2030 report to find out how the internet of senses is shaping consumer expectations.
Apr 08, 2020 | Blog post
5G, Edge computing, Networks
At the Ericsson Blog, we provide insight to make complex ideas on technology, innovation and business simple.