*By Pablo Sáez

In November of last year, less than a month after announcing that his company would change its name from Facebook to Meta, Mark Zuckerberg presented on social media the advances of the research laboratory he baptized “Reality Labs”.

The video showed, on one side, a gloved hand over an empty table and, on the other, an image of a digital hand on a table with several objects, also digital. While the gloved hand moved over the empty surface, the digital hand reproduced its movements, picking up the virtual objects.

The question that remains is: why is a glove so important to the metaverse, and why so much fuss over something we have all seen in VR videos, and even experienced ourselves, years ago? The key lies in the sense of touch and in bidirectionality.

The Senses and the Metaverse

We can understand the metaverse as a 3D internet: a new digital world that we will be able to enter, interact with and feel the sensations of, as well as “bring” virtual entities into our physical world, showing and feeling them in the real world. These are two-way interactions, providing physical experiences in the virtual world and virtual experiences in the physical world, simulating a single integrated world.

The main characteristic of the metaverse will be the capacity for natural and continuous interaction between entities from different worlds. For this, we must “trick” our brain into feeling digital experiences as real, which means being able to “hack” our senses. We believe there are two ways to do this:

Directly in the brain: several lines of research aim to create brain implants that reproduce in our physical body the signals our five senses would receive inside the metaverse. In a more distant future, directly stimulating the brain will be the best mechanism for simulating the senses. However, for this to happen, we first need to decipher in depth how our brain works.

At the sense organs: to date, the simplest and most direct way is to trick the sensory apparatus at the source, not the brain. If we can present a digital simulation directly to the sense organ, we can make the brain feel it as real. Some examples: a video in front of the eyes, a sound in the ear, a resistance in the hand, a wind on the face or, who knows, a taste on the tongue. The better the simulation, and the more senses we can deceive at the same time, the more real the digital experience we create for the user.

The success of the metaverse will depend on numerous factors. One of the most obvious is that interactions must deliver an excellent user experience. Interactions between the two worlds will be constant; therefore, a bad experience would doom any attempt at integration. The more real digital interactions feel, the greater the engagement with this new world.

Sight (and hearing) in the metaverse

There is still a great deal of research and technological progress ahead before the metaverse reaches its full capabilities, but we already know that the environment, the ecosystem, the place where these interactions will take place is so-called extended reality (XR). This is where we need to succeed in tricking the brain.

To better understand extended reality, we have to talk about its three types, which vary according to the level of interaction:

Virtual reality (VR): the best known of all. In it, we wear helmets, glasses or headsets that transport us into the virtual world. They are an advance in deceiving eyes and ears, but we are still a long way from simulating the senses and completely deceiving the brain. Today this is due to limitations in peripheral vision, distraction from the real world, latency and synchronization, in addition to controllers that occupy the hands and prevent other tasks.

Augmented reality (AR): aims to bring digital objects and experiences into the physical environment (as in the film Minority Report) by creating visual holograms overlaid on reality. In AR, we are far from fooling the brain to the point of making it “believe” that a digital object in front of us is real, but there is research into lightweight glasses that superimpose digital imagery on our view. In the future, we may have “smart” contact lenses that allow this overlaid view, integrated with the real world.

Mixed reality (MR): blends the two previous types, anchoring digital objects in the physical environment so that they can interact with real ones; it is the category behind the MR/AR eyewear discussed later in this article.

However, even as we advance in the ability to deceive the eye and the ear, a total and natural immersive experience requires deceiving the brain with more senses; we need touch. And that is where the glove presented by Meta comes in, through which we can, in addition to seeing how we move objects, actually feel how we pick them up and release them.

The importance of touch (and sound) in the metaverse

A problem that goes unnoticed by users, but that is extremely important for a successful experience, is feedback signals. When we use a keyboard, we receive two feedback stimuli with each keystroke: sound and touch.

Now think of a virtual keyboard, or of a screwdriver driving a digital screw: we can draw them digitally and place them in front of our eyes. So far, so good. However, when tightening the screw we would receive none of these feedback signals. Without sound or touch, the brain will not register the action, and the experience will be frustrating.

There is a whole field of study, known as haptics, dedicated to tactile and force signals, so that the digital experience is as close as possible to the real thing. Returning to the example of the real screwdriver that tightens a digital screw: in addition to sounds and touch, the experience needs to realistically simulate the resistance of the screw being tightened or loosened.
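To make the idea of simulated resistance concrete, below is a minimal sketch of the spring-damper (penalty) model commonly used in haptic rendering. The stiffness and damping values are illustrative assumptions, not parameters of any real device.

```python
# Minimal sketch of penalty-based haptic rendering: the deeper the user's
# fingertip presses into a virtual screw head, the stronger the resisting
# force. Stiffness and damping values are illustrative assumptions.

K_STIFFNESS = 500.0  # N/m, how "hard" the virtual surface feels
B_DAMPING = 2.0      # N*s/m, damping that keeps the contact stable

def resistance_force(penetration_m: float, velocity_m_s: float) -> float:
    """Force (in newtons) to send to the haptic actuator."""
    if penetration_m <= 0.0:
        return 0.0  # fingertip is not touching the virtual surface
    # Hooke's law plus damping: F = k*x + b*v
    return K_STIFFNESS * penetration_m + B_DAMPING * velocity_m_s

# Example: fingertip pressed 2 mm into the screw head, moving at 1 cm/s
print(resistance_force(0.002, 0.01))  # -> 1.02 N
```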

Hence the relevance of the glove in Zuckerberg's video. It is not just a glove that serves as a user interface in the metaverse, moving our hand in the digital world. It is a two-way glove that, in addition to capturing our movements, reproduces the feedback from the digital world back onto our real hand, taking the experience to another level.
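As a sketch of what such a two-way loop could look like, the code below separates the capture path (hand pose going out) from the feedback path (forces coming in). Every API name here (read_hand_pose, update_hand, apply_force) is a hypothetical placeholder; Meta has not published a public SDK for this glove.

```python
# Illustrative two-way glove loop. All device and simulation methods are
# hypothetical placeholders standing in for whatever SDK a real glove exposes.
import time

def glove_loop(glove, simulation, rate_hz: int = 1000) -> None:
    """Run capture and feedback in a single high-frequency loop."""
    period = 1.0 / rate_hz
    while True:
        # 1. Capture: read finger joint angles and wrist pose (hypothetical call)
        pose = glove.read_hand_pose()
        # 2. Send the pose into the virtual world and step the physics;
        #    the simulation returns contact forces per finger
        contacts = simulation.update_hand(pose)
        # 3. Feedback: drive the glove's actuators with those forces,
        #    so the user feels what the digital hand is touching
        for finger, force in contacts.items():
            glove.apply_force(finger, force)
        time.sleep(period)  # simplistic pacing; a real loop would compensate for compute time
```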

However, let us think of the glove as only a first step. For the experience to be fully immersive and natural, the entire body needs to feel that it is somewhere else and to receive these haptic signals. Let us think not of a glove for the hand, but of a full bodysuit that transmits these signals to each part of the body. By combining the suit with glasses and headphones, we would be able to trick the body into feeling that it is flying, as in the movie Ready Player One.

Haptic feedback is useful, but the latency (the JND, Just Noticeable Difference) between our action and our perception of the reaction in the virtual world is critical, with around one millisecond considered the maximum time for the result of our action to feel instantaneous. The challenge here is multiple, but it is directly conditioned by network latency and processing time.
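As a back-of-the-envelope illustration, the budget below splits a 1 ms end-to-end target across the stages mentioned above. The per-stage figures are assumptions for the sake of the example, not measured values.

```python
# Illustrative latency budget for one action -> sensation round trip,
# against the ~1 ms perception threshold cited above. Figures are assumptions.
BUDGET_MS = 1.0

stages_ms = {
    "glove sensor sampling": 0.1,
    "network round trip": 0.4,
    "physics + haptics update": 0.3,
    "actuator response": 0.2,
}

total_ms = sum(stages_ms.values())
print(f"total {total_ms:.1f} ms of {BUDGET_MS:.1f} ms budget")
for stage, ms in stages_ms.items():
    print(f"  {stage}: {ms:.1f} ms ({ms / BUDGET_MS:.0%} of budget)")
```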

User interaction and experience

We already understand the importance of the senses and the need to simulate them with maximum realism in the metaverse. Technology needs to evolve in parallel with user experience techniques. In this respect, one of the factors that will most condition interactions is the control mechanisms, which, by definition, must be portable.

Currently, we interact on the internet using the keyboard, mouse or control pad, which we use, for example, to move our avatar through the digital world. For the experience of moving around and interacting with digital objects to be immersive, we need much more than that.

In a first phase of evolution, in addition to glasses and headphones, we would use hand movements for interactions, which would totally transform the experience, taking it to another level. However, this type of control is based on computer vision, a technique that is not yet perfect and requires high processing power, so the experience would not be 100% immersive.
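For reference, camera-based hand tracking of the kind described above is already available in open-source form. The sketch below uses Google's MediaPipe Hands library, one example of this technique, to extract 21 hand landmarks per frame from a webcam; the library choice and webcam index are assumptions about the reader's setup.

```python
# Sketch of camera-based hand tracking with MediaPipe Hands: each frame
# yields 21 3D landmarks per detected hand, which could drive an avatar.
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(max_num_hands=1)
cap = cv2.VideoCapture(0)  # default webcam; index is an assumption

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV captures BGR
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        # Landmark 0 is the wrist; coordinates are normalized to the image
        wrist = results.multi_hand_landmarks[0].landmark[0]
        print(f"wrist at ({wrist.x:.2f}, {wrist.y:.2f})")

cap.release()
```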

The ultimate goal, or consumer dream, would be a full bodysuit (as in Ready Player One) that captures the movements of the entire body and reproduces them on the avatar, as well as digital fabrics and clothing that would enable a whole new series of completely immersive interactions.

Another field of study is MR/AR eyewear. Advances in this area are oriented towards enabling immersive digital interactions without impeding our view of reality, as well as widening the field of view to imitate human peripheral vision, which with current devices is very limited and a critical factor in adopting them for continuous use.

Thinking of a more distant future, it seems obvious that the ideal would be smart digital contact lenses, which would avoid all the impediments to visibility of reality and peripheral vision mentioned here. Importantly, we still need a great deal of progress to make the metaverse a reality.

*Pablo Sáez is a leading partner in Digital Technology at NTT DATA 

Notice: The opinion presented in this article is the responsibility of its author and not of ABES - Brazilian Association of Software Companies
