Meta is making a real-life glove that can hold virtual items

Meta, formerly Facebook, is developing a glove that allows users to touch objects in virtual reality.

Haptic gloves aim to recreate the sensations of texture, pressure and vibration when interacting with a digital environment, though the company says the work is still in the “early stages of research.”

Meta’s ultimate aim is to pair the gloves with a VR headset to simulate experiences such as playing at a concert or a poker game, and eventually to have them work with augmented reality glasses.

Facebook has been working on its own smart eyewear, Project Aria, for some time, but its 2021 launch date appears to have slipped.

“The value at hand is enormous when it comes to solving the problem of interaction in AR and VR,” said Sean Keller, research director at Meta Reality Labs.

“We use our hands to communicate with others, learn about the world, and take action within it. If we can bring full presence to AR and VR, then we can take advantage of a lifetime of motor learning.

“People can touch, feel, and manipulate virtual objects just like real objects—all without having to learn a new way to interact with the world.”

To do this, Meta would need to combine auditory, visual and haptic information to ‘trick’ the brain into accepting the virtual world as real.

Meta says the gloves will eventually need to be “stylish, comfortable, affordable, durable and fully customizable”, a difficult set of requirements to meet in practice.

The gloves are made with hundreds of tiny actuators, or small motors, that operate in unison, but these are currently too large, expensive and hot to be practical. Meta could theoretically replace them with soft actuators that change shape as the wearer moves, but such components do not yet exist.

The company is currently researching and developing the technology, with an emphasis on weight and speed, as well as creating the necessary software that accurately simulates real-world physics.

“If I pick up a cube, I already have an idea of what kind of material it is and how heavy it might be,” says Meta research scientist Jess Hartcher-O’Brien.

“I grasp it, I verify the material, so I’m combining visual cues about its physical properties with the haptic feedback that comes from that first moment of contact. When I go to manipulate the object, my brain registers frictional forces and inertia and can work out how dense or heavy the object is. My visual system updates based on how it perceives my arm’s movements. Proprioception tells me where my hand is in space, how fast it’s moving, and what my muscles are doing.”

This is also where technology like hand-tracking, something already built into Oculus headsets, comes in to deliver that information accurately. In the future, Meta could introduce ‘haptic clicks’ for virtual buttons, or ‘haptic emoji’ such as handshakes, so users in the metaverse can greet people they know.

Meta CEO Mark Zuckerberg has consistently touted the metaverse as the future of Facebook, especially in light of the many scandals that have plagued the app and its sister platforms such as Instagram.

Mr Zuckerberg has said that an “embodied internet” will be focused on “engag[ing] more naturally” with behaviors we already exhibit, such as reaching for our smartphones shortly after waking up.

Whether or not the company can effectively manage the metaverse, or at least its part of it, remains to be seen. In a leaked memo, Meta’s CTO Andrew Bosworth stated that the company’s products should have “almost Disney levels of safety”, but that virtual reality can often be a “toxic environment”, which risks pushing “mainstream customers” away from the medium entirely.

However, he also noted that it is “practically impossible” to control the speech and behavior of users at any meaningful scale.

Credit: www.independent.co.uk
