Meta Reality Labs: Bringing touch to the metaverse

Technology Magazine explores how Meta/Facebook Reality Labs Research teams are inventing the future of interaction in augmented and virtual reality

One of Meta/Facebook's Reality Labs Research teams is working toward a big goal: creating the technology to solve one of the central challenges of the metaverse: how do we touch the virtual world?


Who are Meta/Facebook Reality Labs?


The mantra here is that the future is 'more human, and less artificial'. That might sound surprising coming from a technology company, but the team is all about exploring human potential, believing that technology can break down the barriers that divide us. They imagine a world where neither physical distance nor the devices we use limit our ability to connect and collaborate with one another, ensuring friends and colleagues are always within reach.

Reality Labs brings together the brightest cross-disciplinary minds in one place to deliver its mission: build tools that help people feel connected, anytime, anywhere. Developers, researchers, engineers and designers all work together to help build a more expansive – and more inclusive – future for all.


New haptic glove research


The Reality Labs team has unveiled a new type of haptic glove, one that opens up new domains of scientific research. Seven years in the making, the project has pushed human-computer interaction forward across dozens of disciplines, producing new breakthroughs to make haptic gloves a reality. Here are a few examples:

  • Perceptual science: Because current technology can’t fully recreate the physics of the real world in VR, the team is exploring how auditory, visual and haptic feedback can be combined to convince a wearer’s perceptual system that, for example, it is feeling an object’s weight.
  • Soft robotics: Existing mechanical actuators create too much heat for such a glove to be worn comfortably all day. To solve this, they created new soft actuators — tiny, soft motors all over the glove that move in concert to deliver sensation to the wearer’s hand.
  • Microfluidics: The team is developing the world’s first high-speed microfluidic processor — a small microfluidic chip that controls the air flow that moves the actuators. The use of air (a fluid) means they can fit many more actuators on the glove than would otherwise be possible with electronic circuitry.
  • Hand tracking: Even with a way to control air flow, the system needs to know when and where to deliver the right sensations. Reality Labs are building advanced hand-tracking technology to enable it to identify precisely where your hand is in a virtual scene, whether you’re in contact with a virtual object and how your hand is interacting with the object.
  • Haptic rendering: The haptic renderer sends precise instructions to the actuators on the hand, based on an understanding of the hand’s location and the properties of the virtual objects (such as texture, weight and stiffness) that the hand comes into contact with.
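To make the pipeline above concrete, here is a minimal, purely illustrative sketch of the haptic-rendering step: taking a tracked contact between a fingertip and a virtual object and turning it into pressure commands for the glove's actuators. All names and formulas here are hypothetical, invented for illustration — they are not Meta's actual system or API.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical object model: a real system would track texture, weight, etc.
@dataclass
class VirtualObject:
    stiffness: float  # 0.0 (very soft) .. 1.0 (rigid)

def render_haptics(contact_depth_mm: float, obj: VirtualObject,
                   n_actuators: int = 4) -> List[float]:
    """Map one contact event to per-actuator pressure commands in [0, 1].

    Deeper penetration into a stiffer virtual object produces higher
    pressure, clamped to the actuators' safe range. A real renderer
    would run per fingertip, many times per second.
    """
    if contact_depth_mm <= 0:            # fingertip not touching the object
        return [0.0] * n_actuators
    # Simple proportional model: pressure grows with depth and stiffness
    pressure = min(1.0, obj.stiffness * contact_depth_mm / 10.0)
    return [pressure] * n_actuators

# Example: pressing 5 mm into a fairly rigid object
commands = render_haptics(5.0, VirtualObject(stiffness=0.8))
```

In practice the per-actuator commands would then go to the microfluidic controller, which translates each target pressure into valve timings for the air-driven soft actuators.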


Imagining the scene


To get a sense of the real-world applications, picture this futuristic scene from the Facebook tech blog:

'You head to a table, but instead of pulling out a laptop, you pull out a pair of soft, lightweight haptic gloves. When you put them on, a virtual screen and keyboard show up in front of you and you begin to edit a document. Typing is just as intuitive as typing on a physical keyboard and you’re on a roll, but the noise from the cafe makes it hard to concentrate.

Recognising what you’re doing and detecting that the environment is noisy, the Assistant uses special in-ear monitors (IEMs) and active noise cancellation to soften the background noise. Now it’s easy to focus. A server passing by your table asks if you want a refill. The glasses know to let their voice through, even though the ambient noise is still muted, and proactively enhance their voice using beamforming. The two of you have a normal conversation while they refill your coffee despite the noisy environment — and all of this happens automatically.

A friend calls, and your Assistant automatically sends it to voicemail so as not to interrupt your current conversation. And when it’s time to leave to pick up the kids based on your calendared event, you get a gentle visual reminder so you won’t be late due to the current traffic conditions.'


Wider applications


Other applications include working on a virtual 3D puzzle. Playing with a friend’s realistic 3D avatar, you can pick up a virtual puzzle piece from the table and feel it in your grasp: the sharpness of the cardboard’s edges, the smoothness of its surface, followed by a satisfying snap as you fit it into place.

These comfortable and customisable gloves reproduce a range of sensations in virtual worlds, including texture, pressure and vibration. The long-term aim is to pair the gloves with a VR headset and AR glasses for a totally immersive experience.

Learn more about the haptic glove project.

