Woman and man wearing VR headset using hand tracking to interact with a holographic jet engine

How Natural Interaction will Power the Metaverse

Posted: November 17, 2021

For physical and digital worlds to converge, the way we interact with them has to converge too. Today, we announced a £60 million ($82 million) Series D investment – a massive vote of confidence in our vision of the future of interaction. We’ll be using this funding to help our customers and community revolutionize how we connect, create, and explore our world.

By 2050, if you walk into a room full of people, you will be able to assume everyone has some form of augmented vision. This will enable them to see three-dimensional digital content in the physical world, or fade their surroundings out at will and enter a virtual world.

You can compare this to walking into a room in 2021 and assuming everyone has a smartphone.

The boundaries between digital and physical will be porous. Digital content will not be 2D pop-ups and notifications fixed on a plane in front of your eyes. Rather, it will be full 3D objects, creatures, and even people existing and moving seamlessly around physical space.

You will not be able to tell what is real and what is digital with your eyes alone.

This is what I think of when someone says “metaverse”. Yes, the word is overused and overhyped, but there’s a reason for that. It’s useful. It’s a concept that brings together many developments already happening into a cohesive vision.

The metaverse isn’t a VR headset, though they will certainly play a big role. It’s a new computing platform, in which digital content is predominantly 3D and blended into the physical world.

Or, as Tony Parisi puts it, “a global network of spatially organized, predominantly 3D content… available to all without restriction, for use in all human endeavours – a new and profoundly transformational medium, enabled by major innovations in hardware, human-computer interface, network infrastructure, creator tools and digital economies.”

See that “innovations in… human-computer interface” piece of the puzzle? That’s us. That’s Ultraleap. And that’s why we’ve just raised a £60 million ($82 million) Series D round.

Lufthansa VR flight training with Ultraleap hand tracking technology
Ultraleap hand tracking in action today in Lufthansa/NMY’s VR flight attendant training.

We’re excited to welcome some significant new investors including Tencent, British Patient Capital through their Future Fund: Breakthrough programme, and CMB International. We’d also like to thank Mayfair Equity Partners and IP Group plc for their continued support.

For physical and digital to converge, we believe the way we interact with them has to converge too. This investment is a massive vote of confidence in that vision of the future of interaction.

The human-computer interface of 2050

The metaverse of 2050 is a place where anyone, anywhere, can interact with digital content as fluently as they interact with physical objects. Where your workspace is virtual, portable, and infinitely customizable. Where you can train in open-heart surgery in the back of an autonomous taxi, test drive a new car without leaving your living room, or play with a LEGO set while standing outside the shop window.

This content won’t be accessed through the small window of a smartphone or laptop screen, using a touchscreen or mouse.

LEGO interactive billboard
Using Ultraleap’s technology to virtually reach into a billboard and move, rotate, and feel LEGO bricks to build models on screen. (Ocean Labs at Westfield Stratford City mall, London, UK, 2020.)

It’s hard to imagine gaming controllers being the primary interface. If you can’t tell what is real and what is not with your eyes, how would you know when to use them?

The metaverse will depend on more natural forms of interaction.

From birth, every human interacts with the world through their hands and voice. A baby instinctively reaches out to grab a toy block.

As we grow older, we supplement these with specialized tools for specific projects (a pencil for writing, a drill for DIY…). Hands and voice, though, remain our primary interfaces – our launcher, as it were, for everything else.


The same will be true of the metaverse. The primary interfaces will, inevitably, be hands and voice, supplemented by specialized controllers for specialized applications.

Voice recognition is a well-understood problem, with many solutions being refined by many companies. Hand-based interaction is harder, and expertise in it is much rarer.

Using your hands to manipulate digital objects as if they were physical requires two things: accurate, reliable recognition of what your hands are doing, and the ability to feel what you touch. In other words – hand tracking and haptics.
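To make the hand tracking half of that concrete, here is a minimal sketch of the kind of logic that sits on top of “recognition of what your hands are doing”: a pinch detector running over tracked fingertip positions. Everything here – the HandFrame type, its field names, and the 25 mm threshold – is an illustrative assumption for the sketch, not Ultraleap’s actual API.

from dataclasses import dataclass
import math

@dataclass
class HandFrame:
    # Hypothetical per-frame tracking data: fingertip positions in
    # millimetres, as (x, y, z) tuples. Not Ultraleap's actual API.
    thumb_tip: tuple
    index_tip: tuple

PINCH_THRESHOLD_MM = 25.0  # assumed threshold: closer than this counts as a pinch

def is_pinching(frame: HandFrame) -> bool:
    """True when the thumb and index fingertips are close enough to count
    as a pinch - the grab gesture used to pick up a virtual object."""
    return math.dist(frame.thumb_tip, frame.index_tip) < PINCH_THRESHOLD_MM

# Usage: feed in one tracked frame per update and react to the gesture.
frame = HandFrame(thumb_tip=(0.0, 120.0, 40.0), index_tip=(10.0, 128.0, 44.0))
if is_pinching(frame):
    print("Pinch detected: grab the virtual object under the fingertips")

In a real system, frames like this would arrive from the tracking SDK many times per second, and the resulting event would drive both the rendering and – on a haptics-equipped device – mid-air feedback at the point of contact.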

What’s next for Ultraleap?

Physical and digital interaction won’t suddenly merge overnight sometime in 2050. It will be a steady but inexorable convergence.

This will be driven not by the dreams of tech companies but by the tangible benefits real people find at any given moment in time. The seedlings of the metaverse are already growing in all sorts of places – and not just in VR.

Being able to interact more fluently with digital infotainment systems makes a measurable difference today to driver safety (and driver experience). Elsewhere, the pandemic dramatically brought into focus the limitations of touchscreen interaction, catalyzing demand for alternatives.

Gesture control by Ultraleap inside the Groupe PSA DS Automobiles luxury concept car
Ultraleap’s mid-air haptics and hand tracking are integrated into the cabin of the DS Aero Sport Lounge luxury concept car (Stellantis). The design team wanted to improve driver experience by reducing the use of touchscreens – making gesture control the natural choice.

This year, Ultraleap has already released Gemini – our fifth-generation hand tracking platform and the most robust, flexible hand tracking software ever. Our hand tracking and haptic technology are used by over 350,000 developers, and some of the world’s most innovative companies – including Qualcomm, Varjo, Stellantis, Autodesk VRED, PepsiCo and Acer.

But we know there’s more to do, and we have so much planned to make building with hand tracking and haptics even easier and better.

Thank you to our customers and community

It’s been a privilege to work with and learn from our amazing customers, community, and investors. We couldn’t have got where we are today without your belief, your support, and your feedback.

It’s natural interaction that will transform digital constructs into living, breathing, accessible worlds that people want to live in.

We can’t wait to build them with you.

Tom Carter started exploring ultrasound technology during the final year of his Master’s degree in Computer Science at the University of Bristol, UK. Recognizing the technology’s commercial viability, he founded Ultraleap in November 2013 and launched the world’s first and only mid-air haptics solution. Since then, Tom has gone on to launch commercial hand tracking and haptics products and lead technology innovation at the company, first as CTO and now as CEO.
