
Spatial Computing for Enterprise: Low Friction Is Key

Ultraleap’s hand tracking comes integrated with Varjo’s latest human-eye resolution headsets for professional XR – the XR-3 and VR-3. In this blog, we look at why reducing the learning curve is key to unlocking professional use-cases – and how hand tracking can help.

Imagine you’re in a brainstorming session in a virtual meeting space. You want to write on a post-it. Sounds simple? But before you can start writing, you need to press three controller buttons to spawn the post-it, then go into a menu system, then select pen mode.

Now imagine this situation under the time pressure of a one-hour meeting. Then add some people who’ve never used a VR collaboration tool before.

They may not even have had much experience with handheld controllers. Not everyone is a gamer. There’s no guarantee a customer or time-poor C-suite executive will find button pushes intuitive.

It’s a simple example, but it illustrates how little tolerance enterprise users are likely to have for a learning curve in many spatial computing use-cases.


Now think how different the situation would be if your hands were accurately represented in real time and able to directly manipulate virtual objects. Instead of spawning a post-it by pressing a button, you would simply pick one up from a stack, pick up a pen, and start writing or sketching.

That sort of approachable, low-friction interaction is one of the key benefits hand tracking brings to spatial computing for enterprise.

Spatial computing needs the right tool for the right job

The evolution of input methods in 2D computing gives us some clues about how interaction will develop in enterprise VR and AR.

Starting with simple taps, touch gestures have become ubiquitous in 2D computing. Today, swipes and pinch-zoom are instinctive for most people, including those, such as the very young, who struggle with other input methods like keyboard and mouse. Touch gestures reduce the learning curve and have enabled more efficient workflows.


Not every interaction is best done using gestures on a touchscreen. (I’m using a physical keyboard to type this blog, for example.) We operate in a hybrid world where we switch fluidly between touchscreen, keyboard/mouse, game controller, and more. We use the right tool for the right job.    

It’s the same in spatial computing for enterprise. Handheld controllers will likely always have a place in expert applications such as hands-on product design. Here, users expect to go through a learning curve.

In other applications, though, ease of use is at a premium. For use-cases such as remote collaboration, menu launchers, training, medical rehabilitation, or design reviews, input methods that lower the barriers to entry are what add value and improve ROI.

Why hand tracking lowers the barriers to entry for spatial computing

User studies have shown that using both your hands freely in VR meets people’s expectations of daily, natural interaction.

In an industrial design review, for example, a customer or executive with no prior experience of the XR tool can interact, provide comments, or even reach out and move components before the design has been turned into a physical model or prototype. R3DT’s VR tool for designing assembly lines, powered by Ultraleap’s hand tracking, is a great example of this.

“The most striking point from our customers’ perspective is the simplicity of the software tool. That is driven by hand interactions from Ultraleap. Within five minutes, anyone can start working in VR. You not only experience and see the virtual environment, you reach out and work within the production line.”
– Andreas Ruedenauer, Managing Director and Co-Founder, R3DT

To decide whether a gear stick in a car is placed at the correct location, for example, you can simply reach out and check the CAD model directly. Some applications even include features for ergonomic assessment of products. Architects and their clients can immerse themselves in the building they are designing before committing to final decisions.

In VR training, lowering the barriers to entry means higher throughput, because the end user does not have to be a seasoned gamer to start engaging with the learning material.

One caveat: the quality of the hand tracking is hugely important. Perceptible latency, hands that are slow to appear, hands that disappear, or hands that don’t behave like real-world ones are extremely frustrating, to the extent that poor hand tracking can deliver a worse user experience than controllers.

Hand tracking in spatial computing takes direct manipulation to the next level

When you touch and drag an app from one place on your smartphone to another, that’s an example of direct manipulation. The concept is fundamental to graphical user interfaces (GUIs). It means that you can see and act directly on elements of an interface using physical movements.

The term was coined by Ben Shneiderman in the early 1980s in contrast to command-line interfaces such as MS-DOS, where users have to remember and type in a set of commands to operate the computer.

Users find direct manipulation more intuitive. It has generated design trends such as skeuomorphism, where digital objects resemble real-life ones (e.g. your trash folder), and multi-touch gestures (e.g. pinch-zoom on touchscreens).

Integrating hand tracking into spatial computing applications takes direct manipulation into a new phase. User experience designers can draw on rich, three-dimensional real-world affordances – such as the action of picking up a pen or post-it.

A recent study compared completing a task using hand tracking with using two different VR controllers. It found that hand tracking allowed users to focus their mental effort on the task rather than on how to operate the controllers.

Of course, not every VR interaction can be directly mapped onto a physical counterpart. Virtual objects also don’t behave exactly as real ones do. Design guides, software tools (such as Ultraleap’s Interaction Engine), and an evidence-based approach are needed to bridge the gap here.
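As a concrete illustration of the raw data such tools build on, here is a minimal sketch using LeapC, Ultraleap’s C tracking API. It polls for tracking frames and reports when a hand pinches. The 0.8 pinch threshold and the bare polling loop are illustrative assumptions for this sketch; in a real application, a layer like the Interaction Engine translates this per-frame data into robust grab, release, and UI behaviour.

    #include <stdio.h>
    #include <stdint.h>
    #include <LeapC.h>

    int main(void) {
        LEAP_CONNECTION connection;
        if (LeapCreateConnection(NULL, &connection) != eLeapRS_Success ||
            LeapOpenConnection(connection) != eLeapRS_Success) {
            fprintf(stderr, "Could not connect to the Ultraleap tracking service\n");
            return 1;
        }

        for (;;) {
            LEAP_CONNECTION_MESSAGE msg;
            /* Wait up to 1000 ms for the next event from the service. */
            if (LeapPollConnection(connection, 1000, &msg) != eLeapRS_Success)
                continue;
            if (msg.type != eLeapEventType_Tracking)
                continue;

            const LEAP_TRACKING_EVENT *frame = msg.tracking_event;
            for (uint32_t i = 0; i < frame->nHands; ++i) {
                const LEAP_HAND *hand = &frame->pHands[i];
                /* pinch_strength runs from 0 (open hand) to 1 (full pinch);
                   0.8 is an arbitrary threshold chosen for this sketch. */
                if (hand->pinch_strength > 0.8f) {
                    printf("%s hand pinching at (%.0f, %.0f, %.0f) mm\n",
                           hand->type == eLeapHandType_Left ? "Left" : "Right",
                           hand->palm.position.x, hand->palm.position.y,
                           hand->palm.position.z);
                }
            }
        }
    }

Even this small example shows why tracking quality matters so much: everything the user perceives as “direct” manipulation rests on the latency and stability of these per-frame hand reports.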

Explore low-friction spatial computing for enterprise

If you want to explore how high-quality hand tracking unlocks new enterprise applications, our hand tracking technology comes integrated with Varjo’s latest human-eye resolution headsets for professional XR – the XR-3 and VR-3.

Our world-leading hand tracking software is fast, accurate, and robust, with superior performance in key areas such as initialization, pose accuracy, and occlusion handling. The fifth-generation platform, known as Gemini, is also the first hand tracking platform from Ultraleap that does not depend on our hardware.

Varjo XR-3 and VR-3 are compatible with Unity™, Unreal Engine™, OpenXR 1.0, and hundreds of industrial 3D engines and applications, including Autodesk VRED™, Lockheed Martin Prepar3D™, VBS BlueIG™, and FlightSafety VITAL™. They also come with a licensing model under which users are fully licensed for commercial use of Ultraleap hand tracking.
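For developers, the OpenXR support means tracked hands are exposed through the standard XR_EXT_hand_tracking extension rather than a proprietary interface. The sketch below shows the core calls for locating the joints of one hand; it assumes the XrInstance, XrSession, reference XrSpace, and frame timing have already been set up elsewhere, and that the extension was enabled at instance creation.

    #include <openxr/openxr.h>

    /* Sketch: locate the joints of the right hand for one frame. */
    void locate_right_hand(XrInstance instance, XrSession session,
                           XrSpace baseSpace, XrTime displayTime) {
        /* Extension functions must be loaded at runtime. */
        PFN_xrCreateHandTrackerEXT  xrCreateHandTrackerEXT  = NULL;
        PFN_xrLocateHandJointsEXT   xrLocateHandJointsEXT   = NULL;
        PFN_xrDestroyHandTrackerEXT xrDestroyHandTrackerEXT = NULL;
        xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                              (PFN_xrVoidFunction *)&xrCreateHandTrackerEXT);
        xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                              (PFN_xrVoidFunction *)&xrLocateHandJointsEXT);
        xrGetInstanceProcAddr(instance, "xrDestroyHandTrackerEXT",
                              (PFN_xrVoidFunction *)&xrDestroyHandTrackerEXT);

        XrHandTrackerCreateInfoEXT createInfo = {XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT};
        createInfo.hand = XR_HAND_RIGHT_EXT;
        createInfo.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;

        XrHandTrackerEXT handTracker;
        xrCreateHandTrackerEXT(session, &createInfo, &handTracker);

        XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
        XrHandJointLocationsEXT locations = {XR_TYPE_HAND_JOINT_LOCATIONS_EXT};
        locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
        locations.jointLocations = joints;

        XrHandJointsLocateInfoEXT locateInfo = {XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT};
        locateInfo.baseSpace = baseSpace;
        locateInfo.time = displayTime;

        if (xrLocateHandJointsEXT(handTracker, &locateInfo, &locations) == XR_SUCCESS
            && locations.isActive) {
            /* The index fingertip pose, expressed in baseSpace. */
            XrPosef indexTip = joints[XR_HAND_JOINT_INDEX_TIP_EXT].pose;
            (void)indexTip;
        }

        xrDestroyHandTrackerEXT(handTracker);
    }

Because this is a cross-vendor standard, the same application code runs against any OpenXR runtime that implements the extension.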