The challenges to implementing AR in enterprise

By John Kennedy, CEO, Adlens

Augmented reality (AR) promises to transform everything from design and manufacturing to surgery, but it will never realise its potential unless fundamental performance and usability issues are addressed. According to a recent report from specialist AR/VR industry analysts Greenlight Insights, solving two fundamental optical issues would unlock an additional $10bn in spending on AR enterprise applications by 2026.

Currently, all AR experiences suffer from two major problems: vergence-accommodation conflict (VAC) and focal rivalry. These optical terms may be relatively unknown, but anyone who has worn an AR headset for any length of time will recognise the effects: eye fatigue, an inability to read text up close, and difficulty completing precision tasks because real and virtual content are not well integrated. Greenlight's analysts estimate that 95% of current AR applications would see an immediate benefit if these issues were solved.

VAC breaks the natural way our eyes focus. Normally, as an object gets closer, our eyes turn inwards to triangulate on it, stimulating them to focus at the right distance. This doesn't happen in AR (or VR, for that matter) because the lenses in headsets are set at a fixed focal distance. Our eyes are simply not fooled by clever software manipulation of virtual images. Systems need to be able to change the focal distance to place virtual objects accurately and convincingly in real space.


The other major challenge, focal rivalry, occurs because our eyes cannot focus on real and virtual content at the same time unless they sit in the same focal plane. Until these issues are solved, virtual and real content will never be accurately and comfortably integrated, and AR will struggle to become an all-day wearable tool for enterprises.

Microsoft’s HoloLens is one of the most advanced AR experiences available today, but it is currently forced to advise all content developers to place virtual content beyond arm’s reach to avoid an uncomfortable experience. This limits our ability to engage and work with content meaningfully. There has to be a better way.

The solutions lie beyond improvements in processing power, software and screen resolution. Dynamic lens systems will make it possible to overcome these issues and place real and virtual objects together in an accurate and believable way.
