The challenges to implementing AR in enterprise

By John Kennedy, CEO, Adlens

Augmented reality (AR) promises to transform everything from design and manufacturing to surgery, but it will never realise its potential unless fundamental performance and usability issues are addressed. According to a recent report from specialist AR/VR industry analysts Greenlight Insights, solving two fundamental optical issues would unlock an additional $10bn in spending on enterprise AR applications by 2026.

Currently, all AR experiences suffer from two major problems: vergence-accommodation conflict (VAC) and focal rivalry. These optical terms may be relatively unknown, but anyone who has worn an AR headset for any length of time will recognise the effects: eye fatigue, an inability to read text up close, and difficulty completing precision tasks because real and virtual content is not well integrated. Greenlight’s analysts estimate that 95% of current AR applications would see an immediate benefit if these issues were solved.

VAC breaks the natural way our eyes focus. Normally, as an object gets closer, our eyes turn inwards to triangulate on it, which stimulates them to focus at the right distance. This doesn’t happen in AR (or VR, for that matter) because the lenses in headsets are set at a fixed focal distance. Our eyes are simply not fooled by clever software manipulation of virtual images. Systems need to be able to change the focal distance to place virtual objects accurately and convincingly in real space.
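
To make the mismatch concrete, here is a minimal Python sketch using assumed values (a 63 mm interpupillary distance and a 2 m fixed focal plane are typical figures, not numbers from this article) that compares the vergence a virtual object demands with the accommodation the fixed optics actually allow.

```python
import math

# Assumed, illustrative values -- not figures from the article.
IPD_M = 0.063          # interpupillary distance, metres
FIXED_FOCAL_M = 2.0    # fixed focal plane of the headset optics, metres

def vergence_angle_deg(distance_m: float) -> float:
    """Angle (degrees) the eyes rotate inwards to converge on a point."""
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

def accommodation_dioptres(distance_m: float) -> float:
    """Optical power (dioptres) needed to focus at this distance."""
    return 1.0 / distance_m

for virtual_distance in (0.5, 1.0, 2.0):  # where the content appears to sit
    vergence = vergence_angle_deg(virtual_distance)      # driven by the stereo imagery
    demanded = accommodation_dioptres(virtual_distance)  # what that vergence "expects"
    supplied = accommodation_dioptres(FIXED_FOCAL_M)     # what the fixed lens delivers
    print(f"virtual object at {virtual_distance:.1f} m: vergence {vergence:.1f} deg, "
          f"accommodation mismatch {demanded - supplied:+.1f} D")
```

The further the virtual object sits from the fixed focal plane, the larger the mismatch the visual system is asked to tolerate.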

The other major challenge, focal rivalry, occurs because our eyes can’t focus on real and virtual content together unless they are in the same focal plane. Unless these issues are solved, virtual and real content will never be accurately and comfortably integrated, and AR will struggle to become an all-day wearable tool for enterprises.
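
A minimal sketch of why this matters, again using assumed values: if the real object in a worker’s hands and the virtual overlay on a fixed 2 m display plane demand focal powers that differ by more than the eye’s modest depth of focus, the two cannot both be sharp at once.

```python
# Assumed, illustrative values -- not figures from the article.
FIXED_FOCAL_M = 2.0     # fixed focal plane of the headset optics, metres
DEPTH_OF_FOCUS_D = 0.3  # rough tolerance (dioptres) within which both can stay sharp

def focal_gap_dioptres(real_object_m: float) -> float:
    """Difference in focal demand between a real object and the virtual plane."""
    return abs(1.0 / real_object_m - 1.0 / FIXED_FOCAL_M)

for real_distance in (0.4, 1.0, 2.0):  # e.g. hands, workbench, far wall
    gap = focal_gap_dioptres(real_distance)
    print(f"real object at {real_distance:.1f} m: focal gap {gap:.2f} D, "
          f"both in focus: {gap <= DEPTH_OF_FOCUS_D}")
```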

Microsoft’s HoloLens is one of the most advanced AR experiences available today, but it is currently forced to advise all content developers to place virtual content beyond arm’s reach to avoid an uncomfortable experience. This limits our ability to engage and work with content meaningfully. There has to be a better way.

The solutions lie beyond improvements in processing power, software and screen resolutions. Dynamic lens systems will make it possible to overcome these issues and place real and virtual objects together in an accurate and believable way.
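
As a rough illustration of what such a dynamic (varifocal) system has to do, the sketch below converts the depth at which a virtual object should appear into the change in lens power needed to move the focal plane there; the fixed 2 m baseline and the calculation itself are assumptions for illustration, not a description of any particular product.

```python
# Hypothetical varifocal calculation -- assumed values, not a product specification.
FIXED_FOCAL_M = 2.0  # baseline focal plane of a conventional headset, metres

def required_power_change(target_depth_m: float) -> float:
    """Dioptre adjustment needed to refocus from the fixed plane to the target.

    Optical power in dioptres is the reciprocal of focal distance in metres,
    so refocusing from 2.0 m to 0.5 m calls for an extra +1.5 D of lens power.
    """
    return 1.0 / target_depth_m - 1.0 / FIXED_FOCAL_M

# Example: content at arm's length versus across the room.
for depth in (0.5, 1.0, 4.0):
    print(f"virtual object at {depth:.1f} m -> adjust lens power by "
          f"{required_power_change(depth):+.2f} D")
```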
