Analyzing Meta’s VR Optics Future

Virtual Reality (VR) headsets are edging toward becoming commonplace. However, social acceptability is still some way off and the user experience is not entirely comfortable. Essential to solving these problems are the lenses that magnify and focus headsets’ displays. Until recently, this field was relatively static, with headsets using optics based on Fresnel lenses, but a flurry of innovation is now in progress.

Phase 1: Compact Optics and the ‘Holocake’ Lens

Pancake lenses can shrink VR headsets by using polarizers, waveplates, and reflective surfaces to fold the optical path through two glass or plastic lens elements, reducing the distance between the display and lens. This also avoids some optical aberrations associated with the incumbent Fresnel lens technology.

Schematic of path of light through a pancake lens. Source: IDTechEx

Meta’s ‘Holocake’ lens takes the principles of pancake lenses and replaces each element with a holographic optical element (HOE) – a holographic recording of a lens – flattening the optics to the thickness of the holographic film. HOEs are a well-recognized technology in the closely related AR industry: their uses there are covered in detail in IDTechEx’s report, along with forecasts for technology uptake and associated photopolymer materials. The VR industry has paid little attention in comparison – Meta hinted at some of the potential reasons for this when discussing the Holocake prototype.

Holograms are recorded with laser light and need to be lit by laser light to be properly viewable. The same is true of HOEs, and Meta’s prototype Holocake 2 headset uses a laser-backlit LCD to meet this requirement: this is one factor keeping these optics some way from commercial viability.

Furthermore, pancake/holocake setups have low transmission efficiency (generally around 10%). This demands very bright displays and large batteries, with existing commercial pancake-lensed headsets from other manufacturers requiring tethered power. A move toward new display types, such as microLED, could help mitigate this issue.
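The brightness penalty follows directly from the transmission figure. A minimal arithmetic sketch, using illustrative numbers rather than any manufacturer's specifications:

```python
# If the folded optical path passes only ~10% of the display's light,
# the panel must emit roughly 10x the luminance the eye should receive.
# Numbers below are illustrative assumptions, not Meta's figures.

def required_panel_luminance(target_nits: float, transmission: float) -> float:
    """Luminance the display must emit so the eye receives target_nits."""
    return target_nits / transmission

# A comfortable ~100-nit image through ~10%-efficient pancake optics:
print(required_panel_luminance(100, 0.10))  # -> 1000.0 nits at the panel
```

A tenfold luminance requirement is why power draw, and hence battery size or tethering, dominates the engineering trade-off for these optics.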

Phase 2: Solving the vergence-accommodation conflict with varifocal lenses

“I don’t like VR – it makes me sick.” This is a common complaint about virtual reality headsets. According to Professor Thomas Stoffregen, who researches motion sickness at the University of Minnesota, 40 to 70% of VR users experience this unpleasant effect within just 15 minutes. One factor contributing to this is the mismatch between perceived and focal distance in stereoscopic 3D images, known as the vergence-accommodation conflict.
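The conflict can be quantified: a stereoscopic image asks the eyes to converge at the virtual object's distance while the lenses force them to focus at the display's fixed focal plane. A small sketch of the mismatch in diopters, with assumed (not article-sourced) distances:

```python
# Vergence-accommodation mismatch: the eyes converge at the virtual
# object's distance but must focus at the headset's fixed focal plane.
# The 1.3 m focal plane and 0.4 m object distance are illustrative
# assumptions, not specifications of any particular headset.

def mismatch_diopters(vergence_m: float, focal_plane_m: float) -> float:
    """Difference between where the eyes converge and where they must focus."""
    return abs(1.0 / vergence_m - 1.0 / focal_plane_m)

# Virtual object at 0.4 m viewed through a fixed ~1.3 m focal plane:
print(round(mismatch_diopters(0.4, 1.3), 2))  # -> 1.73 (diopters)
```

The mismatch grows rapidly for near virtual objects, which is consistent with close-range interactions being the least comfortable in fixed-focus headsets.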

Eliminating this issue might just open a whole new segment of customers to purchase a new VR device. Meta’s Half Dome project has been working on varifocal lenses for VR since at least 2018. Varifocal lenses, coupled with eye tracking sensors and software, can solve the vergence-accommodation conflict by adjusting the lens’s focus depending on where in the virtual scene the user is looking. The Half Dome project’s latest iteration uses arrays of geometric phase (also known as Pancharatnam-Berry phase) lenses to achieve this goal with no moving parts, all in a package a few mm thick.
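One way a stack of switchable geometric-phase lens elements can deliver many focal states with no moving parts is sketched below. The premise, hedged as an assumption about this class of optics rather than a description of Half Dome's actual design, is that each element contributes either positive or negative optical power depending on the polarization state it is switched to, so N elements yield up to 2^N net focal powers. The element powers used are invented illustrative values:

```python
from itertools import product

# Sketch: each switchable geometric-phase (Pancharatnam-Berry) lens element
# contributes +P or -P diopters depending on its switched polarization state,
# so a stack of N elements gives up to 2**N discrete focal powers.
# Element powers are illustrative, not taken from any Meta prototype.

def achievable_powers(element_powers):
    """Enumerate all net focal powers from switching each element +/-."""
    states = set()
    for signs in product((+1, -1), repeat=len(element_powers)):
        states.add(round(sum(s * p for s, p in zip(signs, element_powers)), 3))
    return sorted(states)

# Three elements of 0.25, 0.5 and 1.0 D -> 2**3 = 8 distinct focal states:
print(achievable_powers([0.25, 0.5, 1.0]))
# -> [-1.75, -1.25, -0.75, -0.25, 0.25, 0.75, 1.25, 1.75]
```

With eye tracking selecting the state nearest the gazed-at depth, the optic can keep focus matched to vergence frame by frame.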

Mark Zuckerberg estimated that varifocal optics could find their way into commercial headsets from around 2027 onward, demonstrating that this technology is closer to market readiness than many previously assumed.

IDTechEx guides your strategic business decisions through its Research, Subscription and Consultancy products, helping you profit from emerging technologies.
