Environment & Engineering

MRAI: better light estimation for smartphone MR


Mixing real and virtual worlds

Mixed Reality (MR) creates an environment that allows both physical and virtual objects to exist and interact in real time. But for MR users to have a believable experience, that environment must have lighting, shadows and reflections that are just as realistic as the real world around them.

Traditionally, this requires omni-directional data from a 360-degree camera, or a pre-made High Dynamic Range (HDR) environment map. Neither is readily available to the average consumer, who most likely uses a smartphone or a head-mounted display (HMD) to create their MR experiences.

Because smartphones are not equipped with 360-degree cameras, they capture only a partial view of the user's surroundings. This limited view does not provide enough information to identify directional lights or to build realistic reflection maps, which leads to a less immersive MR experience.

To address this problem, researchers at Victoria University of Wellington have developed a novel method for taking the limited input of a phone or an HMD and inferring what the full lighting environment should look like, as close to real time as possible. Virtual objects can then be composited into a real-life scene complete with realistic lighting, shadows, and reflections, so that content developers can blend the synthetic and real worlds seamlessly for a fully immersive, interactive MR experience on a smartphone.
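To illustrate what that compositing step involves, the Python sketch below shades a virtual Lambertian sphere with a placeholder estimated light and blends it over a camera frame. The light direction, colour, ambient term, and object albedo are hypothetical stand-ins for the output of a light-estimation step; this is not the MRAI method, only a minimal picture of how an estimated light feeds into compositing.

import numpy as np

H, W = 240, 320
frame = np.full((H, W, 3), 0.35, dtype=np.float32)         # stand-in camera frame
light_dir = np.array([0.4, 0.7, 0.6], dtype=np.float32)    # assumed estimated dominant light
light_dir /= np.linalg.norm(light_dir)
light_rgb = np.array([1.0, 0.95, 0.85], dtype=np.float32)  # assumed estimated light colour
ambient = 0.15                                              # assumed ambient term

# Rasterise a virtual sphere at the image centre and compute camera-space normals.
ys, xs = np.mgrid[0:H, 0:W].astype(np.float32)
r = 60.0
dx, dy = (xs - W / 2) / r, (ys - H / 2) / r
mask = dx ** 2 + dy ** 2 <= 1.0
nz = np.sqrt(np.clip(1.0 - dx ** 2 - dy ** 2, 0.0, 1.0))
normals = np.stack([dx, -dy, nz], axis=-1)

# Lambertian shading with the estimated light, then composite over the camera frame.
ndotl = np.clip(normals @ light_dir, 0.0, 1.0)
albedo = np.array([0.8, 0.2, 0.2], dtype=np.float32)       # virtual object colour
shaded = albedo * (ambient + ndotl[..., None] * light_rgb)
composite = np.where(mask[..., None], np.clip(shaded, 0.0, 1.0), frame)
print(composite.shape)                                      # (240, 320, 3)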

Features and benefits

Realistic light estimation

The MRAI algorithm evaluates partial-view frames or images to generate realistic reflection maps and detect salient lights in close to real time.
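For a concrete sense of what detecting a salient light from a partial view means, here is a minimal sketch that back-projects the brightest pixel of a frame through a pinhole camera model to get a light direction. It is a crude heuristic for illustration only; the function name, field-of-view parameter, and approach are assumptions, not the MRAI algorithm.

import numpy as np

def estimate_salient_light(frame: np.ndarray, fov_deg: float = 60.0) -> np.ndarray:
    """Return a unit direction (camera space) toward the brightest region of a frame.

    A brightest-pixel heuristic standing in for a learned light detector.
    """
    h, w, _ = frame.shape
    luma = frame @ np.array([0.2126, 0.7152, 0.0722])       # per-pixel luminance
    y, x = np.unravel_index(np.argmax(luma), luma.shape)
    # Back-project the pixel through a pinhole model with the given horizontal FoV.
    f = (w / 2) / np.tan(np.radians(fov_deg) / 2)
    d = np.array([x - w / 2, -(y - h / 2), f], dtype=np.float64)
    return d / np.linalg.norm(d)

# Tiny usage example with a synthetic frame containing one bright spot.
frame = np.zeros((120, 160, 3))
frame[20, 130] = [1.0, 1.0, 1.0]
print(estimate_salient_light(frame))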

Faster, better mapping

Current models generate a single environment map suited to highly reflective materials; that map must then be convolved to suit materials with higher roughness, an extra step that causes delays. The MRAI tool's algorithms are progressively trained to provide multiple reflection maps at all roughness levels simultaneously, creating a higher-quality MR experience.
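The toy sketch below contrasts the two approaches: convolving a single environment map on demand for each material roughness, versus prefiltering a small set of roughness levels up front so rendering only needs a lookup. The prefilter here is a simple separable blur standing in for a proper cosine-lobe convolution, and the function names and level choices are illustrative assumptions rather than the MRAI implementation.

import numpy as np

def prefilter(env: np.ndarray, roughness: float) -> np.ndarray:
    """Blur an equirectangular map; blur width grows with roughness
    (a stand-in for a proper cosine-lobe convolution)."""
    k = max(1, int(roughness * 16) | 1)                     # odd kernel width
    kernel = np.ones(k) / k
    out = np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"), 1, env)
    return np.apply_along_axis(lambda col: np.convolve(col, kernel, mode="same"), 0, out)

env = np.random.rand(64, 128)                               # toy environment map

# Single-map approach: convolve on demand for every roughness encountered (slow path).
on_demand = {r: prefilter(env, r) for r in (0.1, 0.5, 0.9)}

# Multi-level idea as described above: build maps for several roughness levels once,
# so rendering only needs a lookup and no further convolution.
LEVELS = (0.0, 0.25, 0.5, 0.75, 1.0)
precomputed = [prefilter(env, r) for r in LEVELS]           # done once

def reflection_map(roughness: float) -> np.ndarray:
    """Pick the nearest precomputed level (a real renderer would interpolate)."""
    i = int(round(roughness * (len(LEVELS) - 1)))
    return precomputed[i]

print(reflection_map(0.6).shape)                            # (64, 128)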

More accurate, less intensive processing

Eliminating the extra convolution step means fewer errors and less intensive processing.

Next steps

Wellington UniVentures is currently working with the project team to explore opportunities with mixed reality hardware and development tool businesses who may benefit from incorporating the technology into their software development kits.

The team is also seeking companies to test their light estimation method and discuss other MR innovations such as materials estimation.


For more information, please contact the Commercialisation Manager below.

Ela Romanowska

Senior Commercialisation Manager
