How Light Field Makes Holograms Happen

David Fattal is the co-founder and CEO of Leia Inc., which came out of stealth in 2018 with a deal to put its remarkable display technology in RED's Hydrogen phone. Automotive giant Continental made a deal to create naked-eye holograms in cars. The company was spun out of HP Labs, where Fattal discovered the application and developed it with other scientists, including co-founder and Leia Inc. CTO Zhen Peng. In this interview with Charlie Fink, Fattal explains how it works.

David Fattal, co-founder and CEO of Leia Inc. Fattal was named Innovator of the year 2013 by the MIT Technology Review. LEIA, INC.

CHARLIE FINK: Let’s start with the amazing foundational myth about your company.

DAVID FATTAL: The full story is that we were working on a project called Optical Interconnects, which uses nanostructures and the manipulation of light on a chip, a wafer, to communicate information instead of electricity.

We were caught in a fire drill and had to leave the lab with everything we had in our hands. We all gathered in the parking lot.

It was a bright, sunny day. The sun was acting pretty much like a laser beam, with very directional light. As it hit the surface of the wafers, we saw all kinds of cool patterns emerge, which was due to the directionality of the structures. At first, we didn't even notice what was happening until the people around us were like, "Hey, that's super cool, what do you have in your hand?" So that's how it all started.

From then on, we spent all our so-called “20 percent time” devoted to that project, and more and more people wanted in.

We were able to project this light field, or this hologram, from a completely transparent piece of material. So it looks magical, right? You turn it off, it’s just transparent. You turn it on, and the hologram pops up all by itself.

How Leia works. Leia, Inc.

CHARLIE FINK: What are the fields of view that it has at the present time?

DAVID FATTAL: The field of view is entirely configurable. I want you to imagine a forest of light rays, and we can control exactly where each beam or ray comes from — which direction, and from which kind of angular width. We have full control over these parameters. So we can make a very, very narrow field of view, for privacy, or you can make a very wide field of view, and anything in between.
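The configurability Fattal describes can be illustrated with a toy calculation (the parameter names and numbers below are assumptions for illustration, not Leia's actual specifications): if each pixel emits a number of discrete directional beams, the total field of view is roughly the number of beams times the angular width of each one.

```python
# Toy geometry, not Leia's real parameters: if a pixel emits `num_views`
# discrete beams and each beam spans `angular_pitch_deg` degrees, the
# total field of view is roughly their product.
def field_of_view(num_views: int, angular_pitch_deg: float) -> float:
    return num_views * angular_pitch_deg

narrow = field_of_view(4, 2.5)    # a narrow, privacy-style cone: 10.0 degrees
wide = field_of_view(16, 5.0)     # a wide, multi-viewer cone: 80.0 degrees
```

Narrow beams concentrated toward one viewer give privacy; many wider beams spread the image across a room, with anything in between.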

CHARLIE FINK: The demo of Leia I saw was the Disney movie Coco, which is a 2D movie, although it was made using 3D techniques, playing on a tablet, which I assume is a reference design for an OEM. When I saw the video on your display, even though it was a 2D movie, it appeared to be a 3D movie. Does Leia do that to all 2D content, or was it specific to a product like Coco, because it was made with 3D, even though its presentation format is typically 2D?

DAVID FATTAL: So that's the question of the data format. Before I answer that question, we have to understand what a Lightfield display is and what is coming out of the screen. Then, I can tell you why you see things in 3D. So far, we've talked about light field capture, and that's what popularized the light field as a medium. But obviously it's very asymmetric, and you know, once you've captured your light fields, you want to be able to render them, right?

Imagine you have this very fancy camera that captures light rays from different directions on different pixels. Up to very recently, you didn’t have the opportunity to actually re-render these light rays. Now, with Lightfield technology, you can have a display that is able to, from a given pixel, give you different colors and different intensities of light from different directions of space. In a normal display, one pixel is going to emit the same information for all viewers so they’re seeing the same content at a given pixel. Now, the Lightfield display is going to break that down into different zones, sending you different colors and information from different angles. That’s really the color part of the light field capture, and now you can even re-render the light field from a flat surface.
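The distinction Fattal draws between a normal display and a Lightfield display can be sketched in a few lines of code. This is a minimal illustrative model (the array shapes and function names are my assumptions, not Leia's implementation): each pixel stores a small grid of directional samples, a conventional display collapses them into one color, and a light field display hands each viewing direction its own sample.

```python
import numpy as np

# Model a light field as a 5D array: for each pixel (x, y), a small grid
# of (u, v) directional samples, each holding an RGB color. Illustrative
# sketch only -- not Leia's actual data layout.
H, W, VIEWS_U, VIEWS_V = 4, 6, 3, 3
rng = np.random.default_rng(0)
light_field = rng.random((H, W, VIEWS_U, VIEWS_V, 3))  # RGB per direction

def sample_conventional(lf):
    """A normal display: every direction sees the same color per pixel."""
    return lf.mean(axis=(2, 3))  # collapse the directional axes away

def sample_lightfield(lf, u, v):
    """A Lightfield display: the color depends on the viewing direction."""
    return lf[:, :, u, v, :]

left_eye = sample_lightfield(light_field, 0, 1)   # one viewing zone
right_eye = sample_lightfield(light_field, 2, 1)  # a neighboring zone
# Each eye sits in a different zone and so receives a different image,
# which is the binocular cue that produces the sense of depth.
```

The "zones" here are the per-pixel directional samples: two eyes at different angles land in different zones and receive different images, which is exactly the information a conventional display throws away.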

This can best be explained by imagining you have a window in front of you. Now imagine that, similar to a camera, the window will capture all of the light rays coming in from outside. If the window was able to re-emit these light rays, that would be a Lightfield display.

The image that started it all. Leia, Inc.

CHARLIE FINK: So the re-imaging is what makes it seem three-dimensional?

DAVID FATTAL: Yes, that’s correct.

CHARLIE FINK: Is that because the colors create different wavelengths of light?

DAVID FATTAL: Yes. Essentially, what you experience when you see a Lightfield display is your two eyes picking up different information. They will be picking up the correct information as if it was coming from the real world. Since the real world is in 3D, you will also pick up depth. You'll pick up the subtle variations of the lighting that make a texture look like a texture, that make metal look like metal, that make skin look like skin, that make diamonds look sparkly, and so on and so forth.

CHARLIE FINK: Would it work for any 2D video from old black and white films to modern movies?

DAVID FATTAL: Yes. The movie Coco that you saw was actually already filmed in 3D stereo, side by side, so it has two views. Using our software at Leia, you can actually hallucinate the missing point of view. It does this by reconstructing the light field from very sparse kind of information and recreating the missing point of view to the best of its ability.

Aside from movies, we are also able to do this with simple 2D pictures. During the demo, we also showed you Holopix, which is our picture-sharing app. On the platform, a lot of people are uploading 3D stereo content to post publicly to their followers. We use that content to train a very big neural network. Thanks to that content and the ability of our neural network to digitally synthesize different light rays, today, we’re at the point where we have an excellent technology that can create holographic content for most 2D pictures on a Lightfield display, whether that’s a portrait shot, a landscape, food or otherwise.

Once you can convert 2D pictures, the next step is to convert them fast enough that you can actually create Lightfield content from any type of 2D file format, such as 2D video. Right now, that's what we're working on: taking any kind of 2D content and converting it to light field content on the fly.

CHARLIE FINK: That is just so amazing, and this story is so mind-blowing.

DAVID FATTAL: It’s a lot of fun for the team, I can tell you that.

CHARLIE FINK: Once or twice a year somebody comes to me with something that is both real and mind-blowing. Thank you.

Originally published at https://www.forbes.com.

Written by

AR/VR Consultant, Columnist, Author of the AR-enabled books “Metaverse, A Guide to VR & AR” (2018) & “Convergence” (2019). http://forbes.com/sites/charliefink
