A Mid-Summer Night’s Dream

by Stephen P. Atwood

I woke this morning to the sound of my personal robotic assistant gently reminding me of the time.  It was almost 7:30 and I had to be at my first meeting by 9:00.  My assistant brought coffee to my room and relayed a brief summary of the overnight news headlines to let me know what was happening in the world.  “Excellent!”  I thought as he recounted how many new medals the U.S. had picked up at the 2024 Olympic Games overnight.  After getting dressed, I went downstairs and turned on the 3D projector to watch more sports coverage while I enjoyed my breakfast.  I ruminated about how strange it must have been for people to watch television through little flat windows on the wall instead of having a holo-projection filling the entire room.

“Right on time,” I thought as I checked my bio-feedback watch and moved from the kitchen to my studio office.  Sitting down in my chair, I turned on my computer and started the virtual conferencing application that would carry me to the meeting room at the company headquarters in Portland.  The projector in my office also came to life, and an entire meeting room, including a conference-room table, appeared in front of me.  I watched as other members of my team popped into view at various seats around the table.  Thanks to the light-field camera array pointed at my chair, I also now appeared virtually in front of everyone else at the meeting.  We could all see each other in full size and depth and look around the entire room, making it almost indistinguishable from really being there.  But, in fact, we were all in different places, including China, Europe, the U.K., and the U.S.

The meeting began with a holo-projection of the new product being assembled on the production line.  I could see every detail of how the assembler was building the product and what issues he encountered.  It was not going well enough and was taking too long to complete.  We went around the room brainstorming about how to make the assembly easier and increase production to the critical target the GM wanted us to hit.  After we collected the ideas, our animation artist used them to rebuild the original assembly sequence, and then we could see exactly how the new assembly would flow based on our changes.  A few more bugs needed to be worked out, and then the final sequence was sent instantly to the factory floor, where the assembly team was waiting to watch the new projection.  It took about 10 more minutes for all the assembly procedures to be electronically re-imaged, and then we could see that we had solved the problem.

Just as the meeting was breaking up, my wife messaged to say that our son, the gymnast, was just starting his morning warm-ups.  He had a meet later that day that I was going to attend, but I needed to work that morning.  So, she used her portable light-field camera to capture his routines, and they appeared on my office holo-projector, replacing the conference room that had been there moments earlier.  As I finished my morning reports, I could keep an eye on him and even message back some “helpful” suggestions that made him wave back in embarrassment.

The weekend before, our daughter had played her first soccer game, and the entire game was recorded with light-field cameras.  After the game, the coach brought us all back to a holo-studio where we could replay all the critical moments of the game, literally walk onto the field in the projection, and observe the plays from different angles.  She could see and interpret her own footwork, understanding much better how to improve her game.  We’re hoping for a championship season this year.

It was about noon, and I was getting ready to leave for the meet.  I thought I would take the self-driving car we had just bought so I could work some more during the long drive.  It was a little extravagant for our budget, but since they added the light-field camera arrays to the technology package, it’s been practically foolproof and really much safer than my own driving anyway.  The combination of full situational awareness in these cars and the neural network deployed on all major U.S. highways had really transformed driving and vehicle safety.  The kids love it because the holo-projection turns the inside of the cabin into a virtual movie studio, and they even forget they are riding in a car.

As I slid into the seat, I felt something wet on my cheek.  Wet?  Huh?  It’s not raining.  It feels like dog slobber … hey, where am I?  Ugh, I’m in bed and the dog wants to go outside.  It’s 2016, and I do not have a robot, or a studio, or a holo-projector.  And I have a 60-mile commute in my human-driven car, and I need to get going … Sigh.

OK, so my imagination might have gotten a bit ahead of the technology, but as you will see in our issue this month, the light field is the place to be, and we really are heading for a radical transformation in how we capture, view, and interact with images in the real and virtual worlds.  As our hard-working guest editor for this month, Dr. Nikhil Balram, says in his guest editorial, “From the earliest days of history, humans have attempted to capture and display images ... .  What is striking about this long history of capture and display is that almost all of it is based on two-dimensional imagery.”  We have long been trying to capture and render the world in three dimensions, the way we experience it in real life, but each attempt has failed to gain mass adoption.  Well, finally we can start to make this dream real, using the light field the way our human visual system does.  It will not happen overnight, and my dream of life in 2024 may be a bit optimistic, but given the recent advances artfully described by Nikhil in his Frontline Technology article, “Light-Field Imaging and Display Systems,” the science and technology already being developed give me a lot of hope.

Even without holographic projectors, we can begin to imagine creating virtual worlds for a single observer using head-mounted displays.  In our second Frontline Technology feature on this topic, titled “Recent Advances in Head-Mounted Light-Field Displays for Virtual and Augmented Reality,” by Professor Hong Hua, we learn about several methods beyond stereoscopic imaging that can be employed to create a true sense of depth and parallax for the observer.  With this and the earlier-mentioned article by Dr. Balram, I hope this issue becomes a go-to reference for you on this topic for a long time to come.

Closely aligned with the topic of light fields is the world of augmented and virtual reality, which has been more the stuff of science fiction than fact.  But things are progressing rapidly, and that’s why SID chose to make it a special topic focus track at Display Week this year.  This AR/VR progress, along with other interesting information, is compiled for you in our “Display Week 2016 Show Daily Highlights” article this month.  Contributing authors Achin Bhowmik, Jyrki Kimmel, Steve Sechrist, and Ken Werner give you a taste of the highlights ahead of the full show-issue coverage coming next month in Information Display.

Ahead of Display Week this year, the Bay Area SID chapter held a one-day marathon technical conference for people looking to catch up on important topics within the field.  Topics ranging from OLEDs, QLEDs, lasers, and e-Paper all the way to black swans and microdisplays filled the day with information, opinions, and creative ways of looking at our field.  It sounds like it was a great event, and one that I am sorry I missed, so we sought input from Sri Peruvemba, Paul Semenza, and John Wager, who, along with our own Jenny Donelan, put together this innovative conference review titled “One Day, Sixteen Speakers, Innumerable Insights.”  Both the event itself and the way we tried to cover it are different from what you may be used to, so I hope you enjoy this article.

Ray Soneira is well known in our industry and a great supporter of both display measurement technology and the things that can be done with it, as evidenced by the work he has done through his company, DisplayMate Technologies Corporation.  We always appreciate Ray’s insight and balanced opinions.  He works hard to gather objective data on the performance of so many sizes, brands, and types of displays.  This month, we welcome Ray back to the lineup with a Frontline Technology article on the subject of color gamuts titled “Display Color Gamuts: NTSC to Rec.2020.”  In this article, Ray describes the history of color gamuts as they have been created and adopted, along with the role they play in producing high-performance display products.  It’s a tricky subject because almost anyone close to the field has an opinion, and those opinions can differ significantly.  However, regardless of your point of view, I think we can all agree that the subject can be confusing and that it is rarely presented accurately and in the correct context by the commercial world.  So, we welcome Ray back with his thorough treatment of this topic and thank him for his efforts to help clear the fog a little.

Along these lines, I will make one editorial comment of my own.  Having been personally involved in developing color-matching programs for displays and having dealt at length with the challenges of incorrect color rendering caused by variations in display hardware, I would love to see the day when the entire concept of encoding RGB gray-level pixel values is set aside in favor of native methods such as those being proposed by Dolby and others.  If the display itself knows what color gamut it can render, and the video content arrives with each pixel encoded by x, y or u’, v’ chromaticity and absolute luminance, then we have a much better chance of the display device rendering the content as close to its original intent as possible, provided, of course, that the gamut of the display is suitably large.  But regardless of what encoding scheme is used, color gamuts are fundamental to displays.  Understanding what they mean and how they are used is crucial to designing a suitable display information system.
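
To make that idea a little more concrete, here is a minimal sketch, in Python, of how a display that knows its own primaries, white point, and peak luminance might map a pixel delivered as chromaticity (x, y) plus absolute luminance into its native linear RGB drive levels.  This is not taken from Ray’s article or from any particular proposal; the Rec.709 primaries, the 400-cd/m² peak, and the function names are illustrative assumptions, and a real implementation would replace the final clipping step with proper gamut and tone mapping.

import numpy as np

# Hypothetical display description: Rec.709 primaries, D65 white, 400 cd/m^2 peak.
PRIMARIES = {"r": (0.640, 0.330), "g": (0.300, 0.600), "b": (0.150, 0.060)}
WHITE = (0.3127, 0.3290)
PEAK_LUMINANCE = 400.0  # cd/m^2

def xyY_to_XYZ(x, y, Y):
    """Convert CIE xyY to XYZ; Y carries the luminance."""
    if y == 0:
        return np.zeros(3)
    return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

def rgb_to_xyz_matrix(primaries, white):
    """Build the display's RGB-to-XYZ matrix from its primaries and white point."""
    P = np.column_stack([xyY_to_XYZ(px, py, 1.0) for px, py in
                         (primaries["r"], primaries["g"], primaries["b"])])
    W = xyY_to_XYZ(white[0], white[1], 1.0)
    S = np.linalg.solve(P, W)   # scale each primary so R = G = B = 1 reproduces the white point
    return P * S                # column-wise scaling

def render_pixel(x, y, luminance):
    """Map one (x, y, absolute-luminance) pixel to the display's linear RGB drive levels."""
    M = rgb_to_xyz_matrix(PRIMARIES, WHITE)
    xyz = xyY_to_XYZ(x, y, luminance / PEAK_LUMINANCE)  # normalize to the display's peak
    rgb = np.linalg.solve(M, xyz)
    return np.clip(rgb, 0.0, 1.0)   # crude stand-in for real gamut/luminance mapping

# Example: a saturated green patch at 80 cd/m^2
print(render_pixel(0.30, 0.60, 80.0))

The point of the sketch is simply that once the content carries device-independent chromaticity and absolute luminance, the conversion to whatever the panel can actually do becomes the display’s own, well-defined responsibility.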

And so with that I’ll sign off and wish you a great summer season.  Oh, and about that personal-assistant robot – the last time I saw him he was visiting San Francisco and we snapped a picture of him for the cover of our July 2015 issue.  After that he said he was going to tour the world and he has not come home yet.  If you see him, ask him to call or write – he has my credit card.  •