OLED TVs could look quite different in a few years’ time. While OLED is generally considered the big TV tech of the moment, with its ‘infinite’ contrast ratio, color accuracy, and deep blacks, there’s no denying that it won’t stay top dog forever – at least, not without further innovations that go beyond picture quality alone.
Not long after we heard from Panasonic that the picture quality of its OLED sets had pretty much peaked, we got the opportunity to speak with Michael Helander, CEO and co-founder of OTI Lumionics, about the technical advancements that could upend the way we interact with OLED TVs in the future – including baked-in sensors for tracking our eyes, heads, and hands.
First, some background: OTI Lumionics is a specialty materials company, one that works across the electronics industry to, as its CEO puts it, “help drive new user experiences” beyond the iterative improvements we usually see in TV makers’ annual product ranges.
It receives a large amount of investment from LG Display, which is the only manufacturer of OLED panels for TVs, and clearly has a big stake in driving innovation in the space.
“We’re really only focused on solutions to new user experiences, rather than, you know, helping to drive small incremental improvements to make your display slightly brighter, or slightly reduce power consumption,” says Helander. “That’s just not interesting from a user perspective.”
A whole new world
New TV ranges often disappoint in their lack of big, substantive changes – in the 2021 LG TV range, for example, the major technical changes seem to be the introduction of a tripod TV stand and OLED evo technology designed to make the Gallery Series OLED look marginally brighter. But what exactly can OTI, or any of the major players in the TV space, actually innovate in?
In Helander’s opinion, the most exciting development in OLED is “the integration of the display” with “different types of sensors and cameras.”
This is something we see far more commonly in the world of smartphones, where the need for embedded front-facing cameras and facial recognition sensors has to grapple with a desire for all-screen, bezel-banishing designs. With the exception of so-called ‘notch’ cameras, where else can these things go, if not under the screen itself?
Helander explains that “in mobile phones we’re starting to see the first generation of products with under-display cameras, directly integrated into the display and kind of hidden under active pixels. We’re anticipating some future products with under-display Face ID and other types of IR sensors too.”
But as we figure out how to do the same with OLED TVs, whole new opportunities for interfacing could open up. Helander tells us that “the ability to have a camera or built-in IR sensors that can track your eyes, your face, your hands… It opens up new possibilities for user interaction with the display and the content.”
It’s OTI’s developments in cathode patterning, in particular – the process by which negatively charged electrodes are latticed behind an OLED panel to channel electricity towards it – that are enabling under-display sensors, allowing for “total design freedom” when deciding how this layer of cathodes is made and arranged, and what can be placed in the gaps between them.
Why does this matter?
It’s unlikely that this will have much of an immediate impact on your souped-up home cinema, though Helander suggests that head-tracking could lead to spatial audio functionality, “the way that Apple has done with the AirPods Pro and AirPods Max, by tracking the position of your head. You could do the same thing in a home cinema setting, at least for an individual watching TV.”
It’s possible, too, that this could have an impact in the world of gaming, with sensors able to track your eyes, head, or gestures for navigating an in-game environment – as with the Microsoft Kinect camera for Xbox 360.
Helander admits the main use case for a TV screen packed with sensors and cameras is probably in “an office setting, for video conferencing […] so that when you’re looking at someone on the screen, you’re actually looking at their eyes, they’re looking at your eyes, and it becomes a much more engaging experience.”
“When you think about the size of the TV,” says Helander, “that’s a really large area where you could put quite a lot of different sensors, even starting to get some depth mapping data, and other types of 3D scanning, just by having multiple sensors placed behind it this way.”
Of course, manufacturers will still have to pack those sensors in somewhere.
“With some of the designs you see from Sony, there’s still a large box of electronics mounted at the back of the panel, in the center region, where you could accommodate a lot of those sensors, and still have the edge of the panel be very thin.
“Or, if you look at the ‘wallpaper-thin’ LG WX OLED, that may be a little bit more difficult to integrate certain types of sensors and optics – but of course people are getting better and better at squeezing down the space that all of these types of sensors need for smartphones, as there are limitations on just how much physical space you have in a very slim phone. So again I think a lot of the technology there is going to get adopted into the largest size panels.”
The exact uses of these sensors are unclear at the moment, and we expect it will be quite a few years before they come baked into the average home television. It’s obvious, though, that new ways of interfacing with our devices are emerging, and it’s only a matter of time before a mainstream product makes a strong case for them.