I got my first flat panel TV in 2006, a 37-inch Vizio. I set it up and installed it myself, which is pretty slick considering it took three guys to haul out the 95-lb., 32-inch JVC tube TV monster it replaced. I felt like I was on the bleeding edge of technology then, faced with tough first-world decisions like whether to watch The Office or Dexter in HD on cable, whether Netflix should deliver two or three DVDs to my mailbox in just two days flat, whether I should snap up Nacho Libre on HD-DVD or Blu-ray Disc.
It’s almost comical, isn’t it? We’ve come such a long way since then. Today, I internally debate streaming Game of Thrones or The Walking Dead, OLED vs. Quantum Dots, HDR10 vs. Dolby Vision, and whether to embrace Ultra HD Blu-ray or bank on Netflix and Amazon.
In another decade, we’ll be chuckling at how dated 2016’s state of the art was. Future us will wonder why we resisted VR for so long, how we didn’t see holographic TV coming, and why we clung to 4K with 8K already infiltrating our homes. But to see where we’re headed, it’s essential to look back at where we’ve been and how we got here: the entertainment wonderland we call TV is transforming around us so thoroughly that we’re going to have to learn to call it something else entirely.
Bitter rivalry, brilliant innovation
The TV you buy in 2016 is far brighter, thinner, cheaper, and more colorful than anything you could have hoped to find 10 years ago, plus it will play a massive library of movies and full seasons of TV shows simply by connecting to the Internet. That started with some fierce competition — and the cojones to take a big step backward before forging forward.
South Korean rivals Samsung and LG weren’t always household names. Like Korean automakers Kia and Hyundai, Samsung and LG not only had to compete against Japanese titans like Sony and Panasonic, they also had to compete against each other for brand recognition and sales. And there’s no better public showcase for that battle than the CES consumer electronics show, held in Las Vegas each January.
At the show, even to this day, those working behind the scenes can watch both companies scrambling to assemble some top-secret, bleeding-edge TV in the hours before the doors open. The payoff is always worth the struggle, though; walk by Samsung’s or LG’s display area and you’ll see hundreds of people at a time gathered around the latest innovation. And it all started by putting a superior TV technology to rest so they could concentrate on improving an inferior one.
In 2006, and for eight years after that, plasma televisions had visibly superior picture quality. Plasmas had far better black levels — the basic foundation of contrast and an essential element to picture quality. By comparison, LCD TVs looked gray and milky. But as important as picture quality is for a TV, it turns out other elements were more significant to TV buyers.
Plasma TVs were heavy, cumbersome, energy-hogging, heat-generating beasts; they couldn’t get very bright, and they struggled to perform well in daylight. They worked for videophiles, but at a time when the love affair with the flat panel was far from over and buyers wanted ever more futuristic-looking TVs, LCD televisions had far more potential.
Ten years ago, LCD televisions were already thinner, lighter, and brighter than plasmas, and breakthroughs in LED technology promised to take LCD TVs to the next level. Plasma was eventually laid to rest as brands pushed to make a better LCD TV.
What took LCD TV to the next level was the LED. These tiny, bright light-emitting diodes quickly replaced the cold cathode fluorescent lamps (CCFLs) used as backlights in televisions, and by 2010 seemingly every TV was an “LED TV.” Even though these were still just LCD TVs, albeit more uniformly bright ones, the marketing wizards used this to their advantage to generate excitement about the new breed.
With LEDs now the de facto standard in TV technology, manufacturers felt pressure from reviewers and videophiles to abandon the thin-is-in fad, and get back to picture quality. Every notable name in the industry went to work to make their premium LED TVs mimic the look and feel of plasma, and some would see success. But LG and Samsung had something entirely new up their sleeve: OLED TV.
Sticking an “O” in front of LED sounds like yet more marketing hype, but in fact it stands for something very significant. Organic Light Emitting Diodes are nothing like their conventional inorganic brethren. Think of them as tiny cells filled with organic compounds that glow when electricity is applied. These cells are so small that they can function as pixels all on their own, and because they stop glowing when turned off, they go completely, utterly black. And since they brighten on their own, OLED TVs don’t need a backlight at all. They are impossibly thin, brilliantly colorful, and simply mesmerizing to behold. Many reviewers have joined me in calling them the best TVs ever made.
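That “completely, utterly black” quality is why contrast measurements favor OLED so lopsidedly. Contrast ratio is simply peak white luminance divided by black luminance, so as the black level approaches zero, the ratio runs away toward infinity. A minimal way to express the idea:

```latex
% Contrast ratio in terms of luminance L (measured in nits)
\[
  \text{contrast ratio} = \frac{L_{\text{white}}}{L_{\text{black}}},
  \qquad
  \lim_{L_{\text{black}} \to 0^{+}} \frac{L_{\text{white}}}{L_{\text{black}}} = \infty
\]
```

That is why OLED contrast is often quoted as “infinite,” while even an excellent LCD, whose backlight always leaks a little light, tops out at a finite number.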
Unfortunately, OLED TVs are hard to make, which is why Samsung and Sony decided to bail out of manufacturing them indefinitely. Meanwhile, LG pressed forward with its own technology and since then it’s been LG and its OLED TVs against everyone else and their increasingly brilliant LED/LCD TVs. It’s not exactly a fair fight.
The fight for smarter TV
When Netflix launched its streaming service in 2007, it became apparent TVs would have to get smarter. They would need to act more like computers and less like dumb terminals. The race was on to create a smarter TV capable of running apps just as smartphones and tablets could. Today, we have a virtually unlimited number of movies and TV shows at our fingertips — no disc, cable, or satellite required, just an internet connection.
“As more and more consumers turned to secondary devices like tablets to view content, we knew we needed to create a similar experience on the TVs,” says Dave Das, senior vice president of Home Entertainment at Samsung. “Consumers’ habits have changed, and it quickly became important to make it easier for them to share content across multiple devices, to seamlessly go from a personal viewing experience on their tablet to a shared experience on the TV, for example, or to start content from one device and continue watching on another.”
“Samsung was the first to offer Netflix via our Blu-ray players in 2008, but to us, Smart TV offered so much more potential. That same year, we were the first to offer a service called InfoLink, which was an RSS service delivering weather, news, and stock quotations,” Das said.
Thanks to all that computing power, TVs are getting easier to use, sometimes recognizing components connected to them automatically, with the ability to control them all with just one remote. They can even make recommendations on what to watch based on viewing habits, and provide search results based on actors, genres, and time periods. You can even talk to your TV today instead of pecking out characters on a virtual keyboard.
While the TV industry appeared to be focused on incremental hardware and software improvements, it secretly had an entirely new TV format simmering on the backburner, and in November 2012, it was served up to U.S. consumers on a massive platter: 4K TV.
4K Ultra HD
4K Ultra HD television was initially defined as having four times the resolution – or pixel density – of 1080p HD. Imagine that for any given screen size, where there was just one pixel, there were now four. It sounds like a quantum leap forward, and it sure looked that way at first. It was also an easy way to move products, explains Geoff Morrison, a well-respected reviewer, tech writer, and book author.
“Resolution, i.e., more pixels, was always the easiest route for TV manufacturers. Not only was it easier than, say, greater contrast or better color, but it was an easy sell marketing-wise. 4K is greater than 1080p, and therefore better! Sold,” Morrison told Digital Trends.
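The arithmetic behind that pitch is easy to check. Here’s a quick sketch in Python, using the standard 1080p and 4K UHD pixel grids, showing where the “four times” figure comes from:

```python
# Total pixel counts for the two standard TV resolutions
full_hd = 1920 * 1080   # 1080p: about 2.07 million pixels
ultra_hd = 3840 * 2160  # 4K Ultra HD: about 8.29 million pixels

print(f"1080p:  {full_hd:,} pixels")
print(f"4K UHD: {ultra_hd:,} pixels")
print(f"Ratio:  {ultra_hd / full_hd:.0f}x")  # -> 4x: twice the pixels in each direction
```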
LG was the first to put a 4K TV on the market in the U.S., an 84-inch behemoth that sold for around $20,000.
It was all very exciting at the time. TVs this large had never looked so good, in part because it’s easy to see pixels at that scale, but also because such pristine 4K footage had never been shown to the public on a commercial TV before, let alone one so large. Eventually, though, the magic would wear off. The smaller the screen got, the less visible all those extra pixels became, at least from a normal seating distance.
“I was one of the most vocal 4K TV critics, a pariah to a degree, blacklisted from reviews and lambasted online,” Morrison said. “The frustrating thing is my position was usually misquoted.” His argument was never that 4K was worthless, only that at the screen sizes and seating distances most people actually use, the extra pixels are nearly impossible to see.
Beyond that was the standards issue. Much was still left to figure out about 4K, but in their haste to offer something new, be the first, and do it bigger and better than the other guy, TV manufacturers kicked out 4K TVs well before the standards meant to define the format had been settled.
The Ultra HD Alliance, a consortium of manufacturers, content creators, and other industry professionals, has since created a standard for premium 4K content and displays, dubbed Ultra HD Premium. Its two marquee requirements, High Dynamic Range (HDR) and Wide Color Gamut (WCG), represent a significant leap forward for 4K Ultra HD’s picture quality. HDR is the easier of the two to appreciate: brighter highlights and deeper blacks make for a dramatic boost in contrast.
Wide Color Gamut is a little less obvious, but it still has a big impact. Until now, TVs have essentially been painting with a box of crayons, one of those small, 16-count packs. Modern Ultra HD is working with one of those megapacks, with more shades than you know what to do with. The number of colors grew by millions, now reaching more than 1 billion hues — colors that simply weren’t possible before.
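The crayon-box math comes down to bit depth. Strictly speaking, the wider gamut comes from new, more saturated color primaries, while the billion-hue figure comes from moving from 8 to 10 bits per color channel. A quick sketch of that arithmetic, assuming the standard three-channel RGB encoding:

```python
# Number of distinct colors a display can address at a given bit depth
def color_count(bits_per_channel: int) -> int:
    shades = 2 ** bits_per_channel  # shades per channel (R, G, or B)
    return shades ** 3              # every combination of the three channels

print(f"8-bit:  {color_count(8):,} colors")   # 16,777,216 (~16.7 million)
print(f"10-bit: {color_count(10):,} colors")  # 1,073,741,824 (~1.07 billion)
```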
Why wasn’t all this included with the first 4K TVs? The technology just wasn’t there yet. OLED TVs were ready, but LED/LCD TVs weren’t. Not until quantum dots came along.
How quantum dots work is a yearlong science lesson of its own – unless you read our primer here. Suffice it to say that quantum dots help LED/LCD TVs do a more efficient job, enabling them to produce much brighter images, with much more color volume as well. Never before have LED/LCD TVs been able to compete so closely with OLED TVs.
TV hardware technologies – what makes up a TV set itself – have a number of interesting paths to follow. But the truth is, what we’re used to calling television may change so much that we can’t really call it TV at all. That’s because, increasingly, people aren’t watching TV on a television.
The TV of the future: VR?
Virtual Reality (VR) isn’t just for traveling to exotic locations or shooting aliens in space; it could be the next frontier for watching movies, episodes of your favorite shows, and other kinds of video.
Netflix and Hulu each launched VR apps to coincide with the Samsung Gear VR headset, the Oculus store has a number of movies to watch in VR, and when Google introduced its Daydream VR headset recently, the company made it clear that enjoying video content from the Google Play Store, YouTube, and Netflix was a big part of its appeal. And while Sony hasn’t announced any official plans, it’s not a stretch to imagine that the recently released PlayStation VR could be made to work with Sony’s PlayStation Vue online TV service.
Live sports have already dipped a toe in. “We realized you could do storytelling, you could do longer form, you could do something simple. You could cut every eight seconds, like you do on television,” Jeff Marsillo, the NBA’s vice president of Global Media Distribution, told Digital Trends last month.
Sure, it’s hard to imagine that VR could ever replace the TV entirely — there’s a social aspect to watching TV that VR lacks, and hasn’t quite figured out yet. But the immersive feeling afforded by VR can’t be discounted, and with so many younger viewers taking to small, personal screens rather than televisions, VR seems like a natural next step.
The challenges to broadly implementing VR for video are insignificant; doing it well, however, is an entirely different matter. TVs use advanced processing to create beautiful images with realistic color and natural motion, and that hardware can’t be crammed into a pair of VR goggles yet. TVs still have trouble displaying 24-frame-per-second (fps) film content smoothly: an anomaly called judder is introduced in the process of converting a 24 fps source to play on a display that refreshes 60 times per second, which makes the image appear to stutter along. This is already noticeable on a TV, and it’s even more easily discernible when watching a VR image displayed just in front of your eyes. This may not pose a problem for the masses, though; people have shown they will choose convenience over quality more often than not. Just look at MP3s.
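For the curious, here’s a toy sketch of the usual culprit, 3:2 pulldown, which fits 24 film frames into 60 screen refreshes by holding frames for alternating three and two refreshes. That uneven cadence is what we perceive as judder:

```python
from itertools import cycle

# 3:2 pulldown: each 24 fps film frame is held for 3, then 2, then 3...
# display refreshes, so one second of film fills one second at 60 Hz.
def pulldown(film_frames, cadence=(3, 2)):
    repeats = cycle(cadence)
    shown = []
    for frame in film_frames:
        shown.extend([frame] * next(repeats))
    return shown

one_second_of_film = list(range(24))   # frames 0..23
refreshes = pulldown(one_second_of_film)

print(len(refreshes))  # 60 -> exactly one second at 60 Hz
print(refreshes[:10])  # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3] -- unequal frame persistence
```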
If VR aims to whisk you away to another place, augmented reality (AR) intends to keep you right where you are, and enhance things with overlaid graphics and information. While VR might have you watching a movie in a simulated theater environment, AR would let you watch as though the video were playing on any surface you wanted.
Let’s say you were trying to install a video card in a PC and needed a little help – AR would allow you to watch a how-to video while looking right at the innards of your computer. In other words, the video plays on top of your reality rather than replacing it.
AR seems less likely a replacement for TV than it does an enhancement. One could use AR not just for watching video content but also for learning more about it. In the same way that the music app Shazam is able to recognize a song by analyzing a brief snippet of audio, AR apps could be developed to recognize a TV show or movie from just a quick clip and display information on the actors, directors, and even shooting locations. AR could also further enhance TV watching by allowing viewers to participate in live social media conversations about a season premiere episode, for example.
“We’re excited for the potential of mixed reality to enhance our lives in various ways, including the way in which we watch television,” said Craig Cincotta, director of communications, Mixed Reality at Microsoft. The company doesn’t have specific plans at present, but the promise these platforms hold is clear, he noted. “We’re enthusiastic about the limitless possibilities for HoloLens.”
While it’s doubtful AR would ever replace TV entirely, it certainly has plenty of potential to enhance the TV-watching experience by making it more interactive.
Holographic TV
Science fiction movies and television have positioned holograms as the ultimate in futuristic imaging for decades. Who can forget R2-D2’s iconic projection of Princess Leia in 1977’s Star Wars: Episode IV – A New Hope, or the scene when a giant hologram of the great white shark in the fictional Jaws 19 dives down on Marty McFly in Back to the Future Part II? Unfortunately, the state of the art in holography is a long way from such decades-old movie magic.
Perhaps the most widely used technology that best mimics holography is the use of mist projection, also known as water projection or fog projection. This approach uses a high-output conventional projector that reflects against a fine mist or fog instead of an opaque screen. You’ll see the technology at work in the Pirates of the Caribbean ride at Disneyland, and it was used in 2011 as a stunt to help launch Nike’s Jordan Melo M8 shoes.
Other projection technologies, like the one that famously brought Tupac Shakur back to life for the 2012 Coachella music festival, use carefully placed mirrors and Mylar screens to pull off their illusions.
But neither of these applications fit the technical definition of a hologram, nor are they well suited for watching television. A true hologram is created using a light field, not a lens, and is a 360-degree image, not a flat one.
The technology that is perhaps closest to creating a true hologram is called volumetric display. These displays are said to be autostereoscopic, which means they appear 3D to the naked eye – no glasses required. The most promising example of a volumetric display was recently revealed to the public via a Kickstarter campaign. The Holovect is described as a holographic vector display; it draws images in mid-air using lasers and a small box that disrupts the air in just such a way that the laser light is refracted, reflected, and diffused into a visible 3D image.
The BBC recently experimented with holography of a different sort; the network used an acrylic pyramid to create the illusion of holograms. Sure, it was rough, but it may be a preview of the future.
“Our experiment was fairly simplistic, but the new technologies on the horizon have the potential to completely change the way that audiences experience media content in the future,” wrote Cyrus Saihan, the BBC’s head of Digital Partnerships. “Imagine a world where instead of watching a film star being interviewed on the sofa of a TV chat show, they feel as if they are sitting right next to you on your own sofa in your living room, or where instead of looking at a 2D image of Mount Everest, it appears as if the snow on the mountain top is falling around you.”
These are all very exciting technologies, but could any of them replace conventional TV any time soon? That’s highly unlikely. In most cases, air disturbance is involved in the creation of these images, and that results in shimmering, shaking images. That may be fine for basic shapes and structures, amusement park rides, and engaging marketing applications, but it’s not good enough to replace TV – certainly not in the next 10 years.
The TV remains – and changes completely?
If we accept the notion that there’s no imminent threat to the conventional TV, then where will televisions go in the future? In some cases, we’ll see evolutions of existing technology, but in others, we’ll be looking at entirely new display types.
If the glasses we had to wear to get 3D were removed from the equation, could 3D stand a chance at gaining traction in the home? If the folks at Stream TV Networks have anything to say about it, the answer could be yes.
For the past six years, we’ve watched the company and a handful of others make gradual improvements to glasses-free 3D TV. Until just last year, Stream TV’s tech was dismissed by many as “gimmicky,” but at CES 2016 earlier this year, the company’s Ultra-D TV made its debut, and onlookers were impressed.
Ultra-D uses a combination of light fields, a parallax barrier, and software to create a pretty convincing glasses-free 3D picture. It can be added to any TV at the manufacturing level, and works at 4K resolution. The technology promises not only to decode 3D movies made for TVs that require glasses but also to process 2D images into 3D. Perhaps Ultra-D’s greatest trick is that the 3D effect holds up well as you move through different viewing locations – earlier forms of glasses-free 3D worked for only one user at a time.
“The fact is that people are spending money to watch 3D movies. In China, movies aren’t even released in 2D anymore — everything is 3D,” Stream TV Networks CEO Mathu Rajan told Digital Trends. “If you look at the top all-time movie releases at Box Office Mojo, the vast majority were either made in or converted to 3D. Until now, the barrier to successful 3D in the home has been having to wear glasses — which no one wants to do — and now with Ultra-D, they don’t have to.”
Glasses-free 3D eliminates the potential for nausea/dizziness, works for anyone who has vision correction (i.e. glasses or contacts), doesn’t significantly dim an image, and is fairly effective at delivering a realistic 3D effect. The technology has certainly jumped a lot of hurdles, but do viewers want this kind of experience all the time? The answer is likely years away, and with 8K on the way, it may be a moot discussion anyway.
8K TV
With 4K Ultra HD barely settled into our living rooms, talk of doubling the resolution yet again may sound premature, but 8K is already on its way.
The Japan Broadcasting Corporation, better known as NHK, has been working on 4K and 8K broadcasting in tandem. The company has already conducted test broadcasts, including the opening ceremony of the 2016 Olympics in Rio de Janeiro, Brazil, and says it will begin public broadcasting of both formats in 2018 so they can be popularized in time for the 2020 Tokyo Olympic Games.
Japan has always been a little ahead of the curve, but TV manufacturers are already gearing up for the next generation with 8K TV prototypes. Sharp has shown an 8K television at CES for four years running, and began selling a consumer 8K model with a whopping 7,680 x 4,320 pixel resolution for $133,000 in October of last year. Meanwhile, Samsung, LG, and many brands that are not as well-known in the U.S. market have shown off their own 8K prototypes.
I’ve spent hours carefully examining several 8K displays, and while it’s true that resolution so fine is best appreciated on larger displays of 85 inches and up, the effect it has on an image’s realism is substantial. Time and time again, I’ve overheard or spoken with onlookers who all made the same comment: It looks so real, it’s almost like 3D.
I’d wager it’s better than 3D. There’s a certain kind of immersion that takes hold when you’re overwhelmed by the sheer scale of a 98-inch display playing uncompressed 8K content. It’s almost as if you are in the picture.
With double the resolution of 4K in each direction comes four times the pixels, and that means vastly more data to move. Cable and over-the-air broadcasters are still struggling to deliver 4K as it is.
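A back-of-the-envelope calculation shows the scale of the problem. The numbers below assume uncompressed video at 10-bit color, three channels, and 60 frames per second; real broadcasts are heavily compressed, but the ratios hold:

```python
# Raw (uncompressed) video bandwidth in gigabits per second
def raw_gbps(width, height, bits=10, channels=3, fps=60):
    return width * height * bits * channels * fps / 1e9

print(f"1080p: {raw_gbps(1920, 1080):5.1f} Gbps")  # ~3.7 Gbps
print(f"4K:    {raw_gbps(3840, 2160):5.1f} Gbps")  # ~14.9 Gbps
print(f"8K:    {raw_gbps(7680, 4320):5.1f} Gbps")  # ~59.7 Gbps -- 4x the 4K burden
```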
Satellite companies like DirecTV and Dish Network are in a slightly better position, though. Both have satellites handling live 4K broadcasts now, though programming remains limited. But Dish Network tells Digital Trends the transition to 8K might not be as challenging as we’d think.
QLED TV
If 8K is the future of TV resolution, then what’s in store for the display tech itself? We’re pretty familiar with LCD and OLED now. What’s next, if anything? Well, if Samsung has anything to say about it – and as the world’s number one LCD TV manufacturer for 8 years running, we’d say it does – the next big thing in TV will be called QLED, and it might just knock OLED down a few notches.
The ”Q” in QLED is short for quantum dots, and while they’re already used in Samsung’s SUHD displays, QLED will use them in an entirely different way.
Quantum dots are tiny particles that glow when you shine light on them. Presently, quantum dots are used to make existing LED/LCD TVs work better by improving their backlight systems. You see, LEDs aren’t very good at producing white light, which an LCD TV’s color filters need to make accurate, bright colors. So a sheet of film layered with quantum dots is used to transform the yellowish-white LED backlight in a TV into a purer form of white light. Voila! The color filters can now parse out red, green, and blue more easily, resulting in a brighter, more accurate picture.
QLED TVs will ditch the entire color filtration layer. Rather than go through the inefficient process of carving red, green, and blue out of white light, QLED TVs will replace color filters with pixel-sized stacks of quantum dots that glow red, green, and blue when a blue LED shines on them. An LCD panel will still be in place to act as a shutter, blocking out light to create blacks. But the blue LED backlights at work will be much harder for the human eye to detect, so there will be fewer halos around bright objects on dark backgrounds, less light bleeding from the edges, and better uniformity across the screen. In addition, because so much energy will no longer be lost to color filters, these TVs will be able to get brighter than any other TV technology we’ve seen yet. We’re also told that once the manufacturing processes are in place, scaling up production will be relatively easy, and that means these new QLED TVs will get cheaper, faster.
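To see why dropping the filters matters so much, consider a toy model. The numbers here are purely illustrative, not real panel specs, but they capture the logic: a color filter passes only its own slice of the white backlight, roughly a third at best, and absorbs the rest, while a quantum dot converts most of the blue light it receives directly into the color you want:

```python
# Toy model of backlight efficiency -- all figures are illustrative assumptions
backlight_nits = 900         # hypothetical light output of the LED backlight

filter_transmission = 1 / 3  # a red/green/blue filter passes ~1/3 of white light
qd_conversion = 0.9          # assumed quantum-dot conversion efficiency

filtered_lcd = backlight_nits * filter_transmission
qled = backlight_nits * qd_conversion

print(f"Through color filters: ~{filtered_lcd:.0f} nits")  # ~300 nits
print(f"Quantum-dot emission:  ~{qled:.0f} nits")          # ~810 nits
```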
That sounds like big competition for OLED. Which brings us full circle: So long as there is a healthy spirit of competition among TV manufacturers, movie studios, content creators, and streaming services, the thing we still call TV, whatever name it ends up with, will only keep getting better.