
Kids born today won’t know what a pixel is, and that’s a dream come true

The pixel is a cultural icon. Wander down any street and you’re likely to see it not only in old LCD screens lining storefronts but also in logos, advertisements, and even fashion. “Pixel art,” the intentional use of lo-fi, pixelated graphics, is virtually the default among indie games, and even Digital Trends’ own 404 page features 8-bit characters on a pixelated skyscape.

Users have become accustomed to the pixel, and most have forgotten it’s an artifact of limited graphics technology. The pixel was never desired; it exists only because of the particular limitations of computer displays and graphics renderers. Over the past three decades, countless minds have tried to tame unsightly pixelation, also known as aliasing, in numerous ways.


The fight has been long, but the forces of progress are winning. Pixels are dying. It’s entirely possible that a baby born today will never see one, and a child born a decade from now will probably never know of their existence unless he or she decides to learn computer science. Let’s take a moment to reflect on the pixel before it’s laid to rest in the graveyard of obsolescence.

The jagged edge

Pixels have existed since the beginning of computer graphics, and for many early computer scientists, they represented a serious problem. While their existence didn’t necessarily hobble interface design for early mainframes and the first home PCs, they presented a major obstacle for anyone seeking to push the limits of realistic computer graphics.

Lucasfilm was an early pioneer in the field. Its computer division, which was eventually sold to Steve Jobs and renamed Pixar, searched desperately for ways to render graphics detailed enough to be used alongside miniatures in Star Wars.

What’s in one pixel could be a city.

Robert Cook, Pixar’s former Vice President of Software Development, was there from the beginning, and remembers the challenge well. “The basic problem,” he explained, “is you’re trying to make an image with a lot of detail, and you only have so many pixels.”

This inevitably forces computers to make a difficult decision. Multiple objects might inhabit the space of a single pixel, yet only one can be shown – so which one should it be? “What’s in that one pixel could be a city,” said Cook, “but the computer has to pick one color.” Early computers, with limited pixels and no way to combat aliasing, were forced to dramatically simplify images. The result was coarse, jagged graphics that looked nothing like reality.

Star Wars: A New Hope
Foiled by the pixelation of computer-generated graphics, Star Wars’ producers turned to real-life miniatures instead, like this recreation of the Death Star. Starwars.com

Those “jaggies” were particularly nasty in objects oriented at a diagonal to the pixel grid, and they precluded the use of computer graphics for most special effects until the problem was solved.

That proved a long, difficult road. Computer graphics never contributed significantly to the original Star Wars trilogy, which relied on a complicated dance of miniatures right up through Return of the Jedi‘s epic final battle. Lucasfilm, refocusing on its core entertainment business and unhappy with the results of the Computer Division, sold the division to Steve Jobs in 1986. Jobs renamed the company Pixar after its star product, a $135,000 anvil of processing power called the Pixar Image Computer.

A new hope

While the Pixar Image Computer was technically stunning, it wasn’t a commercial success, and it didn’t represent the company’s passion. Many of its employees wanted to use computer graphics to create entertainment, even art. That included former Disney animator John Lasseter, who was hired by the Lucasfilm Computer Division to breathe life into its technically impressive graphics.

PC users expect razor-sharp image quality and despise softness, even if aliasing is the result.

Everyone knew, though, that even an animator of Lasseter’s skill couldn’t produce a compelling scene from computer graphics if jaggies remained an issue. Pixels don’t appear natural, they obscure the detail of a scene, and in motion objects snap cleanly from one pixel to the next, removing the motion blur that makes film seem realistic.

The geeks at Lucasfilm tried to tackle the problem in a number of ways. Eventually a hardware engineer at the company, Rodney Stock, came up with an idea, which Rob Cook refined into a fix for aliasing. Randomness.

“The jaggies come from the samples all being lined up on a grid,” Cook explained. “If you add some randomness, you break up the patterns.” Adding randomness to aliased portions of an image introduces uneven noise that, unlike patterns of perfectly stepped pixels, doesn’t seem unusual to the human eye.
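
Cook’s point is easier to see in a toy renderer. The sketch below is a minimal illustration of the idea in Python, not Pixar’s actual algorithm: it samples an invented scene containing one hard diagonal edge, first with a single sample at each pixel’s center (which produces stairstepped jaggies), then by averaging several randomly jittered samples per pixel (which dissolves the staircase into less objectionable noise).

```python
import random

def scene(x, y):
    """A made-up 'scene': bright above the line y = 0.37 * x, dark below it."""
    return 1.0 if y > 0.37 * x else 0.0

WIDTH, HEIGHT = 32, 12
SAMPLES = 16  # jittered samples per pixel

def render(jitter):
    rows = []
    for py in range(HEIGHT):
        row = ""
        for px in range(WIDTH):
            if jitter:
                # Average several randomly placed samples inside the pixel:
                # the edge becomes noise instead of a hard staircase.
                value = sum(
                    scene(px + random.random(), py + random.random())
                    for _ in range(SAMPLES)
                ) / SAMPLES
            else:
                # One sample at the pixel center: the computer "picks one color,"
                # and the diagonal edge turns into jaggies.
                value = scene(px + 0.5, py + 0.5)
            row += " .:-=+*#@"[int(value * 8)]
        rows.append(row)
    return "\n".join(rows)

print("Regular grid sampling (jaggies):")
print(render(jitter=False))
print("\nJittered sampling (noise instead of stairsteps):")
print(render(jitter=True))
```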

Rob Cook (2010, Photo by Deborah Coleman/Pixar)

Randomness did more than serve as effective anti-aliasing. It also helped blend computer effects with film and created a blur effect when applied to multiple frames of motion, addressing numerous problems with one tidy solution. There were alternative anti-aliasing techniques for 3D film, but they proved more computationally intense without producing superior results, leaving random sampling to reign as king.

Bringing it home

Solving the problem of pixels on the average home PC is not the same as solving it in film, however. Computer-generated movies are expected to replicate the nature of film, including its imperfections. A little noise or blur is not just acceptable, but desirable.

The Windows desktop, and computer interfaces in general, are a different animal. Users expect razor-sharp image quality, despise softness, and frown upon noise. The ideal font is pixel-perfect, high-contrast, fine yet readable, standing out boldly from its surroundings. Even computer gamers expect a very specific experience and often look down on motion blur as an artifact or distracting visual add-on rather than a desirable effect. Games that pay homage to film, such as the original Mass Effect (which implemented a “film grain” filter), catch flak from players who prefer a sharper experience, even at the cost of aliasing. Pixels are preferable to noise.

Mass Effect

Given the choice, though, users prefer to have the best of both worlds: razor-sharp image quality and smooth edges. A number of techniques have been developed to deliver this, with varying success. Windows uses ClearType, a sub-pixel anti-aliasing technology designed specifically for fonts. Apple uses font smoothing along with tight guidelines for developers’ art assets, particularly on its Retina displays. And games use numerous tactics, from multi-sample anti-aliasing, which only smooths the edges of polygons, to temporal anti-aliasing, which smooths all aliased edges by drawing on data from multiple frames.
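
To make the sub-pixel idea concrete: an LCD pixel is physically three vertical stripes of red, green, and blue, so a font renderer can sample glyph coverage at three times the horizontal pixel resolution and pack each triple of samples into one pixel’s color channels. The sketch below is a deliberately crude illustration of that general idea using a hypothetical glyph function; it is not Microsoft’s actual ClearType, which adds filtering to suppress the color fringing this naive version would produce.

```python
# Crude sub-pixel rendering sketch (assumed RGB stripe layout, hypothetical glyph).

def coverage(x, y):
    """Hypothetical glyph: 1.0 inside a slanted stroke, 0.0 outside."""
    return 1.0 if 0.0 <= x - 1.8 * y <= 3.0 else 0.0

WIDTH, HEIGHT = 10, 6

def render_subpixel():
    image = []  # rows of (r, g, b) tuples, each channel in [0, 1]
    for py in range(HEIGHT):
        row = []
        for px in range(WIDTH):
            # Three horizontal samples per pixel, one per color stripe:
            # the stroke's edge lands on individual stripes, not whole pixels.
            r = coverage(px + 1 / 6, py + 0.5)
            g = coverage(px + 3 / 6, py + 0.5)
            b = coverage(px + 5 / 6, py + 0.5)
            row.append((r, g, b))
        image.append(row)
    return image

for row in render_subpixel():
    print(" ".join(f"{r:.0f}{g:.0f}{b:.0f}" for r, g, b in row))
```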

These efforts have gradually eroded the pixel’s prominence, making unsightly, jagged edges less common, but they’re not a complete solution. Aliasing is a tough problem to solve, particularly when compute power is limited, as it so often is with home PCs. And there’s always a trade-off between sharpness and smoothness. Turning Apple’s text smoothing up a few notches in the command line can make aliasing very difficult to detect, but it also results in soft, fuzzy fonts that aren’t at all like the crisply printed text in a book.

The visual limit

Anti-aliasing was not the only solution to jaggies considered in the early days of computer graphics. Researchers also looked into rendering images at resolutions as high as 8,000 pixels on a side, which made individual pixels too small for the human eye to detect. Lucasfilm itself commissioned several high-resolution renders of its X-Wing fighter from the graphics group of a company called Information International, Inc. One of those renders ended up on the cover of Computer magazine.

Yet this technique was soon abandoned, for a number of reasons. It was insanely computationally intense, which meant a single frame effectively cost thousands of dollars, and increasing the resolution did nothing to solve the motion blur issue that plagued computer graphics. Though effectively free of visible pixels, the renders didn’t look real, and for Lucasfilm, deep in the production of Star Wars, that was an unforgivable sin.

Upscaling low-resolution content is an issue that’ll persist for years.

The failure of early high-resolution renders obscured the usefulness of high pixel counts for decades, but the past five years have brought resolution back to the spotlight. Apple’s first iPhone with Retina sparked the trend, and it’s quickly spread to other devices – for good reason.

Tom Peterson, Director of Technical Marketing at Nvidia, told us that packing in extra pixels really does render them invisible. “As the pixel density gets really high, it reaches the threshold of what the human eye can observe. We call that the visual limit.” A display that exceeds the visual limit looks less like a display and more like a printed page, albeit one that glows.

What is the visual limit? It’s best described in terms of pixels per degree of vision, a metric that changes based on a display’s resolution, its physical size, and the observer’s distance. The golden number is 50 PPD, a figure that many modern smartphones easily exceed. 4K monitors don’t quite meet that goal, but 5K displays like the iMac with Retina and Dell UP2715K do, and a 65-inch 4K television can also hit the magic number if viewed from six feet away.
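
The arithmetic behind that figure is straightforward. The sketch below estimates pixels per degree from a display’s diagonal, native resolution, and viewing distance, under the usual assumption that one degree of visual angle spans roughly 2 · d · tan(0.5°) of screen at distance d; the example displays and distances are illustrative, not measurements.

```python
import math

def pixels_per_degree(diagonal_in, horiz_px, vert_px, distance_in):
    """Estimate pixels per degree of visual angle for a flat display.

    diagonal_in       -- screen diagonal in inches
    horiz_px, vert_px -- native resolution
    distance_in       -- viewing distance in inches
    """
    # Pixel density (pixels per inch) along the diagonal.
    ppi = math.hypot(horiz_px, vert_px) / diagonal_in
    # One degree of visual angle covers about 2 * d * tan(0.5 deg) of the screen.
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

# Illustrative, approximate figures (assumed sizes and viewing distances):
print(pixels_per_degree(65, 3840, 2160, 72))   # 65" 4K TV from six feet: ~85 PPD
print(pixels_per_degree(27, 5120, 2880, 24))   # 27" 5K monitor at arm's length: ~91 PPD
print(pixels_per_degree(5.5, 1920, 1080, 12))  # 5.5" 1080p phone at 12": ~84 PPD
```

All three comfortably clear the 50 PPD threshold the article cites; shrink the viewing distance or the pixel count and the number falls back below it.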

This is not to say that reaching the visual limit immediately eliminates aliasing. Upscaling low-resolution content is an issue that’s likely to persist until pixel-dense displays become the norm. Windows is currently struggling to curtail the issue because it must maintain compatibility with numerous applications, some of which are over a decade old and no longer actively supported by their developers.

“Applications have some work to do using rendered fonts,” Tom Peterson explained, “because a lot of them are using bitmapped fonts.” Those scale poorly, as they are essentially “pixelated text images.” Ideally, fonts should be vector-based: a collection of lines and angles that can scale easily. This is why the text in Windows 8.1’s interface looks brilliant at any resolution, while the text in desktop applications often appears soft and blurred.
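
The difference is easy to demonstrate with a toy glyph. The sketch below uses made-up data rather than any real font format: a 4×4 bitmapped “slash” upscaled by repeating its pixels turns into chunky blocks, while the same stroke described geometrically can be re-rasterized at whatever size is needed and stays one pixel wide.

```python
# Toy contrast between scaling a bitmapped glyph and re-rasterizing a vector one.
# Hypothetical data, not a real font format.

BITMAP_SLASH = [   # a 4x4 "glyph" of a forward slash, stored as pixels
    "...#",
    "..#.",
    ".#..",
    "#...",
]

def upscale_bitmap(glyph, factor):
    """Nearest-neighbor upscale: every source pixel becomes a factor x factor block."""
    out = []
    for row in glyph:
        wide = "".join(ch * factor for ch in row)
        out.extend([wide] * factor)
    return out

def rasterize_vector_slash(size):
    """Re-rasterize the same stroke from its geometric description (a diagonal line)."""
    out = []
    for y in range(size):
        x_on = size - 1 - y  # the line x + y = size - 1
        out.append("".join("#" if x == x_on else "." for x in range(size)))
    return out

print("\n".join(upscale_bitmap(BITMAP_SLASH, 3)))   # thick, blocky stairsteps
print()
print("\n".join(rasterize_vector_slash(12)))        # a one-pixel-wide diagonal at 12x12
```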

Still, this is a solvable problem, and one that developers will be pressured to fix as pixel densities continue to surge. Users who spend hard-earned money on an upgraded display will want to see the benefits, and are sure to avoid software that refuses to modernize.

We’re here today to mourn the pixel

Many new devices have already exceeded the visual limit, and while computers have lagged behind mobile devices, they’re beginning to catch up. Any 15-inch laptop with a 4K display easily exceeds the limits of the human eye, and any 27-inch monitor with 5K resolution does the same. These panels will only decrease in price over time, just as 1080p panels did; within a few years they’ll be in everyday notebooks and monitors anyone can afford.

That will be the final nail in the pixel’s coffin. With televisions already heading to 4K (and beyond), there will no longer be any device on store shelves without the density needed to render pixels invisible at a typical viewing distance. Resolution itself will start to lose its meaning; users will simply be concerned with whether a display does, or doesn’t, appear as sharp as a photograph. Pixels will fade out of popular knowledge, and further advancements in sharpness will exist only for marketing. Programmers, artists, and others who deal with digital images will continue to acknowledge pixels, but most users, even enthusiasts and videophiles, will have little reason to care.

Just as today’s youth can’t remember a world without the Internet, children born five years from now won’t remember a world in which the pixel exists. To them, displays will have always appeared as crisp as a window, and pixel art will be nostalgia for an era only their parents remember.
Their world will not be so different from our own. But it’ll look a hell of a lot smoother.

Matthew S. Smith