
Why is Intel into GPUs now? It was about to get stomped


Computer geeks woke this morning to shocking news.

Raja Koduri, former head of AMD’s Radeon Technologies Group, has joined Intel as Chief Architect. He will lead his new employer’s efforts to build discrete graphics hardware for “a broad range of computing segments.”

This will feel like a stab in the heart for AMD’s fans. Raja was loved for his confident yet easy-going demeanor, and he’d become the unofficial face of the company’s underdog image. His resignation was bad enough; having him join Intel a day later was the worst possible outcome.

It’d be easy to overstate such drama, but in this case the fuss is warranted. Intel has never been competitive in graphics hardware, and this hire is the company’s strongest attempt yet.

Why now?

The timing of this move may seem strange, if only because it’s been ages since Intel was serious about graphics. Its last major push began around the debut of the modern Core processor line. For a time, Intel HD graphics seemed to make decent progress — at least enough to be usable. That didn’t last long. Today, Intel’s integrated graphics are well behind entry-level hardware from AMD and Nvidia.


Intel’s “latest” graphics, found in the eighth-generation processors, underscore this. Though it’s called Intel UHD 620, which sounds like an upgrade over the previous Intel HD 620, the hardware makes no strides over its predecessor. In 3DMark Fire Strike, a common benchmark, the UHD 620 is lucky to exceed a score of 900, while the Nvidia GTX 1050, by comparison, scores around 5,400. That’s a big gap, and if you’re saddled with a laptop that lacks graphics from AMD or Nvidia, you’ve already noticed it.

This isn’t just about laptops, though. Intel’s press release suggests the company wants to build graphics hardware for a variety of systems, and there are plenty of reasons why Intel might want to. Gamers have proven a loyal group with deep pockets, buying expensive graphics cards even while PC sales have slipped year after year. And then there’s the enterprise world, where Nvidia is currently cleaning house with high-end solutions powering data centers, self-driving cars, and research projects.

The real threat is Apple and Qualcomm

I’m sure Intel would love to see gamers, data centers, and universities buying its own high-end graphics at $500 a pop. That, however, is only part of the goal. Intel’s decision is driven more by an impending war over the heart and soul of computers.

Since the mid-90s, virtually all home PCs have been sold with Windows running on Intel processors. The term “Wintel” has fallen out of fashion, but that hasn’t changed the reality. Intel Inside has been synonymous with the PC for 25 years, particularly among laptops and 2-in-1s. Most people don’t even think about it — and end up with a “Wintel” machine by default.

That dominance is no longer guaranteed. Apple and Qualcomm have made major strides in computing performance over the past decade. Though Intel still has the technical edge, everyday PC use isn’t demanding enough to make it obvious. What is obvious, though, is how badly these competitors thrash Intel’s graphics. While iPads and smartphones can display rich 3D graphics, Intel’s hardware struggles to run new games at their lowest settings.

2018 will be the year this threat becomes real. Qualcomm and Microsoft have partnered to produce Windows 10 laptops that are compatible with all current Windows software, and the first products look set to appear at CES 2018, perhaps even earlier. Shoppers buying a laptop next year might leave with an inexpensive, LTE-capable 2-in-1 powered by Qualcomm. It’s not hard to imagine how a cheap, thin, long-lasting, always-connected computer could damage Intel.

Intel is aware of this. In fact, it has already threatened to sue Qualcomm and Microsoft over the issue, claiming the x86 emulation used to accomplish it infringes on Intel’s patents.

Apple, meanwhile, has the iPad Pro. While not as direct an alternative to Intel’s laptops, Apple clearly wants people to buy an iPad instead of a traditional PC, and its hardware is now powerful enough to make that a convincing choice. The iPad Pro is still missing a few pieces of the puzzle (the keyboard isn’t great, for example), but a few generations might iron out those issues. Think about it: If you could buy an iPad with a good keyboard for $600 to $800, why wouldn’t you? The iPad would be more versatile and portable than any Intel-powered 2-in-1.

And — of course — it will deliver eye-candy Intel HD can’t hope to match.

Good news, or bad news?

That’s why Intel needs better graphics hardware. Without excellent graphics, Intel can’t present a package as complete as its competitors. Companies like Dell and Samsung want a single chip that can do everything, and pairing Intel hardware with a separate graphics chip isn’t ideal. The Intel-AMD partnership, announced just a few days ago, is a band-aid over a wound that needs deeper attention.

Raja Koduri can mend that wound, but it will take time. You shouldn’t expect to see Intel-branded graphics cards on store shelves next year. I’d guess we won’t see real progress until the latter half of 2019, and that could easily slip into 2020, or even later.

The timeline is important, because Intel’s rivals move quickly. As mentioned, you’ll see Qualcomm-powered laptops in stores next year. By late 2019, Apple will have blitzed through two iPad hardware cycles, and Qualcomm may be ready to introduce its third generation of laptop hardware.

It’s impossible to say whether Intel’s new effort will be competitive, but one thing is certain: a new war over the PC’s future is underway. All the big names in tech will be involved, and the outcome will be visible in the next computer you buy.

Matthew S. Smith