
3 things Intel XeSS needs to nail to beat Nvidia DLSS

With features like ray tracing and increasingly complex visuals, graphics cards need a supersampling solution to maintain playable frame rates. Nvidia pioneered this concept with Deep Learning Super Sampling (DLSS), and Intel's upcoming graphics cards will use a similar feature called XeSS.

In short, XeSS uses machine learning to upscale an image from a lower internal resolution to a higher output resolution. DLSS does the same thing, but it requires a recent Nvidia RTX graphics card with dedicated Tensor cores. XeSS doesn't. Instead, Intel is making two versions available: one that leverages the dedicated XMX (Xe Matrix eXtensions) hardware inside its upcoming Arc Alchemist graphics cards, and another that runs on the widely supported DP4a instruction as a general-purpose solution for a broad range of hardware.
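Intel hasn't published the XeSS API yet, so here's a purely hypothetical sketch of how that dual-path design might look from a game's side: query for the accelerated hardware, and fall back to the general-purpose path otherwise. Every name in it is invented for illustration.

```cpp
#include <iostream>

// Hypothetical sketch of XeSS' dual-path design. Intel hasn't published
// the real API, so every name here is invented for illustration.
enum class UpscalerPath { XmxAccelerated, Dp4aGeneral };

// Stand-in for a real hardware query (in practice, a driver/SDK call).
bool HasXmxHardware() { return false; }  // pretend we're on a non-Arc GPU

UpscalerPath PickUpscalerPath() {
    // Arc Alchemist GPUs expose XMX matrix engines; everything else
    // falls back to the general-purpose DP4a integer-dot-product path.
    return HasXmxHardware() ? UpscalerPath::XmxAccelerated
                            : UpscalerPath::Dp4aGeneral;
}

int main() {
    std::cout << (PickUpscalerPath() == UpscalerPath::XmxAccelerated
                      ? "Using XMX path\n"
                      : "Using DP4a path\n");
}
```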

That alone gives XeSS a leg up over DLSS, but it's not enough. Here are three things XeSS needs to nail to take the supersampling crown from Nvidia and cement Intel's place in the graphics card market.

Quality and performance

XeSS really doesn’t matter if it can’t hit the performance and quality marks set by DLSS. Intel hasn’t demoed the feature running in any games, only in a 4K demonstration that provided vague performance hints. From that, we have an idea of what XeSS is capable of, but Intel still needs to show more.

Intel XeSS quality comparison.

Right now, Intel says XeSS can provide up to a 2x performance improvement over native 4K, and that it can upscale 1080p to an effective 4K with virtually no quality loss. That matches DLSS 2.0, but Nvidia has verifiable performance numbers and comparative screenshots in games you can play.
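Those claims are worth a quick sanity check. A 1080p internal render shades only a quarter as many pixels as native 4K, so the naive ceiling is a 4x speedup; the claimed 2x leaves room for the cost of running the upscaling network itself, plus the parts of a frame that don't scale with resolution. The back-of-envelope arithmetic:

```cpp
#include <iostream>

int main() {
    // 4K pushes four times as many pixels as 1080p...
    const long long pixels1080p = 1920LL * 1080;  // 2,073,600
    const long long pixels4k    = 3840LL * 2160;  // 8,294,400
    std::cout << "4K/1080p pixel ratio: "
              << pixels4k / pixels1080p << "x\n";  // prints 4x

    // ...yet the claimed speedup is ~2x, not 4x: the upscaling network
    // itself costs frame time, and not every part of a frame (game
    // logic, geometry) gets cheaper at a lower resolution.
    return 0;
}
```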

A claim of virtually no quality loss doesn't mean much on its own. Nvidia's first DLSS implementation, for example, was abhorrent in terms of quality, smudging out too much detail to justify any performance gains. Intel can't afford to release XeSS in a similar state, so it needs to nail the performance gains while maintaining as much quality as possible.

To be clear, a 2x performance improvement with virtually no quality loss is the minimum that Intel needs to achieve. That’s the bar DLSS set, and Intel has made it clear that XeSS is squarely targeting Nvidia’s tech. In an ideal world, Intel would push XeSS even further.

XeSS also needs multiple quality modes. DLSS comes with up to four (Quality, Balanced, Performance, and Ultra Performance) that let you tweak the balance between image quality and frame rate. These modes shrink the internal render resolution, essentially giving the upscaling algorithm less information to work with.
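Intel hasn't said what internal resolutions XeSS would use, but Nvidia's published per-axis scale factors for DLSS 2 make the idea concrete at a 4K output. A small sketch, using those DLSS numbers as a stand-in:

```cpp
#include <cmath>
#include <cstdio>

// Internal render resolution per DLSS 2 quality mode at a 4K output,
// using Nvidia's published per-axis scale factors. Intel hasn't announced
// XeSS equivalents, so treat this strictly as a DLSS reference point.
int main() {
    const int outW = 3840, outH = 2160;
    const struct { const char* name; double scale; } modes[] = {
        {"Quality",           2.0 / 3.0},  // ~67% per axis
        {"Balanced",          0.58},
        {"Performance",       0.50},
        {"Ultra Performance", 1.0 / 3.0},
    };
    for (const auto& m : modes)
        std::printf("%-17s -> %4ld x %4ld internal\n", m.name,
                    std::lround(outW * m.scale), std::lround(outH * m.scale));
    // Quality lands at 2560x1440 and Performance at 1920x1080, which is
    // why "Performance mode at 4K" and "upscaled 1080p" mean the same thing.
}
```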

Intel XeSS rendering pipeline demonstration.
The rendering pipeline for Intel XeSS.

Quality modes help DLSS work across a wide range of hardware. A flagship GPU might turn to the Quality mode for a performance boost, but the Performance mode is there for low-end options without a lot of power. Intel hasn’t said if XeSS will support multiple quality modes, but it needs to in order to compete with DLSS.

Easy implementation

One of the benefits of AMD’s FidelityFX Super Resolution (FSR) is how easy it is to integrate into games. Following its launch, the developer of Edge of Eternity said that it took only “a few hours” to add to the game, contrasting that with the lengthy process DLSS required.

AMD FidelityFX Super Resolution

Intel already seems to understand this point. Nvidia does too: following the launch of FSR, it made DLSS available to all developers, where previously developers had to apply and be approved before adding the feature to their games. Intel is launching XeSS with its software development kit (SDK) freely available from day one, which alone is a big deal.

The questions now are how long the A.I. model takes to produce quality results and what developers need to do to get XeSS up and running in their games. The easier XeSS is to integrate, the more games will support it. We’ve already seen that with FSR, which enjoyed rapid adoption following its launch.
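Intel hasn't released XeSS documentation, so the sketch below is an assumption based on what comparable temporal upscalers (DLSS 2 among them) ask of an engine: a low-resolution color buffer, depth, per-pixel motion vectors, and the camera's sub-pixel jitter every frame. All of the names are invented.

```cpp
// Hypothetical per-frame hookup for a temporal upscaler. The names below
// are invented; they mirror what comparable techniques (DLSS 2, for one)
// require from an engine, not Intel's actual, still-unreleased XeSS API.
struct UpscaleInputs {
    const void* colorBuffer;    // low-resolution rendered frame (HDR)
    const void* depthBuffer;    // per-pixel depth
    const void* motionVectors;  // per-pixel screen-space motion
    float jitterX, jitterY;     // this frame's sub-pixel camera jitter
};

// Stub standing in for the SDK's evaluate call, which would take real
// GPU resource handles and write the upscaled frame to outputBuffer.
void EvaluateUpscaler(const UpscaleInputs& /*in*/, void* /*outputBuffer*/) {}

// Typical placement: after lighting, before post-processing and UI, so
// the upscaler sees a clean frame and film grain/UI stay full resolution.
void RenderFrame(void* color, void* depth, void* motion, void* output,
                 float jx, float jy) {
    UpscaleInputs inputs{color, depth, motion, jx, jy};
    EvaluateUpscaler(inputs, output);
}

int main() { RenderFrame(nullptr, nullptr, nullptr, nullptr, 0.f, 0.f); }
```

The work of wiring up those inputs (motion vectors especially) is what made early DLSS integrations lengthy; how much of it XeSS can avoid will decide how fast adoption goes.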

Game support is what matters for the long-term success of XeSS, and game support comes through a simple, easy-to-integrate SDK. This is all the more important for XeSS because Intel is offering two SDKs. If a single SDK is hard enough to implement, good luck getting developers to add two.

As mentioned, XeSS comes in two forms, each of which requires its own SDK. Ideally, developers will be able to port an implementation from one SDK to the other, and Intel needs to make it easy to support both. If developers are forced to choose, that defeats XeSS’ main claim to fame: support for a wide range of hardware.
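In practice, engines tend to hide this kind of choice behind a thin abstraction layer. Here's an illustrative sketch; none of these class names come from Intel:

```cpp
#include <cstdio>
#include <memory>

// Sketch of the abstraction layer an engine might put in front of both
// XeSS code paths. None of these class names come from Intel; they're
// illustrative stand-ins for the two SDKs.
class IUpscaler {
public:
    virtual ~IUpscaler() = default;
    virtual void Evaluate() = 0;  // would take color/depth/motion inputs
};

class XessXmxUpscaler : public IUpscaler {   // Arc-only, XMX path
public:
    void Evaluate() override { std::puts("XMX-accelerated upscale"); }
};

class XessDp4aUpscaler : public IUpscaler {  // general-purpose path
public:
    void Evaluate() override { std::puts("DP4a general-purpose upscale"); }
};

// One integration point in the renderer; the backend choice is a detail.
std::unique_ptr<IUpscaler> MakeUpscaler(bool hasXmxHardware) {
    if (hasXmxHardware) return std::make_unique<XessXmxUpscaler>();
    return std::make_unique<XessDp4aUpscaler>();
}

int main() { MakeUpscaler(false)->Evaluate(); }
```

If both SDKs accept the same inputs, each class collapses into a thin wrapper, and supporting the second path costs developers very little. Whether Intel's SDKs actually line up that cleanly remains to be seen.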

Ray tracing

Godfall screenshot with simulated FSR effect
Gearbox

Nvidia usually bundles DLSS with ray tracing, and it’s easy to see why. Supersampling features like DLSS are absolutely great for low-end to midrange hardware that struggles to run new games at high resolutions and stable frame rates. High-end hardware doesn’t need the feature as much, especially since a lot of recent AAA games run great on recent flagship cards.

Enter ray tracing, the complex and realistic lighting calculation that wants nothing more than to bring your high-end GPU to its knees. Ray tracing alone is too demanding to run at native resolution, and supersampling alone doesn’t offer much to high-end cards that already hit their frame rate targets. To reach the most users, you need to bundle the two together.

With high-end cards, the choice becomes running the game at native resolution or turning on ray tracing and using supersampling to claw back the lost performance. Intel has confirmed that its upcoming graphics cards will support hardware-accelerated ray tracing, but that only makes a difference if it arrives alongside XeSS.

Ideally, Intel will go after titles that already support DLSS, as well as upcoming titles that plan to use ray tracing. Regardless, the two features should always arrive together. DLSS combined with ray tracing is greater than the sum of its parts, and that’s something AMD hasn’t caught on to with FSR. Intel can’t afford to make the same mistake.

The multibillion dollar underdog

LEDs forming a graphics card.

Intel is a massive company — it generates far more revenue than AMD and Nvidia. In the world of discrete graphics cards, however, it’s starting at zero. Even if Alchemist cards come out and perform better than their competition (preferably at a lower price), Intel has a long road ahead to establish itself against AMD and Nvidia. It will take several years, and that’s assuming everything goes according to plan.

Performance isn’t enough to enter a market that’s been dominated by two brands for decades. XeSS looks like a feature to separate Intel from the competition, offering supersampling that functions a lot like DLSS without requiring proprietary hardware. With resolutions pushing higher and visual glitter like ray tracing becoming more common, it’s the feature that will help Intel stand apart.

Existing isn’t enough, though. Wide adoption, consistently high quality, and smart feature pairing will make the difference for XeSS. And if Nvidia continues to rest on its laurels, Intel has a shot to establish its supersampling feature as the go-to option.

Jacob Roach
Senior Staff Writer, Computing