
Intel XeSS is already disappointing, but there’s still hope

Intel’s hotly anticipated Xe Supersampling (XeSS) tech is finally here, arriving a couple of weeks before Intel’s Arc Alchemist GPUs show up. It’s available now in Death Stranding and Shadow of the Tomb Raider, and more games are sure to come. But right now, it’s really difficult to recommend turning XeSS on.

Bugs, lackluster performance, and poor image quality have gotten XeSS off to a rough start. Although there are glimmers of hope (especially with Arc’s native use of XeSS), Intel has a lot of work ahead to get XeSS on the level of competing features from AMD and Nvidia.


Spotty performance

Norman Reedus crying in Death Stranding.
Image used with permission by copyright holder

Before getting into performance and image quality, it’s important to note that there are two upscaling models for XeSS. One is for Intel’s Arc Alchemist GPUs, while the other uses DP4a instructions on GPUs that support them. Both use AI, but the DP4a version can’t do the calculations nearly as fast as Arc’s dedicated XMX cores. Because of that, the DP4a version uses a simpler upscaling model. On Arc GPUs, performance shouldn’t be the only improvement; image quality should be better, too.
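For context on that fallback path: DP4a is a GPU instruction that computes the dot product of four packed 8-bit integers and adds the result to a 32-bit accumulator, which is what makes it useful for the low-precision math in neural-network inference. Here’s a minimal Python sketch of what the operation computes — the function name and lane extraction are illustrative, not Intel’s implementation:

```python
def dp4a(a: int, b: int, acc: int) -> int:
    """Emulate a DP4a operation: dot product of four signed 8-bit
    lanes packed into two 32-bit words, added to an accumulator.
    GPUs that support DP4a do this in a single instruction."""
    for i in range(4):
        # Extract lane i (8 bits) from each packed 32-bit word
        ai = (a >> (8 * i)) & 0xFF
        bi = (b >> (8 * i)) & 0xFF
        # Reinterpret each lane as a signed 8-bit value
        ai = ai - 256 if ai >= 128 else ai
        bi = bi - 256 if bi >= 128 else bi
        acc += ai * bi
    return acc

# Packing [1, 2, 3, 4] and [5, 6, 7, 8] into 32-bit words gives
# 1*5 + 2*6 + 3*7 + 4*8 = 70
print(dp4a(0x04030201, 0x08070605, 0))
```

Arc’s XMX cores can run many such low-precision multiply-accumulates per cycle, which is why the Arc path can afford a heavier upscaling model than the DP4a fallback.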


We don’t have Arc GPUs yet, so I tested the DP4a version. To avoid any confusion, I’ll refer to it as “XeSS Lite” for the remainder of this article.

That’s the most fitting name because XeSS Lite isn’t the best showcase of Intel’s supersampling tech. Death Stranding provided the most consistent experience, and it’s the best point of comparison because it includes Nvidia’s Deep Learning Super Sampling (DLSS) and AMD’s FidelityFX Super Resolution 2.0 (FSR 2.0).

XeSS performance results for the RTX 3060 Ti in Death Stranding.

With the RTX 3060 Ti and a Ryzen 9 7950X, XeSS trailed in both its Quality and Performance modes. DLSS is the performance leader, but FSR 2.0 isn’t far behind (about 6% lower in Performance mode). XeSS in its Performance mode is a full 18% behind DLSS. XeSS still provides nearly a 40% boost over native resolution, but DLSS and FSR 2.0 are significantly ahead (71% and 61%, respectively).

The situation is worse with AMD’s RX 6600 XT. It seems XeSS Lite heavily favors Nvidia’s GPUs at the moment, as XeSS only provided a 24% boost in its Performance mode. That may sound decent, but consider that FSR 2.0 provides a 66% jump. In Quality mode, XeSS provided basically no benefit, with only a 3% increase.
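For reference, the uplift percentages quoted above compare the upscaled average frame rate against native rendering. A quick sketch of the arithmetic, using made-up frame rates for illustration (these are not the measured values from our benchmarks):

```python
def percent_uplift(upscaled_fps: float, native_fps: float) -> float:
    """Percentage frame-rate gain of an upscaler over native rendering."""
    return (upscaled_fps / native_fps - 1.0) * 100.0

# Hypothetical example: 60 fps at native resolution vs. 83 fps upscaled
# works out to roughly a 38% boost.
print(round(percent_uplift(83, 60)))
```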

XeSS results for the RX 6600 XT in Death Stranding.

Shadow of the Tomb Raider also shows the disparity between recent Nvidia and AMD GPUs, but I’ll let the charts do the talking on that front. There’s a much bigger story with Shadow of the Tomb Raider. Across both the RX 6600 XT and RTX 3060 Ti, XeSS would consistently break the game.

I was able to finally get the Performance mode to work by setting the game to exclusive fullscreen and turning on XeSS in the launcher (thank goodness this game has a launcher). If I turned on XeSS in the main menu, the game would slow to a slideshow. And in the case of the Quality mode, I couldn’t get a consistent run even with the launcher workaround.

A new update for Shadow of the Tomb Raider reportedly fixes the bug, but we haven’t had a chance to retest yet. For now, make sure to update to the latest version of the game if you want to use XeSS.

I tried out Shadow of the Tomb Raider on my personal rig with an RTX 3080 12GB, and it worked great without the launcher workaround. This is the case for many GPUs, and the update should fix the startup crashes that were occurring for others.

Poor image quality

The performance of XeSS isn’t great right now, but the more disappointing aspect is image quality. It lags a bit behind FSR 2.0 and DLSS in Quality mode, but bumping down to Performance mode shows just how far behind XeSS is in this regard.

Shadow of the Tomb Raider is the best example of that. DLSS looks a bit better, using sharpening to pull out some extra detail on the distant skull below. XeSS, meanwhile, falls apart. In Performance mode, it looks like you’re simply running the game at a lower resolution.

Image quality in Shadow of the Tomb Raider.

This is zoomed in quite a bit, so the difference isn’t nearly as stark when zoomed out. And the Quality mode holds up decently. It still suffers from the low-res look when zoomed in so much, but the differences are much harder to spot in Quality mode when you’re actually playing the game.

XeSS image quality in Shadow of the Tomb Raider

Death Stranding tells a different story — and largely because it includes FSR 2.0. In Quality mode, FSR 2.0 and native resolution are close, aided a lot by FSR 2.0’s aggressive sharpening. DLSS isn’t quite as sharp, but it still manages to maintain most of the detail on protagonist Sam Porter Bridges. XeSS is a step behind, though it’s not as stark as Shadow of the Tomb Raider. It manages to reproduce details, but they’re not as well-defined. See the hood and shoulder on Bridges and the rock behind him.

XeSS Quality comparison in Death Stranding.

Performance mode is where things get interesting. Once again, FSR 2.0 is ahead in a still image thanks to its aggressive sharpening, but XeSS and DLSS are almost identical. Performance still lags, and Intel needs to work with developers on better XeSS implementations. But this showcases that XeSS can be competitive. One day, at least.

XeSS Performance comparison in Death Stranding.

Just like with performance, it’s important to keep in mind that this isn’t the full XeSS experience. Without Arc GPUs to test yet, it’s hard to say if XeSS’ image quality will improve once it’s running on the GPUs it was intended to run on. For now, XeSS Lite is behind on image quality, though Death Stranding is proof that it could catch up.

Will XeSS hold up on Arc?

Intel Arc A750M Limited Edition graphics card sits on a desk.

XeSS is built first and foremost for Intel Arc Alchemist, so although these comparisons are useful now, it’s all going to come down to how XeSS can perform once Arc GPUs are here. XeSS Lite still needs some work, especially in Shadow of the Tomb Raider, but Death Stranding is a promising sign that the tech can get there eventually.

Even then, it’s clear that XeSS isn’t the end-all-be-all supersampling tech it was billed as. It’s possible that by trying to do both machine learning and general-purpose supersampling, XeSS will lose on both fronts. For now, though, we just have to wait until we can test Intel’s GPUs with XeSS.

Jacob Roach
Lead Reporter, PC Hardware
Jacob Roach is the lead reporter for PC hardware at Digital Trends. In addition to covering the latest PC components, from…