
AMD Radeon R9 Fury X review

AMD's R9 Fury X graphics card swings hard, can't land a knockout punch

AMD Radeon Fury X
Bill Roberson/Digital Trends
AMD Radeon R9 Fury X
MSRP $649.00
“The Fury X is the most impressive video card we’ve seen in years, but it doesn’t deliver the knock-out blow AMD needs.”
Pros
  • Compact, attractive
  • Stock liquid cooling
  • Competitive performance-per-watt
Cons
  • Liquid cooler’s radiator is unwieldy
  • Doesn’t beat the GTX 980 Ti

The year was 2007, and AMD’s back was against the wall. Its Athlon architecture, which challenged Intel at the turn of the century, was aging. It needed a savior, so it went big, designing an all-new processor called Phenom.


It was an impressive piece of engineering – but it failed to keep up with Intel’s latest, and early models contained a rare but nasty bug. Phenom’s failure to prop up the company’s fortunes was a turning point, and AMD processors haven’t been able to go toe-to-toe with Intel’s best ever since.


Now, AMD finds itself at another turning point. The company’s cadence of new graphics architectures has fallen behind its chief competitor, Nvidia. AMD’s answer is yet another dramatic, all-new design: the Radeon R9 Fury X, the first video card to use High Bandwidth Memory.

Will it close the widening gap? Or is HBM, like the Phenom, more brilliant in theory than in practice? The future of AMD may ride on this card. So, you know – no pressure!

It’s all about HBM

The Fury line’s spotlight on memory is unusual, as it’s rarely the focus of a new video card release. Aside from the amount of memory (in this case, 4GB), and the width of the memory interface used, there’s typically not much comment on RAM. That’s not to say it’s unimportant – but it is typically a known quantity.

Here, AMD has flipped the script. The GPU architecture, known as Fiji, is technically new, but it’s really a variation of the Hawaii chip in AMD’s Radeon R9 290X, which itself is a variation of its predecessors. Fiji’s 4,096 Stream Processors are arranged into 64 Compute Units and four primary Shader Engines. The Compute Unit count per Shader Engine has risen to 16, the most AMD has fielded yet, and the overall Stream Processor count is far ahead of anything the company has produced before. The result is a quoted compute performance of 8.6 teraflops, well ahead of anything else on the market – the Nvidia Titan X quotes a peak of 6.14 teraflops.
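For the curious, those teraflop figures fall out of simple arithmetic: shader count, times two operations per clock (one fused multiply-add), times clock speed. Here’s a minimal sketch, assuming the commonly cited engine clocks of roughly 1,050MHz for the Fury X and 1,000MHz for the Titan X, and the Titan X’s 3,072 CUDA cores – none of which are figures measured in this review.

```python
# Back-of-the-envelope peak FP32 throughput.
# Assumes 2 floating-point operations per shader per clock (a fused multiply-add)
# and commonly cited reference clocks; these are assumptions, not measured values.

def peak_tflops(shaders: int, clock_mhz: float) -> float:
    """Shader count x 2 ops per clock x clock speed, expressed in teraflops."""
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(f"Radeon R9 Fury X (4,096 SPs @ ~1,050MHz):  {peak_tflops(4096, 1050):.2f} TFLOPS")  # ~8.60
print(f"GeForce Titan X (3,072 cores @ ~1,000MHz): {peak_tflops(3072, 1000):.2f} TFLOPS")  # ~6.14
```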

But the basics are the same as any other current Radeon. The Fury X’s chip still fits in the Graphics Core Next family, and is still built on the tried-and-true 28 nanometer production process.

AMD Radeon Fury X
Bill Roberson/Digital Trends

The real news, then, is the memory. We’ve already written on the matter extensively, as AMD leaked out details of High Bandwidth Memory well in advance, but here’s the summary of what you need to know. HBM stacks memory chips vertically and places them very close to the GPU itself. This is a more efficient use of both space and power. AMD says that, relative to a GDDR5 card, memory bandwidth per watt increases up to three and a half times, and overall bandwidth per chip improves almost four-fold.

Certainly, the on-paper results are impressive. The Fury X quotes overall memory bandwidth of 512 gigabytes per second, again well ahead of a GeForce GTX Titan X, which quotes 336 gigabytes per second (the GTX 980 Ti’s bandwidth is also 336GB/s). The real question is whether that bandwidth improvement – and the drastic increase in Stream Processor count – will be enough to put the card in contention with Nvidia’s latest.
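Those bandwidth numbers aren’t magic, either. Theoretical bandwidth is just bus width multiplied by the data rate of each pin. Here’s a quick sketch, assuming the first-generation HBM layout AMD has described (a 4,096-bit interface at 1Gbps per pin) and the 384-bit, 7Gbps GDDR5 configuration Nvidia uses on the Titan X and GTX 980 Ti – the bus widths and per-pin rates are our assumptions, not figures from this review.

```python
# Theoretical memory bandwidth = bus width (bits) x data rate per pin (Gbps) / 8 bits per byte.
# Bus widths and per-pin rates below are assumed typical configurations, not measured values.

def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8  # gigabytes per second

print(f"Fury X, HBM (4,096-bit @ 1Gbps):     {bandwidth_gb_s(4096, 1.0):.0f} GB/s")  # 512
print(f"GTX 980 Ti, GDDR5 (384-bit @ 7Gbps): {bandwidth_gb_s(384, 7.0):.0f} GB/s")   # 336
```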

The Fury X card itself, and pricing

While the Fury X seems to compete with the GTX Titan X, based on its specification sheet, on price it actually lines up with the GTX 980 Ti. Nvidia’s second-quickest video card carries an MSRP of $649, and the Fury X mimics it exactly. This makes comparisons between the two rather simple. There’s no need to handicap either because of pricing.

At a glance, that works to AMD’s advantage. For the first time in years the company’s build quality and design exceed the green team’s – and the gap isn’t small. The GTX 980 Ti is a gorgeous card, but it’s also large and air-cooled. You need a lot of horizontal space and a case with solid airflow.

The Fury X comes in at just under eight inches long – about 2.5 inches shorter than not only the Nvidia GTX 980 Ti, but also the GTX 980, 970 and 960 (the standard PCB length is the same for all three). AMD also ships its card with a liquid cooler. The GTX 980 Ti requires one 8-pin and one 6-pin power connector, while the Fury X needs two 8-pin connections.

It’s not all sunshine and roses, though, because the card’s smaller size and larger cooler cancel each other out. Yes, the Fury X itself is smaller than the GTX 980 Ti, but when the cooler’s mass is included, it’s a bit larger. It also demands a 120mm fan mount dedicated to it, and builders must find a way to route the flexible tubes that carry liquid to and from the radiator. Taken as a whole, the Fury X is arguably more difficult to install in a small case than the GTX 980 Ti – though the particulars will depend on the enclosure you use.

Performance

Clearly, the Fury X is quite a bit different from anything AMD has tried in recent years. Its new form of memory results in a card that’s smaller, and more refined. Yet design is only part of the equation. Nvidia has gained the upper hand in recent years because of performance and efficiency, so let’s see if the Fury can turn the tide.

Here are the cards we’re using for comparison. Not every card is used in every comparison. The focus is on how the Fury X stacks up against the GTX 980 Ti, so it’s in every match-up.

All of the testing was performed in our Falcon Northwest Talon test rig. It’s equipped with a Core i7-4770K processor (not over-clocked), 16GB of RAM and two 240GB solid state drives. We used the latest drivers for testing in all cases.

Let’s get to it.

3DMark

I’ll start with 3DMark, our only synthetic test. While much can be argued about how accurate any synthetic test can be, I’ve found this benchmark to be a wonderfully accurate indicator of general performance. So, what does the demanding Fire Strike loop have to report?

There you have it. Sorry if it’s anti-climactic, but here, with one result, we can more or less gauge if the Fury X is up to beating the GTX 980 Ti. And it isn’t. Instead, it’s basically a tie.

I say that because the particular GTX 980 Ti we have available for this review is slightly overclocked. The base clock is 102MHz quicker, and the boost clock 114MHz quicker, than a standard GTX 980 Ti. That’s not enough to create a major gap, and accordingly the card retails for just $20 more than a standard 980 Ti. But it might be enough to create the tiny, five-percent difference between it and the Fury X.
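To put that factory overclock in perspective, here’s a rough sketch of how large the bump is relative to Nvidia’s reference clocks. The 1,000MHz base and 1,075MHz boost figures are our assumption about a stock GTX 980 Ti, not numbers from this review.

```python
# How big is the factory overclock on our GTX 980 Ti sample?
# Reference clocks (1,000MHz base / 1,075MHz boost) are assumed stock values.

reference_mhz = {"base": 1000, "boost": 1075}
overclock_mhz = {"base": 102, "boost": 114}  # how much quicker our sample runs

for kind, ref in reference_mhz.items():
    pct = 100 * overclock_mhz[kind] / ref
    print(f"{kind} clock: {ref + overclock_mhz[kind]}MHz (+{pct:.0f}% over reference)")
```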

This result is simultaneously impressive and disappointing. It’s nice to see a compact AMD card going head-to-head with Nvidia and holding its own, but after a couple years of disappointing releases, the red team needed to slug one out of the park.

Battlefield 4

DICE’s well-known and often controversial first-person shooter is no longer among the most demanding games available, but it’s still not easy to play at high resolution with all the details turned on. Let’s start with the game’s performance at 1080p and the Ultra detail preset. Note this is DirectX performance – Mantle unfortunately throws our framerate recording method for a loop, and so cannot be used here.

As you can see, the Fury X comes up a fair bit behind in this scenario. It runs at a whopping 30 percent deficit to the GTX 980 Ti. On the other hand, though it loses, the card still easily produced a framerate almost twice the preferred minimum of 60 frames per second. AMD has repeatedly stated the Fury X should prove most competitive at 4K, so let’s see if that’s true.

AMD wasn’t talking nonsense. At 4K the Fury X makes a huge leap forward. It’s still behind the GTX 980 Ti, but only by a handful of frames per second. We should also note the two cards delivered extremely similar minimum framerates – the GTX 980 Ti hit a minimum of 39, and the Fury X bottomed out at 35. While neither card managed the ideal 60 FPS, gameplay was enjoyable on both.

Shadows of Mordor

This award-winning game is a great technical benchmark because it’s among a new wave of cross-platform games designed with consoles in mind that are still capable of putting PCs to the test when all the settings are kicked up to maximum. Again, we’ll start at 1080p.

In this comparison, we see a less extreme repetition of Battlefield 4. The Fury X is behind the GTX 980 Ti, but it’s not as far behind as before. All three of these high-end cards have no problem delivering an experience that well exceeds a 60 FPS average, even with the notoriously difficult Ultra texture pack installed.

Let’s move on to 4K.

Again, the Fury X is more competitive at 4K than it is at 1080p, which is important, because only the 4K result really matters. None of these three, not even the mighty Titan X, manage the ideal 60 FPS, but they all come quite close.

The story doesn’t end here, unfortunately. While the Fury X’s average is similar, it hits a minimum framerate of 29, while the GTX 980 Ti goes no lower than 40 FPS. That’s a significant difference, and it was noticeable in game, as the AMD card was prone to occasional fluctuations in framerate.

I also noticed a strange artifact with the Fury X. Horizontal bands of static would occasionally appear, blinking for a fraction of a second. The problem appeared sporadically, and seemed to occur more often with higher framerates. AMD’s representatives informed me the issue is unusual, and I’ve been working with them towards a solution. This review will be updated if the problem is resolved.

Grand Theft Auto V

Our latest addition to the benchmark suite, this game requires little introduction. You’re more likely to have played this than any other game we’ve benchmarked except, perhaps, League of Legends. The PC version is a surprisingly well-done port that looks beautiful with the details ramped up, but it’s also extremely demanding.

I didn’t have the chance to run the Titan X through this game, so the GTX 980 Ti and Fury X go heads-up in this match, starting with 1080p.

This game has no presets. For our “ultra” calibration we turned FXAA on, but left MSAA off. We used the highest level of detail available for every other setting.

If the performance story of these cards wasn’t clear already, Grand Theft Auto V fills out the missing details. Yet again, AMD’s latest loses at 1080p resolution – but yet again, the framerate is far higher than required to provide an enjoyable experience. Now it’s time for what really matters: 4K.

Once again, the Fury X closes the gap as the resolution sky-rockets, though it’s not quite enough to catch the GTX 980 Ti. The six frame-per-second gap represents a difference of about 12 percent, which is more substantial than the 3DMark results would lead you to think.

There is an upside, though. At 4K the Fury X had a minimum framerate of 23 FPS, while the GTX 980 Ti went as low as 16. Unlike Shadows of Mordor, where the Nvidia card proved more reliable, Grand Theft Auto V seems to prefer the Fury X. The result was occasionally noticeable in-game, as the GTX 980 Ti suffered from a more noticeable frame-rate dip in intense scenes.

Heat is a close call

Heat and power draw have been a major issue for AMD in the past, and arguably the primary cause of its woes. Less efficient chips mean the cards holding them need more power and bigger, noisier coolers. While the red team has managed to offer value by slashing prices and introducing ever more monstrous cards, many gamers have steered towards Nvidia for quieter, cooler high-performance desktops.

Obviously, High Bandwidth Memory is a play to fix that flaw, but whether it’d be successful wasn’t obvious at a glance. Yes, the memory uses less power, but the GPU itself hasn’t drastically changed. So the question was this – would the benefits of HBM offset AMD’s less efficient GPU?

The answer, it seems, is yes. Take a look for yourself.

AMD’s new Fury X and Nvidia’s GTX 980 Ti performed similarly across the board. The biggest difference is a mere 15 watts while playing Shadows of Mordor. Both cards also exhibited similar, and very quiet, fan operation. In fact, total system noise from our test rig remained the same no matter the load each card faced. What this really means is that video card noise was not a significant contributor to overall system noise, which is quite remarkable.

Internal operating temperatures were different, however. The GTX 980 Ti ran at 54 degrees Celsius at idle, and up to 76C at load. AMD’s liquid-cooled card, though, had an idle temperature of only 35C and a maximum load temperature of 60C. Those figures give a clear edge to the Fury’s cooling configuration, and suggest more over-clocking headroom.

Conclusion

So, here’s the moment of truth. Should you buy the Radeon R9 Fury X instead of the GTX 980 Ti?

Probably not.

The Fury X is an impressive card. Were it alone in its bracket, or priced slightly differently, it’d be easy to recommend. At $649, though, it doesn’t quite unseat the GTX 980 Ti. Nvidia’s card is a bit quicker across the board, has more RAM (which doesn’t seem to matter now, but perhaps it will, someday), and is easier to install.

If you think my reasoning is flimsy, you’d be right. In truth the difference between the Fury X and its green-team opposition is small. You could buy either card and be happy with the results, even if you own a 4K monitor. Yet you can only buy one, and the GTX 980 Ti’s slight advantages are enough to give it the nod. The Fury X is a technological tour-de-force, but it doesn’t deliver the clear-cut victory AMD needed.

Matthew S. Smith