
Nvidia’s Titan X is revealed with 12GB frame buffer for $999

Nvidia has fully revealed the Titan X at the GPU Technology Conference, and it’s a doozy of a video card.

The Titan X boasts 3,072 CUDA cores and is built on the Maxwell architecture using a 28nm process. In that sense it's not entirely new, but rather an improvement on the existing cards in the GTX 900 series.

It also offers 12GB of GDDR5 video memory, the most of any single-GPU video card. This is connected over a 384-bit interface. That distinguishes the card from its 900 series brethren, all of which have a 256-bit (or smaller) memory interface. That adds up to 336 gigabytes per second of memory bandwidth, over 100GB/s more than the GTX 980, and almost twice that of the PlayStation 4.
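Those bandwidth figures follow directly from bus width multiplied by per-pin data rate. A quick sketch in Python, assuming the commonly quoted effective GDDR5 data rates of 7 Gbps per pin for the Titan X and GTX 980 and 5.5 Gbps for the PlayStation 4 (the rates are our assumption, inferred from the figures in this article):

```python
def gddr5_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: number of pins times per-pin
    data rate, divided by 8 to convert bits to bytes."""
    return bus_width_bits * data_rate_gbps / 8

titan_x = gddr5_bandwidth_gbs(384, 7.0)  # 336.0 GB/s
gtx_980 = gddr5_bandwidth_gbs(256, 7.0)  # 224.0 GB/s
ps4 = gddr5_bandwidth_gbs(256, 5.5)      # 176.0 GB/s

print(titan_x, titan_x - gtx_980, titan_x / ps4)
```

The wider 384-bit bus is doing the work here: at the same 7 Gbps per pin, it alone accounts for the 112GB/s advantage over the GTX 980.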

Though obviously very powerful, the card hits a thermal design power of just 250 watts thanks to its Maxwell architecture (rather than the Kepler found in previous Titan cards). It uses less power than the old GTX 780 Ti and is fed by an 8-pin plus 6-pin PCIe power connector arrangement.

Additionally, it will support up to 4-way SLI and is compatible with Nvidia GameStream and the Nvidia Shield. Standard video output includes one DVI, one HDMI, and three DisplayPort connectors, though third-party card vendors may change that. The HDMI port conforms to the 2.0 standard. None of that information is particularly surprising, though. We'd expected it to retain all of the usual Nvidia features, and so it does.

Interestingly, unlike the Titan Z, the new card will not offer full-speed double-precision support. That feature is often required by enterprise users who need highly accurate results. However, Nvidia is still targeting certain research applications. It called out machine learning as a particular area of interest, stating the card can cut research tasks that used to take a month and a half on a CPU down to just a few days.
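To illustrate why double precision matters to those users, here's a minimal Python sketch (our own example, not Nvidia's) that uses the standard `struct` module to emulate 32-bit rounding. Repeatedly adding 0.1 in single precision drifts visibly from the true total, while 64-bit arithmetic stays extremely close:

```python
import struct

def f32(x):
    """Round a Python float (64-bit double) to IEEE 754 single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

N = 100_000  # sum 0.1 one hundred thousand times; exact answer is 10000.0
single = 0.0
double = 0.0
for _ in range(N):
    single = f32(single + f32(0.1))  # every operation rounded to 32 bits
    double = double + 0.1            # native 64-bit double precision

# The single-precision total drifts noticeably further from 10000.0
# than the double-precision total does.
print(single, double)
```

Single precision carries roughly 7 decimal digits versus about 16 for double, so rounding error accumulates orders of magnitude faster, which is exactly the trade-off a gaming-focused card like the Titan X makes.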

Nvidia’s CEO, Jen-Hsun Huang, also spent a significant amount of time talking about neural networks, specifically those used to analyze images. He highlighted how advanced video cards like the Titan X can accelerate these networks, making it possible to categorize images without human intervention, or even convert images to text descriptors.

We should have a first look at the card posted tomorrow.

Brad Jones
Brad is an English-born writer currently splitting his time between Edinburgh and Pennsylvania. You can find him on Twitter…