NVIDIA releases the Titan V graphics card focused on AI processing

Nvidia launched a new desktop GPU today that's designed to bring massive amounts of computing power to consumers working on machine learning applications.

That said, a liberal definition of the word "consumer" is in order here: the Titan V sells for $2,999 and is aimed squarely at AI and scientific-simulation workloads.

Nvidia's Titan V is the first "consumer-grade" card (i.e. the Titan brand, as opposed to Tesla) to use the company's new Volta architecture (the successor to Pascal, which powers the current GeForce 10-series cards). It packs no fewer than 21.1 billion transistors and delivers a jaw-dropping 110 teraflops of deep-learning performance. The Titan V and the Tesla V100 have quite a bit in common, including the GV100 GPU at their core, though the Titan V ships with less memory and a narrower memory bus. It carries 640 Tensor Cores and 5,120 CUDA cores, up from 3,840 on the previous-generation Titan Xp.
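Those Tensor Cores are exposed to programmers through CUDA's warp-level matrix (WMMA) API, which arrived in CUDA 9 alongside Volta's compute capability 7.0. As a rough illustration rather than anything from Nvidia's announcement (the kernel name wmma_tile is ours), the sketch below has one warp multiply two 16x16 half-precision tiles and accumulate into 32-bit floats, the fused operation Tensor Cores execute in hardware:

    // Build with: nvcc -arch=sm_70 wmma_demo.cu
    #include <mma.h>
    #include <cuda_fp16.h>
    using namespace nvcuda;

    // One warp computes a 16x16 matrix multiply-accumulate on the Tensor
    // Cores: a and b hold FP16 input tiles, c receives the FP32 result.
    __global__ void wmma_tile(const half *a, const half *b, float *c) {
        wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> a_frag;
        wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::col_major> b_frag;
        wmma::fragment<wmma::accumulator, 16, 16, 16, float> c_frag;

        wmma::fill_fragment(c_frag, 0.0f);               // start from zero
        wmma::load_matrix_sync(a_frag, a, 16);           // leading dimension 16
        wmma::load_matrix_sync(b_frag, b, 16);
        wmma::mma_sync(c_frag, a_frag, b_frag, c_frag);  // Tensor Core op
        wmma::store_matrix_sync(c, c_frag, 16, wmma::mem_row_major);
    }

Launching a single block of 32 threads is enough to drive one tile; real workloads tile larger matrices across many warps, or simply call cuBLAS and cuDNN, which use the same hardware under the hood.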

So while deep-pocketed PC gamers will be able to get higher frame-rates in video games, and people working in video editing or graphics-heavy computer-aided design applications are likely to see smoother performance, the Titan V really stands as a gateway for developers to move into AI and machine learning without the need for prohibitively expensive computer and server setups.

The Titan V was developed with machine learning and artificial intelligence workloads in mind.

Strangely, the Titan V comes with neither NVLink nor SLI: the card lacks software support for multi-GPU configurations, and its physical NVLink connector is blocked by the cooler.

Nvidia founder and CEO Jensen Huang said in a statement: "Our vision for Volta was to push the outer limits of high performance computing and AI."

"With TITAN V, we are putting Volta into the hands of researchers and scientists all over the world".

"With independent parallel integer and floating-point data paths, Volta is also much more efficient on workloads with a mix of computation and addressing calculations", Nvidia explained in a news release.

