At a Stanford AI meetup today, NVIDIA released a new high-end workstation GPU, the Titan X. The NVIDIA Titan X is going to be a popular card in the AI arena, especially for those doing machine learning on the desktop. We were expecting both a larger part than the GeForce GTX 1080 and a Pascal update to the Titan line, and we now have that. MSRP is ~$1,200, an increase over the previous-generation Titan X; even eBay pricing for the Maxwell Titan X is still around $900 at the time of the Pascal launch.
NVIDIA Titan X Pascal Edition
Since Pascal is a relatively well-known architecture at this point, we can see a few similarities to previous designs. First, the Titan X Pascal edition has a similar industrial design to the NVIDIA GeForce GTX 1080, 1070, and 1060.
NVIDIA’s site states that the card utilizes its higher-end vapor chamber cooler, and from the pictures, the coloring is different from the consumer GTX 1080.
In terms of specs, as a higher-end card we see more CUDA cores: 3584 in this edition, up from 2560 in the GTX 1080. The GDDR5X RAM quantity is increased from 8GB to 12GB as well, although there is no mention of HBM onboard.
NVIDIA Titan X power consumption is rated at 250W for the Pascal edition as well. For those looking at doing VR work or machine learning on the desktop, this is now the card to get before moving into the Tesla/GRID range. We have heard that NVIDIA has had difficulty moving machine learning teams over to the higher-margin Tesla/GRID cards, since it is very common practice to buy desktop cards and use them in 4U server chassis designed for GPUs.
The question everyone is probably wondering: Why is this called the Titan X?
Now there is an obvious problem brewing. NVIDIA is calling this card the Titan X, yet the Maxwell Titan X was launched in March 2015 and also featured 12GB of memory. For those buying the cards or looking to lease them from cloud service providers, this naming overlap creates real ambiguity. Why NVIDIA picked Titan X rather than Titan X2 or 2X, Titan AA, Titan XX, or something similar is something we do not have an answer for.
Want to make a guess at why NVIDIA decided to stick with the exact same name for the GP102 card? Feel free to comment in the STH forums thread.