
Titan X announced, 12GB framebuffer, 8 billion transistors.

So this Titan will generate a lot more heat than my GTX 980, right?

I can't imagine SLI scenarios; the air coming out the back of my case is so hot already with just a 980 & i7 4790K at stock (haven't overclocked since nothing has needed it)
 
So this Titan will generate a lot more heat than my GTX 980, right?

I can't imagine SLI scenarios; the air coming out the back of my case is so hot already with just a 980 & i7 4790K at stock (haven't overclocked since nothing has needed it)

Yes. The connector setup lets it draw up to 300W, but it will probably be a 250W GPU like the 780 Ti. All that energy has to go somewhere.
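The 300W figure falls straight out of the PCIe power-delivery limits, assuming the rumored 6-pin + 8-pin connector layout (a sketch; the final board configuration hadn't been confirmed at the time):

```python
# PCIe power budget sanity check for the "up to 300W" figure.
# The per-source limits below are the PCIe spec values; the
# 6-pin + 8-pin layout is the rumored Titan X setup, not a confirmed spec.
SLOT_POWER_W = 75    # power available from the x16 slot itself
SIX_PIN_W = 75       # one 6-pin PCIe auxiliary connector
EIGHT_PIN_W = 150    # one 8-pin PCIe auxiliary connector

max_board_power_w = SLOT_POWER_W + SIX_PIN_W + EIGHT_PIN_W
print(max_board_power_w)  # 300
```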
 
A Titan X is probably right around 7x-8x more powerful than the 7850-class GPU in the PS4. It's pretty simple math. It's not like a Titan X can't pull off the tricks the PS4 does. smh

not only are you clueless about a Titan's performance relative to the PS4 GPU, you're also incapable of doing the simplest of math. well done, champ!
 

BBboy20

Member
This gif is going places
I like this Seal come back.

That is one of the best GIFs I've ever seen. Holy shit.
The whole dreary sky would seem to fit into a Seal video.

-Q2 release (April to June)
Might be released before Witcher 3?

lololol if the bolded is true
now people can't even justify the stupid price by saying it's a Quadro alternative

Also, a 384-bit memory bus for 1300 dollars? They really have no shame

anyhow, it's called Titan, which means it's going to be horribly overpriced.
Don't support this; wait for the 1080 or 390X and buy either of those.
I seem to notice most here aren't reacting to that.

It's interesting to note that, 1.5 years after the PS4 launched, what will likely be a $1000 video card only has 50% more RAM (8GB there vs. 12GB here). This is why I find it difficult to start PC gaming despite the usually lower game prices.
http://en.wikipedia.org/wiki/Die_shrink

So...probably won't be at $500 and probably won't be released before May.
 
Your tag speaks volumes. Try 2.5-3.5x.


Titan X is rumored to have 7 TFLOPS, isn't it? If so it's more like 4x+. The PS4 GPU has 1.84 TFLOPS, and Nvidia GPUs tend to deliver more per flop, relatively speaking (see the GTX 980's ~4.6 TFLOPS vs. the R9 290X's ~5.6).
 
It needs to be at least twice as fast as the regular Titan if they're expecting me to pay that much. Just adding a useless amount of VRAM isn't gonna be enough.
 
Titan X is rumored to have 7 TFLOPS, isn't it? If so it's more like 4x+. The PS4 GPU has 1.84 TFLOPS, and Nvidia GPUs tend to deliver more per flop, relatively speaking (see the GTX 980's ~4.6 TFLOPS vs. the R9 290X's ~5.6).

that's still not 4x+, and theoretical flops don't equal real-world performance.
 
Sounds pretty good to me especially considering it's still 28nm. But I won't go as far as to say it's worth 1350€/$.

I suppose the audience eyeing this card isn't interested in raw perf/price.

lol, from a value standpoint it's atrocious of course. but that is a very nice perf increase, especially given how much process advancement has slowed. i'm hoping they've elected to keep DP at 1/32, as they'll have a much harder time marketing the card at 1350 when they can't say "but it's a workstation card too!!!" it would also mean all the transistors go toward gaming perf
 

Chittagong

Gold Member
Titan X is rumored to have 7 TFLOPS, isn't it? If so it's more like 4x+. The PS4 GPU has 1.84 TFLOPS, and Nvidia GPUs tend to deliver more per flop, relatively speaking (see the GTX 980's ~4.6 TFLOPS vs. the R9 290X's ~5.6).

Hmmm... Titan X could be the first card to provide a generational leap over PS4 graphics, since I recall the difference in power needs to be 6-8x for the leap to feel like a generation, provided I continue to run it at 1080p.

Of course, I don't know if there are any recent games that would attempt sufficiently advanced things with their graphics, or whether the processing power would just be wasted on stupidly high framerates.
 
that's still not 4x+, and theoretical flops don't equal real-world performance.

Huh? Of course that's 4x+... Nvidia tends to perform ~30% better if you just go by flops, i.e. 1.84 TFLOPS on an AMD card equals about 1.4 on an Nvidia card. That's ~5 times, maybe more. It's true that flops don't necessarily equal real-world performance, but then again they're a pretty good indicator. The GTX 980, for example, has significantly more than twice the performance of the R9 270. Now add ~50% to the GTX 980's performance, take into consideration that the PS4 GPU is almost 25% slower than the R9 270, and I don't see why 4 times would be outrageous.
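The poster's ratio argument can be worked through explicitly. All numbers here are the thread's own rumored/approximate figures, not confirmed specs:

```python
# Working through the flops-ratio argument with the thread's numbers.
ps4_tflops = 1.84        # PS4 GPU, theoretical peak
titan_x_tflops = 7.0     # rumored Titan X figure
nvidia_per_flop = 1.3    # thread's claim: Nvidia ~30% more perf per flop

# Under that assumption, 1.84 AMD TFLOPS performs like ~1.4 Nvidia TFLOPS
ps4_nvidia_equiv = ps4_tflops / nvidia_per_flop
ratio = titan_x_tflops / ps4_nvidia_equiv
print(round(ratio, 1))  # 4.9 -- the "~5 times, maybe more" claim
```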
 
Hmmm... Titan X could be the first card to provide a generational leap over PS4 graphics, since I recall the difference in power needs to be 6-8x for the leap feel like a generation, provided I continue to run it in 1080P.

Of course I don't know if there are any recent games that would attempt sufficiently advanced things with their graphics, or whether the processing power would just be wasted in stupidly high framerates.

This doesn't make sense.

If you had a game built from the ground-up to run on a Titan X only, a la The Order on PS4, said hypothetical game would be a generational leap ahead of The Order in terms of graphics, assuming the dev team is talented etc. But seeing as this kind of scenario will never happen, and the fact that PC specs vary so wildly, you can't say this offers a generational leap in graphics over consoles, as there are obviously numerous caveats to that statement.

But obviously, in terms of processing power, it is vastly more powerful than the PS4 GPU (a 7850 with 5.5/6GB GDDR5?)
 
The PS4 GPU can't use 6GB for graphics. Remember: 6GB is the total RAM available, to be shared between the GPU and CPU.

No, it has 8GB total memory, and IIRC 2/2.5GB is used for the OS. This usage is likely to shrink as well.

EDIT: Sorry, you said CPU. TBH I'm not sure how the CPU affects that memory pool.
 
Huh? Of course that's 4x+... Nvidia tends to perform ~30% better if you just go by flops, i.e. 1.84 TFLOPS on an AMD card equals about 1.4 on an Nvidia card. That's ~5 times, maybe more. It's true that flops don't necessarily equal real-world performance, but then again they're a pretty good indicator. The GTX 980, for example, has significantly more than twice the performance of the R9 270. Now add ~50% to the GTX 980's performance, take into consideration that the PS4 GPU is almost 25% slower than the R9 270, and I don't see why 4 times would be outrageous.

my god, people here just can't do basic math

the closest PC GPU to what's in the PS4 is the R7 265

http://www.computerbase.de/2015-01/nvidia-geforce-gtx-960-im-test/3/
http://www.techpowerup.com/reviews/ASUS/GTX_980_Matrix/28.html

a 980 is roughly 2.4x an R7 265. assuming an optimistic 50% perf improvement over a 980, a Titan X will top out at 3.6x faster than a 265. that's not 4+, it's even further from that guy's claim of 7 to 8x, and just worlds away from the roughly 16x perf needed to run The Order at 4K/120fps as claimed by the same guy
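The competing benchmark-based estimate is just as easy to check. The 2.4x and 50% figures are the poster's own, taken from the linked reviews and the rumored uplift:

```python
# The benchmark-based counter-estimate, using the poster's figures.
gtx980_vs_r7_265 = 2.4   # GTX 980 relative to R7 265, per the linked reviews
titan_x_uplift = 1.5     # optimistic assumed uplift over the GTX 980

titan_x_vs_r7_265 = gtx980_vs_r7_265 * titan_x_uplift
print(round(titan_x_vs_r7_265, 1))  # 3.6 -- short of "4+", nowhere near 7-8x
```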
 
my god people here just cant do basic math

the closest pc gpu to whats in the ps4 is the r7 265

http://www.computerbase.de/2015-01/nvidia-geforce-gtx-960-im-test/3/
http://www.techpowerup.com/reviews/ASUS/GTX_980_Matrix/28.html

a 980 is roughly 2.4x an r7265. assuming an optimistic 50% perf improvement over a 980, a titan x will top out at 3.7x faster than a 265. thats not 4+, its even further from that guys claim of 7 to 8x, and just worlds away from the roughly 16x perf needed to run the order at 4k/120 fps as claimed by the same guy

I await this day.
 

Theonik

Member
No, it has 8GB total memory, and IIRC 2/2.5GB is used for the OS. This usage is likely to shrink as well.

EDIT: Sorry, you said CPU. TBH I'm not sure how the CPU affects that memory pool.
Depends. With a shared memory pool you might also use less memory overall, as you're not duplicating data for the GPU to use as you might on a PC.
 

AmyS

Member
[Eurogamer]

How powerful is Nvidia's new 12GB Titan X?

Right now, not much is known with absolute certainty about how powerful the Titan X is - but it was the GPU of choice for VR demos at last week's GDC 2015. Crytek used it for the Back to Dinosaur Island demo, while Epic showcased WETA Digital's Thief in the Shadows and its own Showdown demo with the new technology. We can reasonably assume that it's more powerful than the current top dog, the GeForce GTX 980, but to what degree?

Actual figures on the technical make-up of the card are limited right now - full disclosure is planned for Nvidia's own GTC event a couple of weeks from now. Nvidia CEO Jen-Hsun Huang revealed the card at Epic's Unreal Engine 4 keynote at GDC last Wednesday, giving away just two facts about the product - firstly that it has 12GB of memory, and secondly an eight billion transistor count. Subsequently we learned that the new chip at the heart of the card is called GM200 - effectively confirming 28nm Maxwell architecture similar to the existing GTX 980, making this the true successor to the original Titan's Kepler-based GK110 processor.

However, in terms of actual GPU performance, all we have to go on is the eight billion transistor figure. With the 28nm process and Maxwell architecture all but confirmed, we can compare the transistor count with the GTX 980 to give us some ballpark idea of how much faster the new card is - after all, the vast majority of the extra space on the larger chip will be used to house extra CUDA processing cores. And that's where things get exciting, as potentially we're looking at something in the region of an extra 50 per cent of processing power.

The reveal of an overkill 12GB framebuffer also offers up more clues as to the technical make-up of the card. Short of any memory-partitioning shenanigans, it's almost certain the memory bandwidth will increase significantly compared to GTX 980 with the utilisation of a 384-bit interface between the GM200 chip and the surrounding GDDR5 modules. Of more use to CUDA developers, the vast framebuffer probably won't be maxed out by any gaming applications for years to come. That said, in testing we recently carried out for our forthcoming GTX 970 RAM investigation, Assassin's Creed Unity with 8x MSAA at 2560x1440 could tap out the full 6GB allocation of the existing Titan (albeit with single-digit frame-rates). Regardless, in an era where nobody seems to know for sure how much memory is required to future-proof a current GPU purchase, a full 12GB of RAM is the equivalent of taking off and nuking the problem from orbit.

The sheer size of GM200 guarantees only a modest yield of perfect chips. That suggests that Nvidia will almost certainly be sitting on a large cache of GM200 chips that may not make the grade as Titan X processors, but will find a use elsewhere.

The obvious use for these less-than-perfect processors is to disable CUDA cores on the defective areas, pair the chip with less GDDR5 memory and release it as a cut-down graphics card - this is exactly what happened with 2013's GK110 processor, where the higher-grade chips powered GTX Titan and Nvidia's compute-based Tesla products, with the rest of the yield used for the slightly less capable GTX 780. In the case of the Titan X, the question is really how long Nvidia wants to reserve the GM200 technology for the high-end premium market. Our guess? That'll all depend on the power level of AMD's forthcoming replacements for the R9 290 and 290X.
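Eurogamer's "extra 50 per cent" ballpark follows directly from the transistor counts, assuming (as the article does) the same 28nm Maxwell architecture and that the extra die area goes to CUDA cores. The 5.2 billion GM204 figure is the published GTX 980 spec, not from the article itself:

```python
# Eurogamer's ballpark: scale GTX 980 performance by transistor count.
gm200_transistors = 8.0e9   # Titan X count given at the GDC reveal
gm204_transistors = 5.2e9   # GTX 980 (GM204), published spec

extra_pct = (gm200_transistors / gm204_transistors - 1) * 100
print(round(extra_pct))  # 54 -- "in the region of an extra 50 per cent"
```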
 
That's a lot of words to say we don't know anything except the 12GB of RAM, so here's a bunch of generic truths about the semiconductor industry.
 

Durante

Member
You can't? Just play an older game or lower settings.
I was recently running Trails in the Sky at 10240x5760, just to see if I could. Only got frame drops with smoke effects on large parts of the screen.

...well, that was OT. But 28x 1080p!
 

BatSu

Member
[image: GeForce GTX Titan X specification table]


http://www.techpowerup.com/gpudb/2632/geforce-gtx-titan-x.html
 
I was recently running Trails in the Sky at 10240x5760, just to see if I could. Only got frame drops with smoke effects on large parts of the screen.

...well, that was OT. But 28x 1080p!

What GPU are you running, if I may ask?


Anyway, at $1350 these are gonna be a tough sell for me. I like to have the latest and greatest, but my 980s are doing me well at this point. I'd rather wait and see if we get a cut-down Titan X under $1000 with 6GB, or wait even a bit more and see if they come out with an 8GB card on a 512-bit memory bus.

WTF am I gonna do with 12GB.... Nvidia pls :(
 

Momentary

Banned
A $1350 graphics card is gonna be a tough sell for anyone. If they price it that high, it's just because they can. The thing probably costs NVIDIA less than 600 dollars to make. Maybe even less than that.

The last numbers I remember seeing had the GTX 580 priced at $499 with the cost to manufacture one being around $205. That's more than a 100% markup.

So it's probably around 500-600 to manufacture. They could sell it at 900 and still make bank. Maybe even more.

But who am I kidding. This shit is probably going to sell out before you can even hit the buy button on Newegg.
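The GTX 580 markup arithmetic checks out, using the poster's $499 price and $205 manufacturing-cost figures (thread hearsay rather than confirmed numbers, and ignoring R&D, distribution, and retailer cuts):

```python
# Checking the GTX 580 markup claim with the poster's numbers.
msrp_usd = 499        # GTX 580 launch price
build_cost_usd = 205  # claimed manufacturing cost (unverified)

markup_pct = (msrp_usd - build_cost_usd) / build_cost_usd * 100
print(round(markup_pct))  # 143 -- indeed "more than a 100% markup"
```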
 
D

Deleted member 80556

Unconfirmed Member
But who am I kidding. This shit is probably going to sell out before you can even hit the buy button on Newegg.

This always makes me happy. Not because people might be dumb and they might change it next year, but because this actually helps fund the development of new GPUs, which helps with the development of supercomputers, and thus with science and some other cool stuff. Thanks, people with lots of money!
 
A $1350 graphics card is gonna be a tough sell for anyone. If they price it that high, it's just because they can. The thing probably costs NVIDIA less than 600 dollars to make. Maybe even less than that.

The last numbers I remember seeing had the GTX 580 priced at $499 with the cost to manufacture one being around $205. That's more than a 100% markup.

So it's probably around 500-600 to manufacture. They could sell it at 900 and still make bank. Maybe even more.

But who am I kidding. This shit is probably going to sell out before you can even hit the buy button on Newegg.

Yeah, I'm actually starting to get cold feet on this card now. At first I was totally on board, but that price is astronomically high.

I have no issue spending some cash on a good GPU. When I bought my 780 Ti it was $700, but dude, $1350 is freaking crazy.
 

badb0y

Member
A $1350 graphics card is gonna be a tough sell for anyone. If they price it that high, it's just because they can. The thing probably costs NVIDIA less than 600 dollars to make. Maybe even less than that.

The last numbers I remember seeing had the GTX 580 priced at $499 with the cost to manufacture one being around $205. That's more than a 100% markup.

So it's probably around 500-600 to manufacture. They could sell it at 900 and still make bank. Maybe even more.

But who am I kidding. This shit is probably going to sell out before you can even hit the buy button on Newegg.

$600 to produce? Back in 2009 this card would have released for no more than $500-550. This is Nvidia milking the idiot consumer base. I mean, the writing was on the wall when they launched the first Titan and it sold really well at the fair price of $999.99, which also turned out to be a cut-down part lol.
 

Momentary

Banned
$600 to produce? Back in 2009 this card would have released for no more than $500-550. This is Nvidia milking the idiot consumer base. I mean, the writing was on the wall when they launched the first Titan and it sold really well at the fair price of $999.99, which also turned out to be a cut-down part lol.

How much was that badass 8800 when it first released? That was a damn mythical beast that trumped last-gen consoles until the last day of their existence. And yes, I feel like 500-600 is solid. I'd really like to hope that these cards don't cost $300 to build. I mean, it's fucked up already, but that would be super fucked up.
 

Spazznid

Member
Hah, that is possible. But I was mainly talking about awaiting the day when something like Crysis 3 with 4x MSAA is doable at 4K/120.

That's a bit greedy.

I can play most of the games I like at 120, and with a Titan X I'd really hit 4K and beyond. I'd still never use MSAA.
 