suntoryTime
Neo Member
3072 CUDA cores?
I really hoped for way more!
The process tech hasn't changed, 3072 is about all you could expect at 28nm.
So this titan will generate a lot more heat than my GTX 980 right?
I can't imagine SLI scenarios; the air coming out the back of my case is so hot already with just a 980 & i7 4790K at stock (haven't overclocked since nothing has needed it)
But can it run FFXIV at 4k locked 60 FPS...during prime time hunts?
you are clueless
LOL
NSFW
Now I have seen everything.
A Titan X is probably right around 7x-8x more powerful than the 7850-class GPU in the PS4. It's pretty simple math. It's not like a Titan X can't pull off the tricks the PS4 does. smh
Your tag speaks volumes. Try 2.5-3.5x.
Um, the last comment. NSFW warning
How is that still on there lol
What is it?
This gif is going places
I like this Seal comeback.
That is one of the best GIFs I've ever seen. Holy shit.
The whole dreary sky would seem to fit into a Seal video.
Q2 release (April to June)
Might be released before Witcher 3?
I seem to notice most here aren't reacting to that.
lololol if the bolded is true
Now people can't even justify the stupid price by saying it's a Quadro alternative.
Also, getting a 384-bit memory bus for 1300 dollars? They really have no shame.
Anyhow, it's called Titan, which means it's going to be horribly overpriced.
Don't support this; wait for the 1080 or 390X and buy either of those.
http://en.wikipedia.org/wiki/Die_shrink
It's interesting to note that 1.5 years after the PS4 launched, what will likely be a $1000 video card is only 50% more powerful (8GB RAM vs 12GB here). This is why I find it difficult to start PC gaming despite the usually lower game prices.
How much faster should we expect this Titan to be compared to the Titan Black?
40%? Less than that?
Titan X is rumored to have 7Tflops, isn't it? If so it's more like 4 times+. The PS4 GPU has 1.84Tflops, and Nvidia GPUs tend to have less raw processing power for equivalent performance, relatively speaking (see the GTX 980's ~4.6Tflops vs. the R9 290X's ~5.6).
between 40 and 50% faster than a stock 980
Sounds pretty good to me especially considering it's still 28nm. But I won't go as far as to say it's worth 1350€/$.
I suppose the audience eyeing this card are not interested in raw perf/price.
That's still not 4x+, and theoretical flops don't equal real-world performance.
Hmmm... Titan X could be the first card to provide a generational leap over PS4 graphics. I recall the difference in power needs to be 6-8x for the leap to feel like a generation, provided I continue to run it at 1080p.
Of course, I don't know if there are any recent games that would attempt sufficiently advanced things with their graphics, or whether the processing power would just be wasted on stupidly high framerates.
But obviously, in terms of processing power, it is vastly more powerful than the PS4 GPU (a 7850 with 5.5/6GB GDDR5?)
The PS4 GPU can't use 6GB for graphics. Remember: 6GB is the total RAM available, to be shared between GPU and CPU.
Huh? Of course that's 4+... Nvidia tends to perform ~30% better if you just go by flops, i.e. 1.84 Tflops on an AMD card equals like 1.4 on an Nvidia card. That's like ~5 times, maybe more. It's true that flops don't necessarily equal real-world performance, but then again they are a pretty good indicator. The GTX 980, for example, has significantly more than twice the performance of the R9 270. Now add ~50% to the GTX 980's performance, take into consideration that the PS4 GPU is almost 25% slower than the R9 270, and I don't see why 4 times would be outrageous.
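The flops juggling above can be sketched out quickly. This is a minimal check using the poster's own figures (1.84 Tflops for the PS4, a rumored 7 Tflops for Titan X, and an assumed ~30% Nvidia per-flop edge) - none of these are confirmed specs:

```python
# All figures are the poster's assumptions, not confirmed specs.
ps4_tflops = 1.84        # PS4 GPU, theoretical peak
titan_x_tflops = 7.0     # rumored Titan X figure
nvidia_edge = 1.3        # assumed: 1 Nvidia Tflop ~ 1.3 AMD Tflops in practice

ps4_equiv = ps4_tflops / nvidia_edge   # ~1.42 "Nvidia-equivalent" Tflops
ratio = titan_x_tflops / ps4_equiv     # ~4.9x
print(round(ps4_equiv, 2), round(ratio, 1))  # 1.42 4.9
```

So under these assumptions the ~5x claim holds; any disagreement is really about the input figures, not the arithmetic.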
My god, people here just can't do basic math.
The closest PC GPU to what's in the PS4 is the R7 265.
http://www.computerbase.de/2015-01/nvidia-geforce-gtx-960-im-test/3/
http://www.techpowerup.com/reviews/ASUS/GTX_980_Matrix/28.html
A 980 is roughly 2.4x an R7 265. Assuming an optimistic 50% perf improvement over a 980, a Titan X will top out at ~3.6x faster than a 265. That's not 4+, it's even further from that guy's claim of 7 to 8x, and just worlds away from the roughly 16x perf needed to run The Order at 4K/120fps as claimed by the same guy.
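A quick sketch of the same arithmetic, using the 2.4x and 50% figures from the post above (both are rough, benchmark-derived assumptions, not measurements of an actual Titan X):

```python
gtx980_vs_r7_265 = 2.4   # GTX 980 vs R7 265, per the linked reviews
titan_x_uplift = 1.5     # optimistic 50% improvement over a GTX 980
titan_x_vs_ps4_class = gtx980_vs_r7_265 * titan_x_uplift
print(round(titan_x_vs_ps4_class, 1))  # 3.6 -- nowhere near a 7-8x claim
```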
I await this day.
It's on one of the previous 2 pages, dude.
I was saying I want to be able to play games at 4k @ 120fps... not doubting your paragraph and claims.
No, it has 8GB total memory, and iirc 2/2.5GB used for the OS. This usage is likely to shrink as well.
Depends. With a shared memory pool you might also use less memory overall, as you are not duplicating data for the GPU to use as you might do on a PC.
EDIT: Sorry, you said CPU. Tbh I'm not sure how the CPU affects that memory pool.
You can't? Just play an older game or lower settings.
Right now, not much is known with absolute certainty about how powerful the Titan X is - but it was the GPU of choice for VR demos at last week's GDC 2015. Crytek used it for the Back to Dinosaur Island demo, while Epic showcased WETA Digital's Thief in the Shadows and its own Showdown demo with the new technology. We can reasonably assume that it's more powerful than the current top dog, the GeForce GTX 980, but to what degree?
Actual figures on the technical make-up of the card are limited right now - full disclosure is planned for Nvidia's own GTC event a couple of weeks from now. Nvidia CEO Jen-Hsun Huang revealed the card at Epic's Unreal Engine 4 keynote at GDC last Wednesday, giving away just two facts about the product - firstly that it has 12GB of memory, and secondly an eight billion transistor count. Subsequently we learned that the new chip at the heart of the card is called GM200 - effectively confirming 28nm Maxwell architecture similar to the existing GTX 980, making this the true successor to the original Titan's Kepler-based GK110 processor.
However, in terms of actual GPU performance, all we have to go on is the eight billion transistor figure. With the 28nm process and Maxwell architecture all but confirmed, we can compare the transistor count with the GTX 980 to give us some ballpark idea of how much faster the new card is - after all, the vast majority of the extra space on the larger chip will be used to house extra CUDA processing cores. And that's where things get exciting, as potentially we're looking at something in the region of an extra 50 per cent of processing power.
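That 50 per cent ballpark follows directly from the transistor counts, assuming the GTX 980's GM204 comes in at roughly 5.2 billion transistors:

```python
gm204_transistors = 5.2e9   # GTX 980 (GM204), approximate published figure
gm200_transistors = 8.0e9   # Titan X (GM200), per Nvidia's reveal
extra = gm200_transistors / gm204_transistors - 1
print(f"~{extra:.0%} more transistors")  # ~54% more, hence the ~50% estimate
```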
The reveal of an overkill 12GB framebuffer also offers up more clues as to the technical make-up of the card. Short of any memory-partitioning shenanigans, it's almost certain the memory bandwidth will increase significantly compared to GTX 980 with the utilisation of a 384-bit interface between the GM200 chip and the surrounding GDDR5 modules. Of more use to CUDA developers, the vast framebuffer probably won't be maxed out by any gaming applications for years to come. That said, in testing we recently carried out for our forthcoming GTX 970 RAM investigation, Assassin's Creed Unity with 8x MSAA at 2560x1440 could tap out the full 6GB allocation of the existing Titan (albeit with single-digit frame-rates). Regardless, in an era where nobody seems to know for sure how much memory is required to future-proof a current GPU purchase, a full 12GB of RAM is the equivalent of taking off and nuking the problem from orbit.
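On the bandwidth side, the 384-bit interface maths is straightforward if we assume Titan X keeps the GTX 980's 7Gbps effective GDDR5 data rate (an assumption - the memory clock hasn't been confirmed):

```python
def gddr5_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: bus width in bytes times effective data rate."""
    return bus_width_bits / 8 * data_rate_gbps

gtx_980 = gddr5_bandwidth_gbs(256, 7.0)   # 224.0 GB/s, the 980's known figure
titan_x = gddr5_bandwidth_gbs(384, 7.0)   # 336.0 GB/s, a 50% increase
print(gtx_980, titan_x)
```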
The sheer size of GM200 guarantees only a modest yield of perfect chips. That suggests that Nvidia will almost certainly be sitting on a large cache of GM200 chips that may not make the grade as Titan X processors, but will find a use elsewhere.
The obvious use for these less-than-perfect processors is to disable CUDA cores on the defective areas, pair the chip with less GDDR5 memory and release it as a cut-down graphics card - this is exactly what happened with 2013's GK110 processor, where the higher-grade chips powered GTX Titan and Nvidia's compute-based Tesla products, with the rest of the yield used for the slightly less capable GTX 780. In the case of the Titan X, the question is really how long Nvidia wants to reserve the GM200 technology for the high-end premium market. Our guess? That'll all depend on the power level of AMD's forthcoming replacements to the R9 290 and 290X.
I was recently running Trails in the Sky at 10240x5760, just to see if I could. Only got frame drops with smoke effects on large parts of the screen.
Gotta get that supersampling.
...well, that was OT. But 28x 1080p!
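The "28x 1080p" figure checks out from raw pixel counts:

```python
# Pixels rendered at 10240x5760 versus a native 1920x1080 frame.
ratio = (10240 * 5760) / (1920 * 1080)
print(round(ratio, 1))  # 28.4
```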
A $1350 graphics card is gonna be a tough sell for anyone. If they price it that much, it's just because they can. The thing probably costs NVIDIA less than 600 dollars to make. Maybe even less than that.
The last numbers I remember seeing had the GTX580 prices at $499 with the cost to manufacture one being like $205. That's more than a 100% mark up.
So it's probably around $500-600 to manufacture. They could sell it at $900 and still make bank. Maybe even more.
But who am I kidding. This shit is probably going to sell out before you can even hit the buy button on Newegg.
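For what it's worth, the GTX 580 numbers quoted above (the $205 build cost is the poster's recollection, not a confirmed figure) imply this kind of margin:

```python
retail, build_cost = 499, 205   # GTX 580 launch price vs claimed cost to make
markup = (retail - build_cost) / build_cost
print(f"{markup:.0%} markup over cost")  # ~143%
```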
$600 to produce? Back in 2009 this card would have released for no more than $500-550. This is nVidia milking the idiot consumer base. I mean, the writing was on the wall when they launched the first Titan and it sold really well at the fair price of $999.99, which also turned out to be a cut-down part lol.
Hah, that is possible. But I was mainly talking about awaiting the day when something like Crysis 3 with 4x MSAA is doable @ 4K/120.