
Titan X announced, 12GB framebuffer, 8 billion transistors.

Y'all know that AMD is not the way to go if you do anything other than game on your card, right?

For instance, Blender Cycles and Octane Render (and the Octane for Blender plugin) don't officially support AMD. That super sucks and puts people like me out of camp red entirely, value be damned.

12GB GPUs are what the VFX industry has been waiting for. The X is like an M6000, except way cheaper. That's a pretty good value if you're a VFX hobbyist.
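
If you're wondering what "no official support" means in practice: with Nvidia you just flip Cycles over to CUDA and go; there's no equivalent official AMD path. A rough sketch from Blender's Python console (assuming a recent Blender build; the exact preference names vary by version):

```python
import bpy

# Point Cycles at CUDA devices (sketch only; property names differ across Blender versions).
prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'
prefs.get_devices()  # refresh the detected-device list

for dev in prefs.devices:
    dev.use = (dev.type == 'CUDA')  # enable each CUDA GPU, leave CPU entries off

bpy.context.scene.cycles.device = 'GPU'  # render the active scene on the GPU
```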
 

viveks86

Member
A new challenger appears? Man, this is exciting!

094647nccylhss4ixsczdu.jpg


http://wccftech.com/amd-r9-390x-nvidia-gtx-980ti-titanx-benchmarks/
 

riflen

Member
It would be hilarious if this card hits the streets at $800. Online tech stores would slow to a crawl on release day for this.

I very much doubt the Titan X will list for less than $999. A retailer in the UK let slip that they have 1,000 units to sell. That number will sell out in the blink of an eye at that price. There's no reason for Nvidia to price it lower.
 

ZOONAMI

Junior Member
I couldn't care less who likes what. It's the fact that you don't know how to group quotes and edit your posts accordingly to make yourself look less rabid, while also turning this into some kind of versus thread, when all I come here for is updates on benchmarks and specs.

Anyway.

If this Chiphell chart for BF4 is real... then I'm actually pretty impressed with the performance gain over the 980. This is going to be great for those looking for a devastating small form factor setup. I'm excited. It's going to look good in a Lian-Li PC-O6 or RVZ02.

GTX-TITAN-X-Chiphell-1.png


It would be hilarious if this card hits the streets at $800. Online tech stores would slow to a crawl on release day for this.

I pretty much only post from my phone, so sorry? And my edits are only for spelling/grammar.

And as far as SFF goes, my rig is a Hadron Air with a 4690K and a 290X, running a Samsung 28" 4K monitor, so I'm only interested in a single GPU. If Nvidia were honest enough to admit they lied, apologize, and change the damn specs on their website for the 970, I would probably buy a Titan X.
 

ZOONAMI

Junior Member
Just like many of us are OK with MS after the initial Xbox One plans, or with Sony after their Killzone 2 debacle. Companies do shitty things all the time out of short-sightedness, overambition, and greed. It's in their very nature and part of what makes them competitive. As consumers, we raise issues and put them in their place when they fuck up. The 970 issue will be pressed by those affected, and Nvidia will be put in their place. Doesn't mean the whole world needs to hold a grudge. We move on.

A lot more time has passed in your examples than in the 970 situation. Microsoft has actually changed their behavior and Xbox strategy because of public feedback. Nvidia has done literally nothing other than say, "our marketing people and engineers weren't communicating." So we're supposed to believe the engineers weren't aware of the published specs and didn't read any press about the card they had just released?
 
https://youtu.be/6y_GEvObFRQ

Seems pretty comparable to me. Usually within 5 frames, and in some games it outperforms the 980 at 4K. With the newest Omega drivers they're pretty much on par in the newest games at 4K (the video is from last September).

the video you just linked shows a near-constant 20% to 25% perf advantage for the 980 in everything except C3 and TR. that advantage only grows as you oc both chips. perf has also improved on the 980 since those were launch drivers.

http://www.hardocp.com/article/2015..._980_platinum_video_card_review/#.VQQs6uFfB8E

it's really not even close. the water cooling only affords them 80 extra MHz over what most cards do on air, so just subtract 5% from each overclocked 980 result.

http://www.hardocp.com/article/2015/02/17/gigabyte_gtx_980_g1_gaming_video_card_review/#.VQQti-FfB8E

this one has comparisons between factory oc versions, so perf can be compared at sub-max oc levels.
 

ZOONAMI

Junior Member
the video you just linked shows a near-constant 20% to 25% perf advantage for the 980 in everything except C3. that advantage only grows as you oc both chips.

If it were a constant ~10 frames, your 20%-25% would actually make sense. But it isn't, so no. 10-15%, yes. Given that a 980 is twice as expensive as a 290X, and a year newer, it's not worth it to me. And as stated, the newest Omega drivers close the gap. There is also an 8GB 290X for $100-$200 less than a 980 that does even better at 4K. But sure, just make up percentages. All I'm saying is that they are fairly close in performance, outside of heat and wattage.

Edit: The benchmarks you added aren't 4K, where the 512-bit bus of the 290X really starts to become valuable. Also, that Poseidon GTX 980 is like a $700 card. For that much you could buy a 295X2.
 
If it were a constant ~10 frames, your 20%-25% would actually make sense. But it isn't, so no. 10-15%, yes. Given that a 980 is twice as expensive as a 290X, and a year newer, it's not worth it to me. And as stated, the newest Omega drivers close the gap. There is also an 8GB 290X for $100-$200 less than a 980 that does even better at 4K. But sure, just make up percentages. All I'm saying is that they are fairly close in performance, outside of heat and wattage.

Edit: The benchmarks you added aren't 4K, where the 512-bit bus of the 290X really starts to become valuable. Also, that Poseidon GTX 980 is like a $700 card. For that much you could buy a 295X2.

please learn math and then check the 2 links i supplied. who cares about 4k? all gpus are shit and unplayable at 4k in anything halfway decent graphically.
 

ZOONAMI

Junior Member
please learn math and then check the 2 links i supplied. who cares about 4k? all gpus are shit and unplayable at 4k in anything halfway decent graphically.

I care about 4K because I have a 4K monitor. And if 30-50 fps is unplayable, I guess I should stop playing games.

I already looked at your 2 links and addressed them. Seems to me you're the one with the math problem. Still not seeing more than a ten-frame advantage, with higher averages because this is at 1440p. Most of these averages are beyond 60 frames for both cards, so you're just helping to prove the 290X is a very good card. Last time I checked, 10/80 is not 20-25%, but 12.5%. I said 10-15% in my previous post for the 980 in some games. In some games the 290X actually does better. Again, the only thing I'm saying is that they are fairly comparable in performance, outside of heat and wattage. I don't see why you have a problem with that.
 
If it were a constant ~10 frames, your 20%-25% would actually make sense. But it isn't, so no. 10-15%, yes. Given that a 980 is twice as expensive as a 290X, and a year newer, it's not worth it to me. And as stated, the newest Omega drivers close the gap. There is also an 8GB 290X for $100-$200 less than a 980 that does even better at 4K. But sure, just make up percentages. All I'm saying is that they are fairly close in performance, outside of heat and wattage.

Edit: The benchmarks you added aren't 4K, where the 512-bit bus of the 290X really starts to become valuable. Also, that Poseidon GTX 980 is like a $700 card. For that much you could buy a 295X2.

I really hope that everyone who says that actually buys a 295X2, so they can enjoy all that dual-GPU magic for themselves :D
 

ZOONAMI

Junior Member
I really hope that everyone who says that actually buys a 295X2, so they can enjoy all that dual-GPU magic for themselves :D

Well, it's a good thing for AMD that he's bringing up a ~$700 water-cooled 980 that isn't even in stock anywhere to prove a 980 can outperform a 290X by 10-15%.
 
I care about 4K because I have a 4K monitor. And if 30-50 fps is unplayable, I guess I should stop playing games.

I already looked at your 2 links and addressed them. Seems to me you're the one with the math problem. Still not seeing more than a ten-frame advantage, with higher averages because this is at 1440p. Most of these averages are beyond 60 frames for both cards, so you're just helping to prove the 290X is a very good card. Last time I checked, 10/80 is not 20-25%, but 12.5%. I said 10-15% in my previous post for the 980 in some games. In some games the 290X actually does better. Again, the only thing I'm saying is that they are fairly comparable in performance, outside of heat and wattage. I don't see why you have a problem with that.

30 to 50 fps in most games does suck. just a constant juddery mess. very few games feel smooth at a variable fps jumping around between 30 and 50. you're also relegated to medium settings to even attain that.

as for your video:
-bf4: typically an 8 fps lead, where the 290x is constantly jumping around between 30 and 44 fps
-metro: typically an 8 to 10 fps lead, where the 290x is mostly between 20 and 30 fps
-bioshock: typically a 10 to 20 fps lead, where the 290x is mostly between 40 and 50 fps

do you even math?
 

ZOONAMI

Junior Member
30 to 50 fps in most games does suck. just a constant juddery mess. very few games feel smooth at a variable fps jumping around between 30 and 50. you're also relegated to medium settings to even attain that.

as for your video:
-bf4: typically an 8 fps lead, where the 290x is constantly jumping around between 30 and 44 fps
-metro: typically an 8 to 10 fps lead, where the 290x is mostly between 20 and 30 fps
-bioshock: typically a 10 to 20 fps lead, where the 290x is mostly between 40 and 50 fps

do you even math?

You're ignoring 2/5 of the games in the video, and you're inflating the fps advantage of the 980 in the other games. The 980 also has a variable frame rate that jumps all around. And again, AMD's drivers have improved performance more for the 290X since September than Nvidia's have for the 980.

I enjoy playing my games at 4K, even though they aren't a locked 60 fps. I imagine there are other people out there with 4K monitors who feel the same.

I'm done talking about the 290X versus the 980 - it should be clear to most folks that a 290X offers comparable performance for half the cost. If you aren't concerned about 50 more watts and a bit higher temps, it's a good way to go.
 
You're ignoring 2/5 of the games in the video, and you're inflating the fps advantage of the 980 in the other games. The 980 also has a variable frame rate that jumps all around.

I enjoy playing my games at 4K, even though they aren't a locked 60 fps. I imagine there are other people out there with 4K monitors who feel the same.

I'm done talking about the 290X versus the 980 - it should be clear to most folks that a 290X offers comparable performance for half the cost. If you aren't concerned about 50 more watts and a bit higher temps, it's a good way to go.

just went to the DF page that video is from

bf4 - 19%
metro - 20%
bioshock - 31%.

yep, my numbers are clearly inflated. math is fun. these are also at 4k, where that 512-bit bus is supposed to be laying the smack down.
 

dr_rus

Member
You're ignoring 2/5 of the games in the video, and you're inflating the fps advantage of the 980 in the other games. The 980 also has a variable frame rate that jumps all around. And again, AMD's drivers have improved performance more for the 290X since September than Nvidia's have for the 980.

I enjoy playing my games at 4K, even though they aren't a locked 60 fps. I imagine there are other people out there with 4K monitors who feel the same.

I'm done talking about the 290X versus the 980 - it should be clear to most folks that a 290X offers comparable performance for half the cost. If you aren't concerned about 50 more watts and a bit higher temps, it's a good way to go.

What? Really?

If you mean the 970 vs the 290X, then you're right to some extent. But then you'll have to consider that 290X owners will miss out on some tech that is, or will be, coming to the 970. This has always been the case between AMD and NV GPUs.
 

Momentary

Banned
I hate arguments like this; it wasn't funny 8 years ago and it's not funny now. Consoles have exclusives, we get it.

I think it's more of a joke about it being the new benchmark in graphics, with its Vaseline resolution, black bars, and closed-corridor design.
 

ZOONAMI

Junior Member
just went to the DF page that video is from

bf4 - 19%
metro - 20%
bioshock - 31%.

yep, my numbers are clearly inflated. math is fun. these are also at 4k, where that 512-bit bus is supposed to be laying the smack down.

You must be terrible at math: http://www.eurogamer.net/articles/digitalfoundry-2014-nvidia-geforce-gtx-970-review

3840x2160 (4K)      | GTX 970 | GTX 970 (OC) | GTX 980 | GTX 780 Ti | R9 290X | R9 290 | GTX 780
BioShock Infinite   | 50.3    | 58.2         | 57.4    | 50.7       | 43.7    | 40.6   | 39.9
Tomb Raider         | 35.1    | 40.5         | 39.5    | 43.3       | 40.1    | 37.1   | 34.1
Battlefield 4, High | 40.9    | 46.8         | 46.2    | 41.9       | 38.9    | 36.7   | 35.8
Metro: Last Light   | 32.8    | 37.7         | 36.4    | 33.1       | 30.4    | 28.6   | 27.6
Crysis 3, High      | 32.0    | 37.2         | 34.2    | 33.4       | 35.2    | 32.8   | 28.1


Bioshock - 980 has a 24% advantage
BF4 - 16%
Metro - 17%

In Crysis 3 and Tomb Raider the 290X beats the 980 (3% and 2%, respectively).

If you average all 5 games out you get a 10.4% advantage for the 980. So yes, you are inflating things.
 

BeEatNU

WORLDSTAAAAAAR
This banter is starting to be a bit ridiculous.
Every time this is bumped I get hyped that some legit information about the Titan X has been posted. Not its lil bro, the 980.
 
You must be terrible at math: http://www.eurogamer.net/articles/digitalfoundry-2014-nvidia-geforce-gtx-970-review

3840x2160 (4K)      | GTX 970 | GTX 970 (OC) | GTX 980 | GTX 780 Ti | R9 290X | R9 290 | GTX 780
BioShock Infinite   | 50.3    | 58.2         | 57.4    | 50.7       | 43.7    | 40.6   | 39.9
Tomb Raider         | 35.1    | 40.5         | 39.5    | 43.3       | 40.1    | 37.1   | 34.1
Battlefield 4, High | 40.9    | 46.8         | 46.2    | 41.9       | 38.9    | 36.7   | 35.8
Metro: Last Light   | 32.8    | 37.7         | 36.4    | 33.1       | 30.4    | 28.6   | 27.6
Crysis 3, High      | 32.0    | 37.2         | 34.2    | 33.4       | 35.2    | 32.8   | 28.1


Bioshock - 980 has a 24% advantage
BF4 - 16%
Metro - 17%

In Crysis 3 and Tomb Raider the 290X beats the 980 (3% and 2%, respectively).

If you average all 5 games out you get a 10.4% advantage for the 980. So yes, you are inflating things.

is this real life???
 

viveks86

Member
FFS, you guys. Can we stop with the 290 vs 970/980 already? If you want to compare benchmarks, at least compare them to the Titan X. If not, please find another thread to derail.
 

ZOONAMI

Junior Member
FFS, you guys. Can we stop with the 290 vs 970/980 already? If you want to compare benchmarks, at least compare them to the Titan X. If not, please find another thread to derail.

Yeah, I'm done. There is an inherent comparison to the Titan X, though, given that it's supposed to offer 36% more performance than a 980.
 

ZOONAMI

Junior Member
actually it's not. math doesn't work in real life the way it apparently works in your reality.

Oh, my bad, it's too early. Sorry. I see what I did there: dividing instead of subtracting and then dividing. The numbers I had would mean the 290X offers 76% of the performance of a 980 in BioShock Infinite lol. Still, it does do better than the 980 in some games, and it's still pretty close in most. For the money, it's a great card. Going back to sleep now lol.
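
Here's the math spelled out, for anyone who wants it; a minimal Python sketch using the BioShock Infinite 4K averages from the DF table quoted above:

```python
# 4K averages from the Digital Foundry table quoted earlier in the thread.
fps_980, fps_290x = 57.4, 43.7  # BioShock Infinite, 3840x2160

# What I did: divide first, giving the 290X's share of the 980's performance.
share = fps_290x / fps_980            # ~0.761 -> "76% of a 980"
mistaken_gap = 1 - share              # ~0.239 -> the "24% advantage" I posted

# The right way: subtract, then divide by the 290X's baseline.
advantage = (fps_980 - fps_290x) / fps_290x   # ~0.314 -> ~31%

print(f"share {share:.1%}, mistaken gap {mistaken_gap:.1%}, actual advantage {advantage:.1%}")
```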
 

tuxfool

Banned
Y'all know that AMD is not the way to go if you do anything other than game on your card, right?

For instance, Blender Cycles and Octane Render (and the Octane for Blender plugin) don't officially support AMD. That super sucks and puts people like me out of camp red entirely, value be damned.

12GB GPUs are what the VFX industry has been waiting for. The X is like an M6000, except way cheaper. That's a pretty good value if you're a VFX hobbyist.

The VFX industry is already using those 12GB Quadros. They won't change to the Titan X unless it is certified to work with the programs they use (which check whether you're running a professional card). AFAIK the original Titan was in that class, but the rumors that this one doesn't support fp64 make me wonder if they are targeting it as a premium gaming card (then again, 12GB for gaming is really overkill).
 

tuxfool

Banned
Damn this thing looks good internally.

NVIDIA-GeForce-GTX-TITAN-X-7-900x585.jpg


It's just begging to have an EK Waterblock slapped on it.

Looks like any other high-end graphics card, with the exception of that massive GPU.

Also, that VRM only appears to have 6 phases for the GPU, which might not be pretty. It gives credence to the rumors that it's voltage-locked.
 

Momentary

Banned
Looks like any other high-end graphics card, except for that massive die size.

Also, that VRM only appears to have 6 phases for the GPU, which might not be pretty. It gives credence to the rumors that it's voltage-locked.

They don't all look the same. Reference PCBs do look similar from the 780 to the 980 to the TITAN X. There's just something about it having more symmetry that I like. The 980 Strix looks good to me too.
253569-4.jpg
 

tuxfool

Banned
They don't all look the same. Reference PCBs do look similar from the 780 to the 980 to the TITAN X. There's just something about it having more symmetry that I like. The 980 Strix looks good to me too.

I'd have to ask: what looks bad? They all use a similar layout more often than not.
 
AMD fanboys are confusing best performance with price/performance on a budget. Nvidia cards are typically better, and the drivers and software suite are better and more frequently updated.
 

Raticus79

Seek victory, not fairness
Star Citizen should be one game that can actually use the 12GB. 8K textures!

I'll be picking up two of these for VR SLI. I needed another DisplayPort output anyway - ROG Swift + BenQ BL3201PH (4K60) - and I only had one port on my Titan, so I'm just swapping connections at the moment.
 