7990 6GB would be the better choice after doing some research: http://www.anandtech.com/bench/product/1072?vs=1041
Yes, and the Nvidia ecosystem is preferable too.
Performance-wise they might be inferior, but AMD will probably win on price/performance again, right? That ratio is a lot more important to me than pure performance. I'll probably upgrade around Q3 2015.
7990 6GB would be the better choice after doing some research
7990 is a dual-GPU card, meaning the 6GB is split between the two chips, I believe. Effectively 3GB, unless I'm wrong.
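Rough math on that, assuming alternate-frame rendering where each GPU keeps its own full copy of every texture and buffer (the usual setup for these cards; the function here is just my own illustrative sketch, not anything official):

```python
# Under AFR (alternate-frame rendering), assets are duplicated per GPU,
# so usable VRAM is the per-GPU amount, not the advertised total.
def effective_vram_gb(total_vram_gb, num_gpus):
    """Usable VRAM for a multi-GPU card with per-GPU asset duplication."""
    return total_vram_gb / num_gpus

print(effective_vram_gb(6, 2))  # 7990's "6GB" -> 3.0 GB usable
```

So a "6GB" 7990 behaves like a 3GB card for a game's VRAM budget.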
For years now I have opted for the PC version over console whenever the option existed. I think this game might be the first in a while where I go with the console option. These requirements just make me feel like this will be a poorly optimized PC port.
Can someone convince me to buy this for PC instead of PS4?
The textures you need more than 4GB of VRAM for are likely not even showcased in the screenshots out there now.
Sadly, it wasn't obvious to a lot of people here. For example, legacyzero got rekt by listening to the "2GB is enough" crowd. Even 3GB isn't enough.
So the highest-VRAM single-GPU card is the 280X?
Yep.
And to think that I bought my 2GB 670 with "Future-Proofing" in mind.
FUCK.
for AMD or what do you mean?
Highest VRAM card is the Titan, the 6GB version of the 780, or the Titan Black. The 700 series averages 2GB, with some 4GB variants, and the 900 series is currently all 4GB.
Highest AMD would be the 290 and 290X with 4GB.
nope
Yep.
The fact that console games pretty much run at medium settings with bad LODs and filtering at 30fps. What are your GPU and CPU? That helps more. Just because this game offers an ultra quality setting that perhaps requires 6GB of VRAM does not mean the console versions are anywhere near that. In fact, it most likely means the opposite in some way.
No, you can't look at a picture and think: "These graphics look like X VRAM."
There are many aspects of graphics that have a significantly smaller impact on VRAM. Texture quality, a very significant one, is not particularly well showcased in that screenshot.
Fuck Nvidia! I'm kidding, but seriously: I like the Raptr client, I like AMD Evolved, and I've never had an AMD card break down, even though I've put them through hell. I'm sure they'll deliver something for me in '15.

It depends on the price points and timing. For example, Nvidia's new cards blow the AMD counterparts out of the water in price/performance. But if you are looking at something less expensive than a GTX 970, you may still be well off with an AMD card. That is, until a GTX 960 appears; then it is likely the better choice.
I'd favor Nvidia.
Are you in the States? See if you can sell it on Craigslist or something. I think CL is worldwide anyway. Then buy that 4GB 900 series card for $300. Actually, it might be better to just wait till 2015 and get a 6-8GB card.
GTX 680 2GB and i5-2400.
No, 8GB on some custom R9 290Xs, like Sapphire's R9 290X Vapor-X (which was limited to 250 cards, though, IIRC).
Isn't the Titan a dual GPU?
Turn that 6 into a 9 and your problems will be solved.
No
The Titan Z is the dual-GPU card. The original Titan, released in February 2013, and the Titan Black, released late last year, are single-GPU cards.
Your CPU and GPU are a good deal more powerful than the PS4's. Why not just wait till the game comes out and people report what the settings do and how they affect performance? In any case, you will have a higher performance baseline than the PS4, and in the worst case you have to contend with slightly worse textures.
Oh come on now. You're fine.
I'll want to see (a) screenshot comparisons of High vs. Ultra and (b) benchmarks at High and Ultra.
I suspect that High and Ultra textures will look similar to each other in the eyes of most of us. Ultra's clearly meant for enthusiasts and for future proofing. Shadow of Mordor should still look very good on High, likely better than PS4.
Some of you need to chill. Your shiny new GTX 970 isn't obsolete.
More open worlds and higher-resolution textures = high VRAM requirements. There were already end-of-generation games that were easily hitting the cap. I'm far from a genius on the topic, but it was obvious that VRAM requirements were going to double or triple.
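Back-of-the-envelope math on why higher-resolution textures blow up VRAM so fast (assuming uncompressed RGBA at 4 bytes per pixel and a full mip chain, which is my own simplification; real games use compressed formats, but the scaling is the same):

```python
def texture_vram_mb(width, height, bytes_per_pixel=4, mipmaps=True):
    """Estimated VRAM footprint of one texture, in MB."""
    size = width * height * bytes_per_pixel
    if mipmaps:
        size = size * 4 // 3  # a full mip chain adds roughly 1/3 on top
    return size / (1024 * 1024)

# Doubling resolution in each dimension roughly quadruples the cost:
print(texture_vram_mb(2048, 2048))  # ~21.3 MB
print(texture_vram_mb(4096, 4096))  # ~85.3 MB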
Smart move. Anyone kicking themselves over this should've seen it coming rather than convincing themselves that 2-4 gigs was going to max out next-gen games. It's just going to keep going down this road. Can't wait to see what PC performance and IQ looks like a couple of years from now. When is the whole "unified memory" thing happening again?
Did this thread really turn into a 15 page thread about people bitching about their old video cards becoming obsolete?
Come on PC guys, act like you've been there before.
No? What is that supposed to mean? Don't insult me because I called you out on your ignorant comment. You suggested that people complaining about the VRAM requirement have old cards, when the reality is that some people who have brand new cards are also amongst those complaining because they also don't have the necessary memory.
Did not think that my 3GB 780 Tis would meet their match so soon. I just hope I can pull off another year or so without running into VRAM problems in most games.
By OLD you mean one-week-old cards like the GTX 980/970, right?
Guys, remember when Call of Duty: Ghosts on PC would NOT run unless you met a mandatory 6GB minimum RAM requirement? They patched that shit out, though.
We're in the same boat; we both have a 780 Ti. The best case for us is to go down one notch to the 780 6GB, but how much of a performance drop would that be?
GTA V: 12GB VRAM needed, 1.5GB system RAM, and a 12-core processor INCOMING!
So wait, Nvidia might just skip 20nm and go straight to 16nm? (from the article) My 2GB 690s had a good run.
To everyone considering upgrading, I would wait until 20nm or lower is released next year.