
The Witcher 3 runs at 1080p ULTRA ~60 fps on a 980

Qassim

Member
What are you asking for? You want devs to spend resources on features that less than 1% of their player base can use even though they pay the exact same amount of money? This isn't research. It's business. For ultra high end gamers, it's really about cranking everything up to the max including resolution and AA, amongst other things. Don't expect features that are not even supported by most cards. Crysis did that because they sold the engine. CDPR isn't selling tech.

I'm not asking developers to include these things, I'm asking for consumers not to create an environment where developers CAN'T do it. It's a different argument.

I understand and don't complain that developers often don't include these ultra-enthusiast settings. THIS IS NOT MY ISSUE. My issue is gamers basically demanding that developers not include them, because their feelings get hurt when the settings screen isn't set to all max values on their particular piece of hardware.

I'll repeat: my problem here isn't with developers at all. My problem here is *entirely* with a type of consumer, some of whom are in this thread.
 

misho8723

Banned
Without going to Twitter:
[image]
 

viveks86

Member
I'll repeat: my problem here isn't with developers at all. My problem here is *entirely* with a type of consumer, some of whom are in this thread.

Thanks for clearing that up. I'm not sure this is even a big enough problem to deserve such a debate, then. People spending in excess of $500 on a card to play games are bound to have high, and sometimes unreasonable, expectations. Let them vent. I don't think it creates an environment that prevents developers from doing what they want; there aren't THAT many of these people.
 

Durante

Member
Thanks for clearing that up. I'm not sure this is even a big enough problem to deserve such a debate, then. People spending in excess of $500 on a card to play games are bound to have high, and sometimes unreasonable, expectations. Let them vent. I don't think it creates an environment that prevents developers from doing what they want; there aren't THAT many of these people.
I wouldn't be so sure about that. E.g. Dying Light had its draw distance setting range nerfed because of the "ALL MAX SETTINGS" clientele.
 

viveks86

Member
I wouldn't be so sure about that. E.g. Dying Light had its draw distance setting range nerfed because of the "ALL MAX SETTINGS" clientele.

Hmmm… I think devs choose to do that because they don't want settings that make them look bad. It's a design philosophy that exists in all kinds of software, not just games: if a feature isn't optimized, don't offer it as an option. As a consumer who can afford the hardware, I can see that being frustrating. But as a dev, it seems like the right call to make.

The solution, imo, is to make such settings available outside the game, in the form of ini files or mods, for those really interested.
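Something like this, say (file and key names entirely made up, just to illustrate the idea; not actual TW3 settings):

    ; user_settings.ini -- hypothetical example
    [Rendering]
    DrawDistance=10      ; in-game slider might stop at 5
    GrassDensity=4000    ; beyond the UI range, at your own risk
    ShadowMapSize=4096

That way the UI only exposes what the devs actually validated, and anyone who wants to push past it can opt in knowingly.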
 

Ooccoo

Member
I love how CD Projekt RED has been on damage control ever since they revealed actual gameplay of Wild Hunt. That's what happens when you show a CG trailer that isn't possible to reproduce with current hardware and manpower. I still think the game looks gorgeous but come on, that video was impossible to translate into actual gameplay with our technology. I still think this developer is great, the game will sell like hotcakes and the DLC approach alone beats most other devs on the market.
 

Qassim

Member
I didn't say resolution was everything. I am saying that people who want to push their hardware can/will do so by increasing resolution.

So they don't want extra graphical settings? I don't buy that at all. As an enthusiast, I'd like as many as the developer is willing to offer.

I'm not entirely sure you're getting my original point because I still don't quite understand where you're coming from.
 
I wouldn't be so sure about that. E.g. Dying Light had its draw distance setting range nerfed because of the "ALL MAX SETTINGS" clientele.

People just want to crank things to the max for no bloody good reason and then complain when it doesn't run well. There's definitely something fishy about the performance of that setting though. Way too big a hit for what it actually gives you.

Probably a good thing they lowered the in-game max. You can still crank the setting to the original max (10) via my CE table :)
 

Lulubop

Member
That's PC gaming, sadly.

No matter what you buy, it'll be outclassed in less than 6 months. So you'd better live with it, or spend lots of money each year. I'm happy with my 3.5GB 970, but it'll be crap in a year. That's life for us PC gamers.

Lol, crap, smh. Is this shit for real?
 

viveks86

Member
Is everyone really that certain that their maxed settings will not match the reveal? I hope they do for 2 reasons:
  • I can push my new rig to its limits
  • I can watch all the naysayers eat crow. I mean holy shit at the number of people who seemed to have jumped to that conclusion!

Both sound equally enticing at this point. Please make it happen, CDPR!
 
Keep reading. Turns out it wasn't confirmed to be high on consoles. The opposite. They said the textures on PS4 were LOWER than high on PC, LOD was like half of PC on high, and shadows were of lower quality too. Also, it's running at 30 FPS.

Consoles will run this game at a mix of low, medium, and high (mostly medium), just like most multi-plats. Your particular rig should compete fine with a PS4.

You should get it on PC, since the PS4 probably won't have AF (anisotropic filtering).

Whatever the console settings end up being, your PC will most likely be able to match them and likely slightly exceed them.


That's cool then, thanks for the info.
 
That's PC gaming, sadly.

No matter what you buy, it'll be outclassed in less than 6 months. So you'd better live with it, or spend lots of money each year. I'm happy with my 3.5GB 970, but it'll be crap in a year. That's life for us PC gamers.

Lol, crap in a year. No it won't. People just have to learn that they shouldn't expect to run everything on max, nor expect developers to make sure max settings run on lower-specced PCs.

People also have to learn that there's no such thing as future-proofing, and that buying a super-expensive graphics card means you're getting relatively less performance for your money.

The Xbox One and PS4 have games that look way worse than the same games on PC; does that mean those purchases are crap? Hell, nowadays it seems a GPU as powerful as the ones in the consoles gives you about the same performance.

Is everyone really that certain that their maxed settings will not match the reveal? I hope they do for 2 reasons:
  • I can push my new rig to its limits
  • I can watch all the naysayers eat crow. I mean holy shit at the number of people who seemed to have jumped to that conclusion!

Both sound equally enticing at this point. Please make it happen, CDPR!

Tbh, I don't even see much of a difference between the carefully picked trailer shots and the long stretches of unedited, non-cherry-picked gameplay footage.

Honestly, lighting-wise I even prefer this screenshot. Without massive amounts of contrast everywhere, people are quick to cry downgrade. I doubt the extra effects will suddenly make it look entirely different, though; I'm just expecting some Nvidia effects.
 
That's PC gaming, sadly.

No matter what you buy, it'll be outclassed in less than 6 months. So you'd better live with it, or spend lots of money each year. I'm happy with my 3.5GB 970, but it'll be crap in a year. That's life for us PC gamers.

Dude, wat

Sure, you won't be running stuff at ultra, but it'll have a useful life for a few years. Definitely at least until the end of the console generation, and a few years down the line.
 
So for this beast (an Intel i7-4790 with 16GB of RAM and an NVIDIA GTX 980) you pay more than $/€1,000, right? Even at that price point you aren't able to play it at a locked 60 fps!

Price vs. quality is completely out of proportion.
Haha, I wish that were the case. With my 780 Ti and 3570K @ 4.2 GHz I still can't get 60 fps in many current-gen games (AC Unity, Dead Rising 3, Watch Dogs, and a few others). That's at 1080p with no MSAA.
 

viveks86

Member
Tbh, I don't even see much of a difference between the carefully picked trailer shots and the long stretches of unedited, non-cherry-picked gameplay footage.

Honestly, lighting-wise I even prefer this screenshot. Without massive amounts of contrast everywhere, people are quick to cry downgrade. I doubt the extra effects will suddenly make it look entirely different, though; I'm just expecting some Nvidia effects.

The biggest improvements I'm expecting are in HDAO+ and tessellation. I'm almost certain they have better ground textures behind some setting in there. All of these, in my opinion, should bring it very close to the reveal trailer. Only time will tell if they match or exceed it. But instead of waiting, we are all raising our pitchforks prematurely, as always.
 
The biggest improvements I'm expecting are in HDAO+ and tessellation. I'm almost certain they have better ground textures behind some setting in there. All of these, in my opinion, should bring it very close to the reveal trailer. Only time will tell if they match or exceed it. But instead of waiting, we are all raising our pitchforks prematurely, as always.

I would kind of expect better textures to be shown in a screenshot like this, since it wouldn't really be difficult to implement, but we'll see.

I am sure most people are thinking of much more drastic changes than you mention. Although I agree with you, those are pretty much the points that bother me right now about the screenshot.

I have my doubts about whether there will be tessellation there, though.
 
Haha, I wish that were the case. With my 780 Ti and 3570K @ 4.2 GHz I still can't get 60 fps in many current-gen games (AC Unity, Dead Rising 3, Watch Dogs, and a few others). That's at 1080p with no MSAA.

I have the exact same setup. I got Watch Dogs to 60 FPS with TXAA enabled, but I had to tweak a few settings to get there. I expect to do the same with The Witcher 3.
 

Lunar15

Member
What's this 6 months bullshit? My computer ran every game I threw at it completely fine for 6 years. We're talking high settings, if not a mixture of high and ultra. It wasn't even a top-end PC when I built it! When I built my new one last year, I didn't have to upgrade everything, and I was able to build a pretty top-end PC for less than my first computer.

Sure, I occasionally missed out on some extra bells and whistles, but I was able to make every game I played during that 6-year period look as good as, if not better than, current consoles, and achieve stable framerates.
 

b0bbyJ03

Member
So you're saying CDPR shouldn't include an ultra-enthusiast texture setting (for example) that requires 6GB of VRAM, or some insane LoD distance settings, or some special lighting or particle effects, just because some people can't handle the idea that their settings screen isn't on all max values?

If so, that's really dumb. Developers should be able to include settings that require multiple GPUs, or even GPUs that don't exist yet, to run at a reasonable framerate. That's one of the best things about PC gaming. Limiting the available settings to what's possible on a single high-end GPU today would be bad for the platform.

This is a massive open-world game that is said to have over 100 hours of gameplay. I wouldn't say they limited anything; it's more like time is limiting them. Everything has to have a budget and scope, and they chose to go for scale while still maintaining impressive graphics. I'm sure they could have made a much prettier game had they scaled down the world, and I'm sure the game would be less graphically impressive if the scale were larger than it is now. Things need to be viewed in the proper context.
 

daninthemix

Member
To be honest, as long as it has a decent motion blur implementation I'll be happy with a stable 30fps.

Ran DAI at 30, though the motion blur in that was either nonexistent or ineffective for me.
 

Daingurse

Member
Hope my GTX 970 and 980X @ 4.0 GHz serve me well. I don't think I'll be off LGA 1366 before this game drops, and I can't overclock my CPU any further. I'll play this from my bed, so I won't need much AA; some kind of temporal post-processing AA, or SMAA, would be good enough.
 

Kezen

Banned
No conclusive evidence of that, factually speaking.

No conclusive evidence of the reverse, either. Besides, we have seen what the console versions look like: the game was shown on Xbox One.
It will probably look about the same on PS4, except for the resolution.

Not to mention the PC version at high settings has been showcased, which is basically console settings.
 
To be honest, as long as it has a decent motion blur implementation I'll be happy with a stable 30fps.

Ran DAI at 30, though the motion blur in that was either nonexistent or ineffective for me.

DAI has no per-object blur; it was using an older version of Frostbite in that regard, unfortunately.

All footage from TW3 so far has had no per-object blur, just a screen-space implementation, sadly. I hope the "ultra" post-processing setting includes it. It does wonders for games, IMO.
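For anyone unclear on the difference: per-object blur means every object writes its screen-space velocity (current position minus last frame's) into a velocity buffer, and a post pass then smears each pixel along its own vector. A screen/camera-only implementation derives velocity from camera movement and depth alone, so moving objects stay sharp. Very roughly, the blur pass does something like this (a toy CPU sketch with made-up names; real engines do this in a shader):

    # Toy sketch of a velocity-buffer blur pass. All names are made up.
    # color[y][x] -> (r, g, b); velocity[y][x] -> (vx, vy) in pixels.
    def blur_pixel(color, velocity, x, y, samples=8):
        vx, vy = velocity[y][x]          # motion of whatever covers this pixel
        h, w = len(color), len(color[0])
        acc = [0.0, 0.0, 0.0]
        for i in range(samples):
            t = i / (samples - 1) - 0.5  # step from -0.5 to +0.5 along the vector
            sx = min(max(int(round(x + vx * t)), 0), w - 1)
            sy = min(max(int(round(y + vy * t)), 0), h - 1)
            for c in range(3):
                acc[c] += color[sy][sx][c]
        return tuple(a / samples for a in acc)

With a camera-only version, the velocity buffer is filled from the camera transform instead of per-object motion, so a character sprinting across a still frame gets zero velocity and no blur, which is presumably why moving characters look sharp in the footage so far.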
 

i-Lo

Member
No conclusive evidence of the reverse, either. Besides, we have seen what the console versions look like: the game was shown on Xbox One.
It will probably look about the same on PS4, except for the resolution.

Not to mention the PC version at high settings has been showcased, which is basically console settings.

It's far more prudent to be skeptical at this point, given the numerous opportunities they had to show both versions publicly but didn't. And that E3 demo was a small vertical slice.

Also, is that your claim or theirs? Forgive me, but I can't take that claim seriously unless the product is shown. And before someone thinks I'm a troll: I have the game preordered for PS4, and it simply irks me that they have been continuously showing the PC version without even a hint of PS4 gameplay; only second-hand impressions have been provided thus far for the console versions (especially the PS4 version).
 

erawsd

Member
So they don't want extra graphical settings? I don't buy that at all. As an enthusiast, I'd like as many as the developer is willing to offer.

I'm not entirely sure you're getting my original point because I still don't quite understand where you're coming from.

Maybe I do misunderstand. It seemed like you were distressed that "ultra settings" can be achieved with a single 980. My response is that you can increase resolution if you want to further stress your hardware. Resolution isn't everything, but it is a significant "extra graphical setting".

Beyond resolution, sure, it would be nice to have other settings that can be cranked up. That's just not something I would expect from an ultra preset that, presumably, has to be optimized like any other preset. Whether things can be tweaked beyond that in the UI or via ini is up in the air at this point. We do know that ubersampling is not part of the ultra spec; I think HairWorks may even be a separate toggle.
 