
The Witcher 3 runs at 1080p ULTRA ~60 fps on a 980

Haha, I wish that were the case. With my 780 Ti and 3570K @ 4.2 GHz I still can't get 60 fps in many current-gen games (AC Unity, Dead Rising 3, Watch Dogs, and a few others). That's at 1080p with no MSAA.

AC Unity is CPU-bottlenecked/core-dependent. Six-core and eight-core rigs got almost double the performance of a four-core rig last time I checked (around launch).

Dead Rising 3 is a messy port.

Watch Dogs could be considered a bad/mediocre port.

Those are practically the worst-case-scenario games for 90% of the hardware setups out there. Your setup is still really good.
 

Kinthalis

Banned
It's too early for that. We do not have side-by-side gameplay, only accounts. No, nothing has been "debunked" yet.

If it's too early for that, when we at least have SOME evidence, then it sure as hell is too early to be claiming that consoles = high on PC, since the only evidence we have at the moment says the opposite!
 

viveks86

Member
I would kind of expect better textures to be shown in a screenshot like this, since it wouldn't really be difficult to implement, but we'll see.

I am sure most people are imagining much more drastic changes than the ones you mention. I agree with you, though; those are pretty much the points about the screenshot that bother me right now.

I have my doubts about whether there will be tessellation there, though.

Tessellation as a feature is there for sure; we just don't know to what extent.

http://www.dsogaming.com/interviews/cd-projekt-red-talks-the-witcher-3-tech-tessellation-physx-dx11-2-windows-8-global-illumination/
 

RedSwirl

Junior Member
So are people who went with i7s vindicated at this point? RPS recently ran an article arguing that for gaming you really don't need more than four threads.

In any case, this game is going to be a trial for my i5-4670K and 2 GB 760. I haven't even been able to overclock the 4670K yet. I imagine I'll probably still get better LOD than the console versions, but that's it.
 

Kezen

Banned
It's far more prudent to be skeptical at this point, given the numerous opportunities they had to show both versions publicly but didn't. And that E3 demo was a small vertical slice.

Also, is that your claim or theirs? Forgive me, but I can't take that claim seriously unless the product is shown. And before someone thinks I am a troll: I have the game preordered for PS4, and it simply irks me that they have been continuously showing the PC version without even a hint of PS4 gameplay; only second-hand impressions have been provided thus far for the console versions (especially the PS4 version).
CDPR claimed the consoles are using the PC's high settings:
By comparison, the graphics of the console versions are equivalent to high settings on PC.

If it's too early for that, when we at least have SOME evidence, then it sure as hell is too early to be claiming that consoles = high on PC, since the only evidence we have at the moment says the opposite!
At least that's what CDPR claimed; we will see if it's true, but nothing I've seen of The Witcher 3's high-settings footage seems "undoable" on consoles, to me.
 

Kinthalis

Banned
Right now, per-core performance is much more important than the overall number of hardware threads. But once the new APIs hit, that balance will be a lot more even.

Having powerful cores, and a large number of them, will be beneficial. Games tend to spawn far more threads than the CPU can "handle" in hardware, and the scheduler distributes that workload among the available hardware threads. It's the quality of the work being done in those threads that determines how efficiently the work gets done.
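To make that concrete, here's a minimal C++ sketch (my own illustration, not anything from an actual engine): it queries how many hardware threads the CPU exposes, then spawns four times that many workers and lets the OS scheduler time-slice them across the available cores.

```cpp
#include <iostream>
#include <thread>
#include <vector>

int main() {
    // Hardware threads the CPU exposes (cores x SMT).
    // May return 0 if the value can't be determined.
    unsigned hw = std::thread::hardware_concurrency();
    if (hw == 0) hw = 4; // fall back to a reasonable guess

    std::cout << "Hardware threads: " << hw << "\n";

    // Spawn several times more software threads than hardware threads,
    // the way a game's job system might; the OS scheduler multiplexes
    // them onto the physical cores.
    std::vector<std::thread> workers;
    for (unsigned i = 0; i < hw * 4; ++i) {
        workers.emplace_back([] {
            // Stand-in for a real job (AI tick, physics step, draw prep).
            long sum = 0;
            for (long n = 0; n < 1000000; ++n) sum += n;
            (void)sum; // result intentionally unused; this is just busy-work
        });
    }
    for (std::thread &t : workers) t.join();
    return 0;
}
```

Spawning more threads doesn't create more compute, of course; what matters is how much useful work each one carries, which is the quality point above.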
 

dr_rus

Member
Pffft, who cares about 1080p on PC? That resolution was interesting five or so years ago.
Now let's talk about 5120x3200...
 

b0bbyJ03

Member
CDPR claimed the consoles are using the PC's high settings:
By comparison, the graphics of the console versions are equivalent to high settings on PC.


At least that's what CDPR claimed; we will see if it's true, but nothing I've seen of The Witcher 3's high-settings footage seems "undoable" on consoles, to me.

Can you imagine how the PR would sound if they said "the console versions are using medium settings"? I'm pretty sure this game will have some high settings, but I'm also sure it won't be across the board. I expect consoles to have a mixture of low to high, and maybe ultra for the textures.
 

low-G

Member
Can you imagine how the PR would sound if they said "the console versions are using medium settings"? I'm pretty sure this game will have some high settings, but I'm also sure it won't be across the board. I expect consoles to have a mixture of low to high, and maybe ultra for the textures.

Because the world inside your head is likely truer than what the developers themselves say, now?
 

M0G

Member
Well, that's my rig except in SLI, so I'm hoping the Ultra spec includes some reasonable AA so I don't have to downsample from 4K to get rid of the obligatory jaggies. Damn, I miss the days when we could just force SGSSAA on anything that didn't have it in-game.
 

Oh yeah, I just saw the part about characters, so it wasn't clear whether they would add it in.

Well, hopefully it will look good, but I can't think of a single example where I thought tessellation made a big enough difference to justify the performance cost. Then again, I've never had a GPU that could utilize it well, so we'll see.

So are people who went with i7s vindicated at this point? RPS recently ran an article arguing that for gaming you really don't need more than four threads.

In any case, this game is going to be a trial for my i5-4670K and 2 GB 760. I haven't even been able to overclock the 4670K yet. I imagine I'll probably still get better LOD than the console versions, but that's it.

Why would you assume that's the minimum CPU you'd need for that? They likely aren't going to spec out the most minimal PC that can run their maximum settings. I am sure it will run perfectly fine on a quad-core; the bigger difference is likely to be on the GPU side.
 

b0bbyJ03

Member
Because the world inside your head is likely more true than what developers themselves say, now?

So are you saying it's not logical for me to assume that what developers say isn't always the truth? Lol, OK. You go on believing everything PR tells you.
 
dat first post.

Not that it actually is a first post, but what is bad about having features suited for future PCs?

A GPU like that not being able to run everything at the highest settings doesn't mean it is a bad port.

The Witcher 2 already had something like that with Ubersampling, even though that's not really much of a feature, especially nowadays.
 

RedSwirl

Junior Member
Why would you assume that's the minimum CPU you'd need for that? They likely aren't going to spec out the most minimal PC that can run their maximum settings. I am sure it will run perfectly fine on a quad-core; the bigger difference is likely to be on the GPU side.

Most technical material so far has suggested this game is actually more CPU-bound than GPU-bound.
 
Most technical material so far has suggested this game is actually more CPU-bound than GPU-bound.

Well, all the additional graphical effects are unlikely to have much of an effect on the CPU. Resolution and anti-aliasing won't have an effect at all.

I'd assume they were CPU-bound because of the consoles, which have pretty weak CPUs. I doubt you need a much better CPU for the PC version with all the bells and whistles. You'd maybe want one for a higher minimum framerate, but even twice the CPU power of the consoles is not very much.
 

Hendrick's

If only my penis was as big as my GamerScore!
Consoles are using PC high settings, says CDPR.

PC Settings:

High
Ultra-High
Ultra-Mega-High
Uber-Ultra-High
 
Not that it actually is a first post, but what is bad about having features suited for future PCs?

A GPU like that not being able to run everything at the highest settings doesn't mean it is a bad port.

The Witcher 2 already had something like that with Ubersampling, even though that's not really much of a feature, especially nowadays.

For me personally, it's pretty rare to go back and play an older game on PC -- the window of opportunity is about six months, then it goes into my backlog. (Which is interesting, because I thoroughly enjoy going back and playing old console games (SNES, PS2, etc.).) Future-proofing a game serves no purpose for me. Basically, I want to experience all the goodness the game has to offer in the here and now, because I most likely won't come back to it at a later date. Besides, it'd be like waiting to eat a hot dog until you can put mustard on it, only to find that, although the mustard is delicious, the bun has gone stale. Terrible analogy, I know. Lol
 
For me personally, it's pretty rare to go back and play an older game on PC -- the window of opportunity is about six months, then it goes into my backlog. (Which is interesting, because I thoroughly enjoy going back and playing old console games (SNES, PS2, etc.).) Future-proofing a game serves no purpose for me. Basically, I want to experience all the goodness the game has to offer in the here and now, because I most likely won't come back to it at a later date. Besides, it'd be like waiting to eat a hot dog until you can put mustard on it, only to find that, although the mustard is delicious, the bun has gone stale. Terrible analogy, I know. Lol

Well, okay, my mistake. It doesn't have to be only for future PCs; it could be for SLI users or something too.

The point is, it's not a bad thing if a good PC can't run everything in the game at its highest. There are almost always people who can run it better, and over time more people will be able to play it that way. Not everyone plays a game immediately, and some people do replay them; otherwise the Enhanced Editions of the previous Witcher games would have been pointless too.
 
Well, okay, my mistake. It doesn't have to be only for future PCs; it could be for SLI users or something too.

The point is, it's not a bad thing if a good PC can't run everything in the game at its highest. There are almost always people who can run it better, and over time more people will be able to play it that way. Not everyone plays a game immediately, and some people do replay them; otherwise the Enhanced Editions of the previous Witcher games would have been pointless too.

Oh, I know. I can only speak for myself. :)

And I do have SLI 980s and a G-Sync monitor, simply because I want to do the best I can to satisfy my petty wants and needs when it comes to pretty graphics in the here and now.
 

Damerman

Member
G-Sync is my next upgrade. I'm waiting for a 1440p 28-30" IPS panel with >60 Hz support.

Any day now...
Waiting for an IPS G-Sync panel is a wise choice. I went from an IPS to a TN panel for G-Sync, and I somewhat regret my impatience. Hopefully you have stronger character than I do. The wait will very much be worth it.
 
Really hoping I can make minor adjustments to get 60 fps @ 1440p on my 980. If I have to turn down too many toggles, I might be forced to buy a Titan X... :|
 

Watevaman

Member
If my 570 (lololol) can run it at low/medium at 1080p for a solid 30, I probably won't upgrade. If it can't, then it will be just like when I upgraded for W2.
 

Guri

Member
Yeah, I think I can wait until next year, since that's when I'll upgrade my PC. Besides, maybe they will also develop an Enhanced Edition!
 

Serandur

Member
Huh, less demanding than I had hoped. Well, it doesn't change my plans. Though I have no doubt my 290X would do a good job at 2560x1440, I'm putting off The Witcher 3 until AMD releases the 390X and Nvidia releases a more reasonably priced GM200 part.
 

Nzyme32

Member