
Digital Foundry: Watch Dogs 2 PC 4K (Titan X) vs PS4 Pro

As was already said: set the Titan X PC to the PS4 Pro's visual settings and watch the framerate shoot up. The truth is that in many cases the PC's highest settings in games are needlessly (depending on who you ask) wasting resources compared to the visual payoff those highest settings provide.

I'm not saying that it's not incredibly impressive on the part of the Pro, however. It absolutely is. And it's great to see developers and engineers working on ways to more efficiently render graphics in games.
 

leng jai

Member
Simply going from 30 to 60fps is a huge upgrade in the first place regardless of whether you think it's a big difference.
 
I never understand how these conclusions are reached. How does one determine something like this?

Cost / performance evaluation. Sony nailed the sweet spot. A $399 console that can provide a 4K HDR experience with some compromises, for a fraction of the price of a high-end gaming PC.
 

elelunicy

Member
I'm not sure why anyone would be surprised by the performance difference. A single Titan X is only around 3 times as fast as the Pro's GPU. At 2160p/60fps you need to push 2.88 times as many pixels as 1800p/30fps. The performance is exactly what you'd expect from both systems.
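To put rough numbers on that pixel-throughput argument, here's a back-of-the-envelope calculation (a sketch only: it assumes "1800p" means 3200x1800 and that rendering cost scales linearly with pixels per second, which real games don't do exactly):

```python
# Back-of-the-envelope pixel-throughput comparison (illustrative only).
pro_pixels_per_sec = 3200 * 1800 * 30   # ~1800p at 30 fps (PS4 Pro target)
pc_pixels_per_sec = 3840 * 2160 * 60    # native 4K at 60 fps (Titan X target)

ratio = pc_pixels_per_sec / pro_pixels_per_sec
print(f"4K60 pushes {ratio:.2f}x the pixels per second of 1800p30")  # -> 2.88x
```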
 

kraspkibble

Permabanned.
this is ridiculous. the PS4 Pro is nowhere near as powerful as a Titan X. a pro is only 4.2TF or something and the Titan X is 11TF. not even a scorpio will come close to a Titan X. PC gamers aren't looking to play at 2160p 30fps lol but 60fps. i feel this is only going to mislead people into thinking a pro can keep up with the most powerful/expensive GPU out there right now.
 
The problem with these comparisons is that it's 30fps vs 60fps, which requires double the GPU power; it's running over-the-top higher settings that aren't really necessary and are usually extremely demanding, requiring even more GPU power on top; all at a higher resolution, which requires yet more GPU power again. What would be more interesting is a more genuine comparison: an RX 480/960 running at medium or whatever is console-equivalent, targeting 30fps, with an FX-4100. A build like that should replicate what the Pro is doing for around £450-550.

Pretty much.

But I can see the reason why they're holding off on that.

In other PC vs Xbox/PS4 comparisons, they usually put up a PC that came close in budget, like in the older videos.

It might be best to see if Zen churns out any good budget CPUs before attempting it. An AMD Zen CPU combo paired with a 1060 or 470/480 should be interesting vs the Pro and Scorpio. Then we can see similar-budget comparisons to see where the Pro stands.
 
What would you need card-wise to run it at Pro resolution and frame rate?

The 470 is likely the tit for tat card of the PS4 Pro

Yeah, a 480 or 1060 in the $200 range. Since OCing is pretty much always done on something like the 1060 because of the great TDP, it's going to perform better than the PS4. It costs around $200-250 depending on whether you get the 3GB or 6GB version.

Either the 480 or 1060 with any reasonable newer i3 or up in terms of power will yield higher performance results than the Pro.

I have a Pro and a gaming PC.
 

belmonkey

Member
I reckon either an RX 480 like the Pro has or a gtx 1060 (which performs slightly better). It's hard to find a benchmark that uses PS Pro settings on PC.

Ideally that's what Digital Foundry should be trying to do. I've tried some benchmarks in Skyrim and Deus Ex MD with an RX 470, but variable resolutions in some games and not knowing the exact settings complicates things :/
 
Cost / performance evaluation. Sony nailed the sweet spot. A $399 console that can provide a 4K HDR experience with some compromises, for a fraction of the price of a high-end gaming PC.

OK, here are my objections. The PlayStation Pro can't match a high-end PC in visuals, image quality, or framerate. It can't achieve native 4K in triple-A games. Does it represent better value than a mid-range PC in cost-to-performance ratio?
 
The 470 is likely the tit for tat card of the PS4 Pro

Yeah, a 480 or 1060 in the $200 range. Since OCing is pretty much always done on something like the 1060 because of the great TDP, it's going to perform better than the PS4. It costs around $200-250 depending on whether you get the 3GB or 6GB version.

Either the 480 or 1060 with any reasonable newer i3 or up in terms of power will yield higher performance results than the Pro.

I have a Pro and a gaming PC.

Well, there's a problem in that for PC you need 60 fps or the gameplay/frame pacing often feels wrong - I've tried doing 4K Witcher 3 on my TV and the gameplay felt really bad with a 30 fps lock since mouse controls felt choppy, while console games usually work acceptably at that rate.
 
Well, there's a problem in that for PC you need 60 fps or the gameplay/frame pacing often feels wrong - I've tried doing 4K Witcher 3 on my TV and the gameplay felt really bad with a 30 fps lock since mouse controls felt choppy, while console games usually work acceptably at that rate.

The mouse is a high-accuracy device; it benefits from higher framerates. Play with a gamepad.
 

thelastword

Banned
The Titan X is not good value for money. But people don't buy it because it's value for money; they buy it because they want the best and can afford it.

With many PC game engines, the top 'ultra' settings for things like shadows, reflections, AO are disproportionately expensive in terms of power needed vs visual benefit.

That creates an environment where if you cherry pick your settings carefully, you can deliver 'good enough' visual quality on something like a pro. That should be applauded. Hopefully it also encourages more PC engines to look at 'cost saving' measures like checkerboard rendering so that more people can appreciate higher resolution gaming on the more mainstream GPUs
Yeah, well, you are just affirming my points here. It's why I said I've seen many PC posters declare those expensive effects transformative to the experience, when most times they're not even that much better looking than the standard effects. An example is the Nvidia fog/smoke in Batman: is it realistic to fill the screen with thick plumes that look way unrealistic, just to say your card is being maxed out?

Hey, I have no problems with people buying what they like and spending their cash, I'm just arguing the benefits of their purchase for such a huge investment. Also, I think many, including you, are trying to minimize the Pro's graphical settings here. These are not "good enough" settings here, these are great graphical settings on the Pro; the only setting left out is the computationally expensive SSR, which is also left out on the 11TF 1500+ GPU for the sake of performance.

Heh, if you want to see below-par graphical settings, check out Witcher 3 on consoles with putrid performance. What's running here on consoles with the IQ, textures, DD and perf is ideally what console owners want in their open-world games, not what CD Projekt delivered, as an example......

Trying to hold a steady 60 FPS always seems like it requires a lot more than simply double the GPU power.
Things like CPU speed and memory bandwidth start to play a far bigger role for one thing.
I can easily run many games at a perfect 4K30 on my system (GTX 1070, 2500K@4.5GHz) but then it will struggle to hold a solid 1080p60.
You're using an older CPU, get a Skylake i7 like DF has. Console owners are more likely to say what you've just said, it's just strange coming from a PC gamer. Your CPU is so much better than the console CPU, your GPU is so much better. 60fps gaming is easier to achieve on PC for all the reasons you know.

Paragon said:
A lot of PC-specific graphical options are very demanding too.
Things like increased ambient occlusion quality can be massive performance hits but something like that is not likely to demo well in a YouTube video.
A lot of very high/ultra settings are things which look subtle in isolation but add up to really making the image look a lot more refined.
It seems like developers just sometimes throw these options in because some day in the future you will be able to turn them all on without worrying about the performance, rather than intending for people to use them now.
If it's so expensive and it's not easily discernible, then what's the point? If an effect is costing me 20-30fps in some cases, it had better be discernible in YouTube videos....

Funny enough, when these effects are optimized for consoles and lesser GPUs, not only do they look better, they also run at only a fraction of the cost. Here's the clincher though: many times owners of these high-end GPUs don't even run their games with these expensive effects on because they're so expensive (60 or bust, right?).... it was that way for TressFX, it was that way for HairWorks, it's that way for VXAO and PCSS... etc...


Paragon said:
I'm still surprised that Richard used the temporal filtering option in WD2 though.
I can run the game at native 4K at a solid 30 FPS on my system with a mixture of settings from high to ultra.
You have a capable system, try it and get back to us....


Paragon said:
I would think that their system could handle native 4K60 without having to resort to the temporal filtering.
It really hurts image quality as soon as you start to move the camera. Edges get a sawtooth appearance not too dissimilar to an interlaced display.
It's a shame they don't also include a more typical TAA option like most new games are using now. NVIDIA's TXAA is a massive performance hit, and results in weird image artifacting at times.
Case in point... a computationally expensive Nvidia effect that looks worse than a less taxing and more standard solution. I guess they'll optimize that next year at a fraction of the cost, with IQ benefits no doubt.......

Paragon said:
Value doesn't mean anything to me when gaming at 30 FPS with a low FoV makes me nauseous.
I've always gone for mid-range parts like the GTX 1070 and i5 CPUs instead of i7s because I find that they strike a good balance between performance and value for my budget, but I understand why many people buy high-end hardware.
It's really not that expensive compared to many other adult hobbies/pastimes, especially if you're only upgrading every few years.
Value means more than you know. If I buy a Ferrari 488 GTB, I don't have to worry about speed; I know it is a screamer.... It's not a Pinto where I have to constantly be in tweaktown... (perhaps even to run or be motorable)...

$1200.00 is a lot of money, so let me ask you a question. If the PS4 Pro cost $1200.00, don't you think it would run Watch Dogs at native 4K 60fps with SSR? Just think about it carefully..... Also, I've yet to see a Titan X Pascal on Amazon for that price, but I digress. I'd love to hear your thoughts on the question posed....


Lowering settings is irrelevant, and I said in a previous post all games are not made equally so expecting games to perform to an arbitrary performance level at max settings is silly. The settings are also higher than what the consoles are capable of.
If a game has awful textures, awful load times, blurry LOD, pop-in galore, below-low graphical settings, awful performance, and awful foliage in an open-world setting...... yet, on the flip side, I see an open-world game that looks better and performs better in every way - I mean looks and performs better in every way: much better textures, lighting, DD, IQ, and perf - then yeah, you are absolutely right, not all games are made equal.

Sometimes I fear we use awfully made and badly coded games which hamstring performance as examples for counter-arguments, but it does not work that way. We are simply looking at a properly coded open-world game here. Those awful open-world games that fall into the teens with below-low graphical settings are not bastions of good coding or great standards. Not all games are made equally indeed; that's why Fallout 4 and Just Cause 3 are what they are.......

LeLouch said:
PureHair looks and runs better because optimizations were made.
Hey, great, PureHair looks and runs better and it's no longer the resource hog it used to be. I guess all gamers win here. It's amazing what some optimization can do, when version 1.0 gave you worse-looking hair at a 20-30fps cost. Heh, it even runs on a system decked with a Jaguar CPU and a 480 atm.... I prefer extended advancement of the tech and optimizations tbh...

Lelouch said:
Dips to 40-50 fps are not impressive at all; a CPU that's 5 years and 10 months+ old can do 60 fps with minor fluctuations below that at stock clocks.
Two new PlayStation 4s and an Xbox have been released since this CPU was released in January 2011.
Look at the extended video of ROTTR, it maintains 60fps in many scenes on the PRO. I'm not getting into a re-run of the "I don't like 40-50 fps debate". I've never said that PC's don't run it better, especially CPU's that are 5 months old. Heh' even your i3's runs circles on Jaguar, the fact that it's running 60fps in many scenes in ROTTR is mighty impressive. I think I was pretty clear on that....
 

madmackem

Member
OK, here are my objections. The PlayStation Pro can't match a high-end PC in visuals, image quality, or framerate. It can't achieve native 4K in triple-A games. Does it represent better value than a mid-range PC in cost-to-performance ratio?
It runs a fair few triple-A games in native 4K, and yes, it does represent better value; it's a £300 box, for goodness' sake.
 
It runs a fair few triple-A games in native 4K, and yes, it does represent better value; it's a £300 box, for goodness' sake.

How do you know that though? Is determining value really as simplistic as seeing which costs less? If a gaming PC that costs 2x delivers 2.5x the performance of the Pro, isn't that PC better value?
 
If a game has awful textures, awful load times, blurry LOD, pop-in galore, below-low graphical settings, awful performance, and awful foliage in an open-world setting...... yet, on the flip side, I see an open-world game that looks better and performs better in every way - I mean looks and performs better in every way: much better textures, lighting, DD, IQ, and perf - then yeah, you are absolutely right, not all games are made equal.

Sometimes I fear we use awfully made and badly coded games which hamstring performance as examples for counter-arguments, but it does not work that way. We are simply looking at a properly coded open-world game here. Those awful open-world games that fall into the teens with below-low graphical settings are not bastions of good coding or great standards. Not all games are made equally indeed; that's why Fallout 4 and Just Cause 3 are what they are.......


Hey, great, PureHair looks and runs better and it's no longer the resource hog it used to be. I guess all gamers win here. It's amazing what some optimization can do, when version 1.0 gave you worse-looking hair at a 20-30fps cost. Heh, it even runs on a system decked with a Jaguar CPU and a 480 atm.... I prefer extended advancement of the tech and optimizations tbh...

Look at the extended video of ROTTR, it maintains 60fps in many scenes on the PRO. I'm not getting into a re-run of the "I don't like 40-50 fps debate". I've never said that PC's don't run it better, especially CPU's that are 5 months old. Heh' even your i3's runs circles on Jaguar, the fact that it's running 60fps in many scenes in ROTTR is mighty impressive. I think I was pretty clear on that....

That's not the point I was trying to make about games being made equally and you know that. You spun what I said so you can fabricate another conclusion.

I'm specifically talking about all games not performing equally at max settings.

On a system, one game you could max out at 90 fps, another you might be able to max out at 60, and another at 45.

Running on a system with a Jaguar CPU is irrelevant, PureHair runs on the GPU.

5 months old? 5 years old, not months, and it's running it at 60 fps.
The 2500K released in January 2011.

Barely scraping 60 fps is unimpressive when a CPU that's over 5 years old can do it at stock clocks. A CPU that has seen two new PlayStations (minus the Vita) and an Xbox.

Yeah, well, you are just affirming my points here. It's why I said I've seen many PC posters declare those expensive effects transformative to the experience, when most times they're not even that much better looking than the standard effects. An example is the Nvidia fog/smoke in Batman: is it realistic to fill the screen with thick plumes that look way unrealistic, just to say your card is being maxed out?

Hey, I have no problems with people buying what they like and spending their cash, I'm just arguing the benefits of their purchase for such a huge investment. Also, I think many, including you, are trying to minimize the Pro's graphical settings here. These are not "good enough" settings here, these are great graphical settings on the Pro; the only setting left out is the computationally expensive SSR, which is also left out on the 11TF 1500+ GPU for the sake of performance.

Heh, if you want to see below-par graphical settings, check out Witcher 3 on consoles with putrid performance. What's running here on consoles with the IQ, textures, DD and perf is ideally what console owners want in their open-world games, not what CD Projekt delivered, as an example......


You're using an older CPU, get a Skylake i7 like DF has. Console owners are more likely to say what you've just said, it's just strange coming from a PC gamer. Your CPU is so much better than the console CPU, your GPU is so much better. 60fps gaming is easier to achieve on PC for all the reasons you know.

If it's so expensive and it's not easily discernible, then what's the point? If an effect is costing me 20-30fps in some cases, it had better be discernible in YouTube videos....

Funny enough, when these effects are optimized for consoles and lesser GPUs, not only do they look better, they also run at only a fraction of the cost. Here's the clincher though: many times owners of these high-end GPUs don't even run their games with these expensive effects on because they're so expensive (60 or bust, right?).... it was that way for TressFX, it was that way for HairWorks, it's that way for VXAO and PCSS... etc...



You have a capable system, try it and get back to us....



Case in point... a computationally expensive Nvidia effect that looks worse than a less taxing and more standard solution. I guess they'll optimize that next year at a fraction of the cost, with IQ benefits no doubt.......


Value means more than you know. If I buy a Ferrari 488 GTB, I don't have to worry about speed; I know it is a screamer.... It's not a Pinto where I have to constantly be in tweaktown... (perhaps even to run or be motorable)...

$1200.00 is a lot of money, so let me ask you a question. If the PS4 Pro cost $1200.00, don't you think it would run Watch Dogs at native 4K 60fps with SSR? Just think about it carefully..... Also, I've yet to see a Titan X Pascal on Amazon for that price, but I digress. I'd love to hear your thoughts on the question posed....



What did I just read?

Do you even know how any of this works?

How much are they paying you? Seriously, bring me in too!
 

Lister

Banned
How do you know that though? Is determining value really as simplistic as seeing which costs less? If a gaming PC that costs 2x delivers 2.5x the performance of the Pro, isn't that PC better value?

Of course it is silly.

Cheaper = better value, always and for everyone. I know McDonald's offers me better "value" than the fantastic French restaurant down my street. My scooter also offers me better value than my SUV; that thing guzzles gas like there's no tomorrow.

It's hard to take the family out on road trips, but whatevs man - the VALUE! My scooter was only $125; my car was THOUSANDS of dollars. I feel robbed.

Heh, no, of course you are right. Value is not objective; it depends on what you are looking for and what you value more.

I value not waiting for minute-plus-long load times. I value having access to hundreds of fantastic exclusives. I value being able to customize my gaming experience. I value not having to hope and beg for developers to release a patch to take advantage of my hardware. I value not having to pay for online services. I value being able to play most games at 21:9 ultrawide resolutions and at 60 FPS at better settings than a console.

And I'm willing to pay more for that, especially when I end up saving a ton in the long run via software deals.
 

Momentary

Banned
What's the lowest price for WD2 on PC right now? I'm thinking about picking it up even though I don't play these games just to see what my Titan X setup will do with this.
 
With a better CPU I'd Imagine......

I'm glad you're impressed with the Jaguar. It's still a POS CPU.

Imagine if we had gotten a higher-end i5 equivalent at the beginning of this generation. Physics, geometry, AI... could have all been better. It's nice that they found a good-enough value CPU that pushed multithreading.

Also, the point of the higher-end CPUs is to run at 60fps or higher. I know you like to push the point that in order for the PS4 to run at a locked 30fps it has to run much higher than that most of the time, and this is true. It's also true that in order to run at a locked 60fps you need to run much higher than that most of the time, and it's much more difficult to do. It's not just a matter of using a CPU that's twice as fast.

Either way. The PS4 and Pro are great values, but I still prefer my 6600K/GTX 1070. 60fps da bes
 

adamsapple

Or is it just one of Phil's balls in my throat?
Well a 1080p 60fps option was not provided so we don't know how well it would run. Do you know how well it would run without stats? Also ROTTR seems to be holding 60fps quite well in smaller levels and fight scenes and runs about 40-50fps in much larger scenes with lots of A.I. I'll say that's pretty impressive still for Jaguars.

To play devil's advocate: the frame drops on Pro were highlighted in the Geothermal Valley location, which is a known stress point even for PCs. Other than that, the game holds frame rate much better than in that area at least.
 

Koobion

Member
How do you know that though? Is determining value really as simplistic as seeing which costs less? If a gaming PC that costs 2x delivers 2.5x the performance of the Pro, isn't that PC better value?

The point is that the Pro is a $400 system that is pulling off visuals quite close to a PC with a $1000 GPU in Watch Dogs 2. The primary differences are shadow quality and framerate. The value proposition is simply that you can get notably close graphics for a fraction of the price.
 

MaLDo

Member
The point is that the Pro is a $400 system that is pulling off visuals quite close to a PC with a $1000 GPU in Watch Dogs 2. The primary differences are shadow quality and framerate. The value proposition is simply that you can get notably close graphics for a fraction of the price.

If you are OK with 15 fps, you can build a PC cheaper than the Pro that plays with similar graphics to the Pro, with only shadows and framerate as the primary differences.
 
The point is that the Pro is a $400 system that is pulling off visuals quite close to a PC with a $1000 GPU in Watch Dogs 2. The primary differences are shadow quality and framerate. The value proposition is simply that you can get notably close graphics for a fraction of the price.

You can get those same visuals on a $230 GPU - hell, a $150 GPU if you want to play at 15fps like MaLDo said. The point is that at "max" settings the GPU is maintaining 60fps more or less, which is a much, much more difficult task than maintaining 30fps.

I had a 7870XT paired with my 6600K before I got my 1070. The Witcher 3 wouldn't run a locked 60fps unless I dropped resolution from 1080p and lowered settings a lot. I could run a locked 30fps no problem with a mix of high/max settings.

Point is you can get close visuals on a Pro, but at 30fps. I'm really liking playing everything at 60fps right now. I don't care that I paid twice as much, because it's really not that much to spend on a hobby.
 
The point is that the Pro is a $400 system that is pulling off visuals quite close to a PC with a $1000 GPU in Watch Dogs 2. The primary differences are shadow quality and framerate. The value proposition is simply that you can get notably close graphics for a fraction of the price.

I have quite a few issues with your argument, although I understand what you mean. The first issue is that the term "notably close graphics" is a subjective one. The high-end PC offers significantly higher resolution, at least double the framerate, and additional high-end effects. Each gamer must decide for himself whether the difference is close or not, but any such conclusion can't be used to objectively determine value. The second issue is that the Titan X, and bleeding-edge hardware in general, is overpriced because enthusiasts will pay for the very best available. If you want to determine the PS4 Pro's value then you have to compare it to other value-oriented hardware like a mid-range PC. So if someone wants to argue that the PS4 Pro is good value, I would like to see some sort of evidence to that effect.
 

Lister

Banned
The point is that the Pro is a $400 system that is pulling off visuals quite close to a PC with a $1000 GPU in Watch Dogs 2. The primary differences are shadow quality and framerate. The value proposition is simply that you can get notably close graphics for a fraction of the price.

This is entirely the wrong conclusion to draw from this. I'm not even sure how you arrived at that, especially when looking at a YouTube video.

By this logic, the fact that Ori and the Blind Forest looks IDENTICAL on the non-Pro PS4 and a PC with 1080s in SLI means the PS4 surely is punching above its weight! Equaling an $1800 PC!

Watch Dogs 2 is just one game. The differences will be more drastic or less drastic across different games. Not to mention that, again, we are talking about 60 FPS vs 30 FPS and higher graphics settings.

You can beat a PS4 Pro with a LOT less GPU and CPU.

I bet you the differences between an i3 + 1050ti PC and a PS4 Pro are probably SMALLER than the differences between the Ps4 Pro and PC in this video.

I also bet you that the differences would be highlighted by GAF in very much the opposite way:

"But do you see that extra little piece of shadow pixelation? The PS4 Pro blows it out of the water!!".
 
This is entirely the wrong conclusion to draw from this. I'm not even sure how you arrived at that, especially when looking at a YouTube video.

By this logic, the fact that Ori and the Blind Forest looks IDENTICAL on the non-Pro PS4 and a PC with 1080s in SLI means the PS4 surely is punching above its weight! Equaling an $1800 PC!

Watch Dogs 2 is just one game. The differences will be more drastic or less drastic across different games. Not to mention that, again, we are talking about 60 FPS vs 30 FPS and higher graphics settings.

You can beat a PS4 Pro with a LOT less GPU and CPU.

I bet you the differences between an i3 + 1050ti PC and a PS4 Pro are probably SMALLER than the differences between the Ps4 Pro and PC in this video.

I also bet you that the differences would be highlighted by GAF in very much the opposite way:

"But do you see that extra little piece of shadow pixelation? The PS4 Pro blows it out of the water!!".

We already got this with PS4 vs XB1 vs PC.

"PS4 is almost as good as a $1000 PC! The Xbone is so blurry at 900p!"
 

Paragon

Member
You're using an older CPU, get a Skylake i7 like DF has. Console owners are more likely to say what you've just said, it's just strange coming from a PC gamer. Your CPU is so much better than the console CPU, your GPU is so much better. 60fps gaming is easier to achieve on PC for all the reasons you know.
I intend to upgrade my CPU when Kaby Lake is released.
My point is that 30 FPS is easy on PC.
I can run games at 4K30 without being held back by a 6-year-old CPU at all.
60 FPS requires a lot more than just double the GPU power, and I don't think many console players realize that. It can be more difficult to run games at 1080p60 than 4K30.
Running something like Watch Dogs 2 at 60 FPS is not a trivial feat compared to running it at higher resolutions, but only at 30 FPS.
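To make the frame-time side of that concrete, here's a toy budget calculation (hypothetical numbers, not measurements from any game): halving the frame time doesn't just ask for twice the GPU throughput, it also shrinks the window left over for fixed per-frame CPU work.

```python
# Toy frame-budget illustration (hypothetical numbers, not benchmarks).
# At 30 fps each frame has ~33.3 ms; at 60 fps only ~16.7 ms, so a fixed
# per-frame CPU cost eats a much larger share of the smaller budget.
# (Real engines pipeline CPU and GPU work, so this is a deliberate simplification.)
CPU_COST_MS = 10.0  # hypothetical per-frame CPU work: draw calls, AI, physics

for fps in (30, 60):
    frame_ms = 1000.0 / fps
    remaining = frame_ms - CPU_COST_MS
    print(f"{fps} fps: {frame_ms:.1f} ms per frame, "
          f"{remaining:.1f} ms left after {CPU_COST_MS:.0f} ms of CPU work")
```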

If it's so expensive and it's not easily discernible, then what's the point? If an effect is costing me 20-30fps in some cases, it had better be discernible in YouTube videos....

Funny enough, when these effects are optimized for consoles and lesser GPUs, not only do they look better, they also run at only a fraction of the cost. Here's the clincher though: many times owners of these high-end GPUs don't even run their games with these expensive effects on because they're so expensive (60 or bust, right?).... it was that way for TressFX, it was that way for HairWorks, it's that way for VXAO and PCSS... etc...
The difference between something like good ambient occlusion and bad is very noticeable when the game is in front of you. YouTube compression tends to hide a lot of problems with image quality - especially differences in fine detail or temporal issues.

I'm not really sure what your point is though.
That you don't need to buy a Titan X if you want PS4 Pro-level performance?
That no-one can tell the difference between a maxed-out PC game and the same game running on PS4 Pro?
That the difference between 30 and 60 FPS isn't worthwhile?
That there's no point in buying a PC because it's more expensive?
That additional effects should not be included in the PC version of games because the consoles couldn't handle them and everything should have parity?
That developers shouldn't include graphical options which are not intended to be used on today's hardware, but are included because the availability/backwards compatibility of a PC game is often measured in decades?

It just seems like you keep changing your arguments to whatever meets your agenda of "don't buy or play games on a PC".

You have a capable system, try it and get back to us....
Case in point... a computationally expensive Nvidia effect that looks worse than a less taxing and more standard solution. I guess they'll optimize that next year at a fraction of the cost, with IQ benefits no doubt.......
Who said that it looks worse?
My whole argument is that the artifacts from Ubisoft's temporal filtering really hurt image quality compared to rendering natively at 4K or using TXAA, and I think that's what Richard should have been targeting.

Here are three screenshots from the game running on my PC.
No-one should be able to pick out which one is running at ~1800p with temporal filtering, which one is native 4K with SMAA, and which one is using TXAA, right?
The difference between the temporal methods of anti-aliasing is also something which is going to be much more noticeable in motion than a screenshot.
The SMAA shot is using the settings which run at 4K30 on my system. SMAA appears more effective at removing aliasing than it really is when the scene is static.

I don't necessarily think that TXAA is worth the performance hit right now, especially when there are some very good TAA post-process filters with almost no performance hit available in other games, but it's definitely the highest quality option that the game offers.
A lot of the older games which included TXAA support are now playable at 60+ FPS with it enabled, when it originally killed performance back when those games were released. The same will be true of Watch Dogs 2 one day as well.

Watch Dogs 2 seems like a pretty good port though, as the PS4 Pro version looks good for what it's running on. The difference in image quality between something like Dishonored 2 running on PC at 4K30 and running on PS4 Pro seems to be far larger for example.
I don't see how the game looking good on PS4 Pro somehow makes the PC version worse.

Well, there's a problem in that for PC you need 60 fps or the gameplay/frame pacing often feels wrong - I've tried doing 4K Witcher 3 on my TV and the gameplay felt really bad with a 30 fps lock since mouse controls felt choppy, while console games usually work acceptably at that rate.
Use half-refresh V-Sync and frame pacing should not be an issue.
The problem with frame pacing at 30 FPS occurs when you simply use a framerate limiter combined with 60Hz V-Sync.
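Here's a simplified sketch of why the pacing differs, assuming a 60Hz display; the vblank model is deliberately naive and not how any particular driver actually implements it:

```python
# Simplified frame-pacing sketch (illustrative, not any specific driver's behaviour).
# A 60 Hz display has a vblank every ~16.7 ms. Frames become ready every ~33.3 ms
# with a little jitter. A plain 30 fps limiter + 60Hz V-Sync shows each frame at
# the next vblank after it is ready, so on-screen intervals wobble between 2 and 3
# refreshes; half-refresh V-Sync holds every frame for exactly 2 refreshes.
import itertools
import random

REFRESH_MS = 1000.0 / 60.0
random.seed(1)

ready = list(itertools.accumulate(33.3 + random.uniform(-2.0, 2.0) for _ in range(8)))

def next_vblank(t):
    return REFRESH_MS * (int(t // REFRESH_MS) + 1)

plain = [next_vblank(t) for t in ready]                       # limiter + 60Hz V-Sync
half = [2 * REFRESH_MS * (i + 1) for i in range(len(ready))]  # half-refresh V-Sync

print("plain V-Sync intervals (ms):", [round(b - a, 1) for a, b in zip(plain, plain[1:])])
print("half-refresh intervals (ms):", [round(b - a, 1) for a, b in zip(half, half[1:])])
```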
 

Caayn

Member
Here are three screenshots from the game running on my PC.
No-one should be able to pick out which one is running at ~1800p with temporal filtering, which one is native 4K with SMAA, and which one is using TXAA, right?
There's a lot of artifacts on thin objects and edges in the second screenshot. Also the two yellow lines on the road frequently cross/touch in the second screenshot whereas the lines don't cross/touch in the other two screenshots.
 

Paragon

Member
There's a lot of artifacts on thin objects and edges in the second screenshot.
That's why I don't like these lower-than-native resolution upscaling techniques.
As soon as you start moving the camera the image just falls apart.
Something like DOOM's TSSAA is better because it starts with a native resolution image and accumulates data from multiple frames to improve it.
It's far from perfect compared to traditional anti-aliasing or supersampling techniques, but is a significant improvement over other types of post-process anti-aliasing and the temporal filtering component can look really good - even catching shader aliasing that supersampling does little for.
TAA plus even a little bit of supersampling starts to look really nice.
It's surprising how much aliasing stands out when you go back to 2015 or earlier titles that don't use any form of TAA after getting used to it in newer games.
That said, TAA does have problems of its own. Ghosting in Deus Ex: Mankind Divided and Skyrim: Special Edition was really bad.
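For anyone curious what "accumulating data from multiple frames" actually means, here's a very stripped-down sketch of the history blending at the core of TAA-style techniques. This is illustrative pseudocode, not DOOM's actual TSSAA: real implementations also jitter the camera, reproject the history buffer with motion vectors, and clamp it against the current frame's neighbourhood, which is exactly what keeps the ghosting mentioned above in check.

```python
# Minimal sketch of temporal accumulation (the core idea behind TAA-style AA).
# Each frame, the current sample is blended into a running per-pixel history
# buffer with an exponential moving average. A small alpha means more history
# weight: smoother edges, but more risk of ghosting on anything that moves.
def temporal_accumulate(history, current, alpha=0.1):
    return [(1 - alpha) * h + alpha * c for h, c in zip(history, current)]

# Toy example: one edge "pixel" whose jittered sample alternates between 0.0 and
# 1.0 on successive frames (classic crawling-edge aliasing). The accumulated
# history settles near 0.5, the anti-aliased coverage value.
history = [0.0]
for frame in range(32):
    current = [float(frame % 2)]      # alternating aliased samples: 0, 1, 0, 1...
    history = temporal_accumulate(history, current)

print(round(history[0], 2))           # ~0.5 (it oscillates a little around it)
```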
 
The point is that the Pro is a $400 system that is pulling off visuals quite close to a PC with a $1000 GPU in Watch Dogs 2. The primary differences are shadow quality and framerate. The value proposition is simply that you can get notably close graphics for a fraction of the price.
You are paying a fraction of the price for a fraction of the performance.

It runs at a higher res (how much is it? double?), at double the framerate, and with higher quality settings.

The price/performance ratio is probably better on the PS Pro, but let's not pretend it's anywhere close in performance to the higher-priced PC.
 