PRO is punching way above its weight here, impressive result.
Damn, those look good. Pro doing work here.

Yep, probably not the best shots, but these are straight from my Pro, downscaled to 1080p.
I never understand how these conclusions are reached. How does one determine something like this?
Cost/performance evaluation. Sony nailed the sweet spot: a $399 console that can provide a 4K HDR experience, with some compromises, for a fraction of the price of high-end gaming PCs.
The problem with these comparisons is that it's 30fps vs 60fps, which alone requires double the GPU power; on top of that, the PC is running over-the-top higher settings that aren't really necessary and are usually extremely demanding; and all of it at a higher resolution, which requires yet more GPU power again. What would be more interesting is a more genuine comparison: an RX 480/960 running at medium (or whatever is console-equivalent) settings, targeting 30fps, with an FX-4100. A build like that should replicate what the Pro is doing for around £450-550.
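The arithmetic behind the "double the GPU power" point can be sketched as a back-of-envelope pixel-throughput comparison. This ignores settings cost entirely (which only widens the gap) and simply multiplies resolution by framerate:

```python
# Back-of-envelope pixel throughput: resolution x framerate.
# Settings cost is ignored; this only shows why a 4K60 target
# demands several times the GPU of a console-style 1080p30 target.

def pixels_per_second(width, height, fps):
    return width * height * fps

base = pixels_per_second(1920, 1080, 30)   # console-style target
high = pixels_per_second(3840, 2160, 60)   # native 4K at 60fps

print(high / base)  # 8.0 -- 4x the pixels at 2x the framerate
```

Eight times the raw pixel rate before any "ultra" settings are even switched on, which is why the fps, settings, and resolution differences compound rather than add.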
What would you need card wise to run it at pro resolution and frame rate?
I reckon either an RX 480 like the Pro has, or a GTX 1060 (which performs slightly better). It's hard to find a benchmark that uses PS4 Pro settings on PC.
The RX 470 is likely the tit-for-tat card for the PS4 Pro.
Yeah, a 480 or 1060, around the $200 range. As overclocking is pretty much a given on something like the 1060 because of its great TDP, it's going to perform better than the PS4 Pro. It costs around $200-250 depending on whether you get the 3GB or 6GB version.
Either the 480 or the 1060, with any reasonably new i3 or up in terms of power, will yield higher performance than the Pro.
I have a Pro and a gaming PC.
Well, there's a problem: on PC you need 60fps or the gameplay/frame pacing often feels wrong. I've tried 4K Witcher 3 on my TV and gameplay felt really bad with a 30fps lock, since mouse controls felt choppy, while console games usually work acceptably at that rate.
Yeah, well, you are just affirming my points here. It's why I said I've seen many PC posters declare those expensive effects transformative to the experience, when most times they're not even that much better looking than the standard effects. An example is the Nvidia fog/smoke in Batman: is it realistic to fill the screen with thick plumes that look way unrealistic, just to say your card is being maxed out?

Titan X is not good value for money. But people don't buy it because it's value for money; they buy it because they want the best and can afford it.
With many PC game engines, the top 'ultra' settings for things like shadows, reflections, AO are disproportionately expensive in terms of power needed vs visual benefit.
That creates an environment where, if you cherry-pick your settings carefully, you can deliver 'good enough' visual quality on something like a Pro. That should be applauded. Hopefully it also encourages more PC engines to look at 'cost-saving' measures like checkerboard rendering, so that more people can appreciate higher-resolution gaming on more mainstream GPUs.
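For readers unfamiliar with the technique: checkerboard rendering shades only half of each frame's pixels (in a checkerboard pattern) and reconstructs the rest. Here's a deliberately simplified sketch; real implementations reproject the previous frame rather than averaging neighbours, and `shade` is just a stand-in for an expensive per-pixel shading function:

```python
# Toy checkerboard rendering: shade half the pixels of the grid each
# frame and fill the gaps from already-shaded horizontal neighbours.

def shade(x, y):
    # stand-in for an expensive per-pixel shading function
    return (x * 7 + y * 13) % 256

def checkerboard_frame(width, height, phase):
    frame = [[None] * width for _ in range(height)]
    shaded = 0
    for y in range(height):
        for x in range(width):
            if (x + y) % 2 == phase:          # shade half the grid
                frame[y][x] = shade(x, y)
                shaded += 1
    # reconstruct the missing pixels from shaded neighbours
    for y in range(height):
        for x in range(width):
            if frame[y][x] is None:
                nbrs = [frame[y][nx] for nx in (x - 1, x + 1)
                        if 0 <= nx < width and frame[y][nx] is not None]
                frame[y][x] = sum(nbrs) // len(nbrs) if nbrs else 0
    return frame, shaded

frame, shaded = checkerboard_frame(8, 8, 0)
print(shaded)  # 32 -- only half of the 64 pixels were shaded
```

The appeal for mainstream GPUs is exactly that `shaded` count: roughly half the per-pixel shading cost of native rendering, traded against reconstruction artifacts on thin or fast-moving edges.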
You're using an older CPU; get a Skylake i7 like DF has. Console owners are more likely to say what you've just said; it's just strange coming from a PC gamer. Your CPU is so much better than the console CPU, your GPU is so much better. 60fps gaming is easier to achieve on PC for all the reasons you know.

Trying to hold a steady 60 FPS always seems like it requires a lot more than simply double the GPU power.
Things like CPU speed and memory bandwidth start to play a far bigger role for one thing.
I can easily run many games at a perfect 4K30 on my system (GTX 1070, 2500K @ 4.5GHz), but then it will struggle to hold a solid 1080p60.
If it's so expensive and it's not easily discernible, then what's the point? If an effect is costing me 20-30fps in some cases, it had better be discernible in YouTube videos.

Paragon said:
A lot of PC-specific graphical options are very demanding too.
Things like increased ambient occlusion quality can be massive performance hits but something like that is not likely to demo well in a YouTube video.
A lot of very high/ultra settings are things which look subtle in isolation but add up to really making the image look a lot more refined.
It seems like developers sometimes throw these options in because, some day in the future, you will be able to turn them all on without worrying about the performance, rather than intending for people to use them now.
You have a capable system, try it and get back to us.

Paragon said:
I'm still surprised that Richard used the temporal filtering option in WD2 though.
I can run the game at native 4K at a solid 30 FPS on my system with a mixture of settings from high to ultra.
Case in point: an Nvidia effect that's computationally expensive but looks worse than a less taxing, more standard solution. I guess they'll optimize it next year at a fraction of the cost, with IQ benefits no doubt.

Paragon said:
I would think that their system could handle native 4K60 without having to resort to the temporal filtering.
It really hurts image quality as soon as you start to move the camera. Edges get a sawtooth appearance not too dissimilar to an interlaced display.
It's a shame they don't also include a more typical TAA option like most new games are using now. NVIDIA's TXAA is a massive performance hit, and results in weird image artifacting at times.
Value means more than you know. If I buy a Ferrari 488 GTB, I don't have to worry about speed; I know it's a screamer. It's not a Pinto where I have to constantly be in tweak-town (perhaps even just to run or be motorable).

Paragon said:
Value doesn't mean anything to me when gaming at 30 FPS with a low FoV makes me nauseous.
I've always gone for mid-range parts like the GTX 1070 and i5 CPUs instead of i7s because I find that they strike a good balance between performance and value for my budget, but I understand why many people buy high-end hardware.
It's really not that expensive compared to many other adult hobbies/pastimes, especially if you're only upgrading every few years.
If a game has awful textures, awful load times, blurry LOD, pop-in galore, below-low graphical settings, awful performance, and awful foliage in an open-world setting... yet, on the flip side, I see an open-world game that looks better and performs better in every way. I mean looks and performs better in every way: much better textures, lighting, DD, IQ, and perf. Yeah, you are absolutely right, not all games are made equal.

Lowering settings is irrelevant, and I said in a previous post all games are not made equally, so expecting games to perform to an arbitrary performance level at max settings is silly. The settings are also higher than what the consoles are capable of.
Hey, great: PureHair looks and runs better, and it's no longer the resource hog it used to be. I guess all gamers win here. It's amazing what some optimization can do, when before you would have had worse-looking hair with a 20-30fps cost on version 1.0. Heh, it even runs on a system decked with a Jaguar CPU and a 480 at the moment. I prefer continued advancement of the tech and optimizations, tbh.

LeLouch said:
PureHair looks and runs better because optimizations were made.
Look at the extended video of ROTTR: it maintains 60fps in many scenes on the Pro. I'm not getting into a re-run of the "I don't like 40-50fps" debate. I've never said that PCs don't run it better, especially CPUs that are 5 months old. Heh, even your i3 runs circles around Jaguar; the fact that it's running 60fps in many scenes in ROTTR is mighty impressive. I think I was pretty clear on that.

LeLouch said:
Dips to 40-50 fps are not impressive at all; a 5-years-and-10-months-old CPU can do 60 fps with minor fluctuations below that at stock clocks.
Two new PlayStation 4s and an Xbox have been released since this CPU was released in January 2011.
It runs a fair few triple-A games in native 4K, and yes, it does represent better value; it's a £300 box, for goodness' sake.

Ok, here are my objections. The PlayStation Pro can't match a high-end PC in visuals, image quality, or framerate. It can't achieve native 4K in triple-A games. Does it represent better value than a mid-range PC in cost-to-performance ratio?
Sometimes I fear we use awfully made and badly coded games, which hamstring performance, as examples for counterarguments, but it does not work that way. We are simply looking at a properly coded open-world game here. Those awful open-world games that fall into the teens with below-low graphical settings are not bastions of good coding or great standards. Not all games are made equally indeed; that's why Fallout 4 and Just Cause 3 are what they are.
Hey, I have no problem with people buying what they like and spending their cash; I'm just questioning the benefits of such a huge investment. Also, I think many, including you, are trying to minimize the Pro's graphical settings here. These are not "good enough" settings; these are great graphical settings on the Pro. The only setting left out is the computationally expensive SSR, which is also left out on the 11TF 1500+ GPU in lieu of performance.
Heh, if you want to see below-par graphical settings, check out Witcher 3 on consoles with its putrid performance. What's running here on consoles, with this IQ, textures, DD, and perf, is ideally what console owners want in their open-world games, not what CD Projekt delivered, as an example.
Funny enough, when these effects are optimized for consoles and lesser GPUs, not only do they look better, but they also run at only a fraction of the cost. Here's the clincher though: many times, owners of these high-end GPUs don't even run their games with these expensive effects on because they're so costly (60 or bust, right?). It was that way for TressFX, it was that way for HairWorks, and it's that way for VXAO and PCSS, etc.
$1200.00 is a lot of money. Let me ask you a question: if the PS4 Pro cost $1200.00, don't you think it would run Watch Dogs at native 4K 60fps with SSR? Just think about it carefully. Also, I've yet to see a Titan X Pascal on Amazon for that price, but I digress. I'd love to hear your thoughts on the question posed.
How do you know that, though? Is determining value really as simplistic as seeing which costs less? If a gaming PC that costs 2x delivers 2.5x the performance of the Pro, isn't that PC better value?
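The hypothetical in that question is easy to make concrete. Treating the Pro as the performance baseline, a performance-per-dollar comparison (illustrative numbers only, taken from the "costs 2x, delivers 2.5x" framing) looks like:

```python
# Performance per dollar for the hypothetical from the post:
# a PC costing twice as much as the Pro but delivering 2.5x
# the performance. Numbers are illustrative, not benchmarks.

pro_price, pro_perf = 400.0, 1.0        # Pro as the baseline
pc_price,  pc_perf  = 2 * 400.0, 2.5    # "costs 2x, delivers 2.5x"

pro_value = pro_perf / pro_price   # performance per dollar
pc_value  = pc_perf / pc_price

print(pro_value)  # 0.0025
print(pc_value)   # 0.003125 -- the PC wins per dollar in this scenario
```

By that metric the dearer machine is the better value whenever its performance multiple exceeds its price multiple; the disagreement in the thread is really over whether the perceived performance gap is actually that large.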
With a better CPU, I'd imagine...
Well, a 1080p 60fps option was not provided, so we don't know how well it would run. Do you know how well it would run without stats? Also, ROTTR seems to be holding 60fps quite well in smaller levels and fight scenes, and runs at about 40-50fps in much larger scenes with lots of AI. I'll say that's still pretty impressive for Jaguars.
The point is that the Pro is a $400 system that is pulling off visuals quite close to a PC with a $1000 GPU in Watch Dogs 2. The primary differences are shadow quality and framerate. The value proposition is simply that you can get notably close graphics for a fraction of the price.
This is entirely the wrong conclusion to draw from this. I'm not even sure how you arrived at it, especially when looking at a YouTube video.
By this logic, the fact that Ori and the Blind Forest looks IDENTICAL on the non-Pro PS4 and a PC with 1080s in SLI means the PS4 surely is punching above its weight, equaling an $1800 PC!
Watch Dogs 2 is just one game; the differences will be more drastic in some games and less drastic in others. Not to mention that, again, we are talking about 60 FPS vs 30 FPS and higher graphics settings.
You can beat a PS4 Pro with a LOT less GPU and CPU.
I bet you the differences between an i3 + 1050 Ti PC and a PS4 Pro are probably SMALLER than the differences between the PS4 Pro and the PC in this video.
I also bet you that the differences would be highlighted by GAF in very much the opposite way:
"But do you see that extra little piece of shadow pixelation? The PS4 Pro blows it out of the water!!".
I intend to upgrade my CPU when Kaby Lake is released.

You're using an older CPU; get a Skylake i7 like DF has. Console owners are more likely to say what you've just said; it's just strange coming from a PC gamer. Your CPU is so much better than the console CPU, your GPU is so much better. 60fps gaming is easier to achieve on PC for all the reasons you know.
The difference between something like good ambient occlusion and bad is very noticeable when the game is in front of you. YouTube compression tends to hide a lot of problems with image quality, especially differences in fine detail or temporal issues.

If it's so expensive and it's not easily discernible, then what's the point? If an effect is costing me 20-30fps in some cases, it had better be discernible in YouTube videos.
Who said that it looks worse?

You have a capable system, try it and get back to us.
Use half-refresh V-Sync and frame pacing should not be an issue.

Well, there's a problem: on PC you need 60fps or the gameplay/frame pacing often feels wrong. I've tried 4K Witcher 3 on my TV and gameplay felt really bad with a 30fps lock, since mouse controls felt choppy, while console games usually work acceptably at that rate.
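A toy model of why half-refresh V-Sync helps, under the assumption of a 60 Hz display: a software 30fps cap drifts slightly around the 33.3 ms boundary, so frames land on uneven refresh ticks, while half-refresh V-Sync flips on every second tick and paces every frame identically as long as it renders in under 33.3 ms:

```python
# Toy frame-pacing model for a 60 Hz display. Flips can only happen
# on refresh ticks, so a frame that misses a tick is held until the
# next one. Numbers are illustrative, not measured.
import math

REFRESH = 1000.0 / 60.0   # one refresh tick is ~16.67 ms

def onscreen_durations(frame_completions_ms, tick=REFRESH):
    """How long each frame stays on screen when flips snap to ticks."""
    flips = [math.ceil(t / tick) * tick for t in frame_completions_ms]
    return [round(b - a, 2) for a, b in zip([0.0] + flips[:-1], flips)]

# A software 30fps cap drifts a little around 33.3 ms per frame...
intervals = [33.6, 32.9, 33.8, 33.0]
completions = [sum(intervals[:i + 1]) for i in range(len(intervals))]
print(onscreen_durations(completions))  # [50.0, 16.67, 50.0, 16.67] -- judder

# ...while half-refresh V-Sync flips on every second tick, so each
# frame that renders in under 33.3 ms is shown for exactly 33.3 ms.
print([round(2 * REFRESH, 2)] * 4)      # [33.33, 33.33, 33.33, 33.33]
```

The mouse choppiness described above is that alternating 50 ms / 16.67 ms hold pattern; half-refresh sync trades peak responsiveness for perfectly even holds.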
Here are three screenshots from the game running on my PC.
No-one should be able to pick out which one is running at ~1800p with temporal filtering, which one is native 4K with SMAA, and which one is using TXAA, right?
There are a lot of artifacts on thin objects and edges in the second screenshot. Also, the two yellow lines on the road frequently cross/touch in the second screenshot, whereas they don't in the other two.
That's why I don't like these lower-than-native resolution upscaling techniques.
You are paying a fraction of the price for a fraction of the performance.

The point is that the Pro is a $400 system that is pulling off visuals quite close to a PC with a $1000 GPU in Watch Dogs 2. The primary differences are shadow quality and framerate. The value proposition is simply that you can get notably close graphics for a fraction of the price.