Is it possible for the dev to make you choose settings on console? You know like PC?
why can't they bring higher frame rates into it instead of 4k? i'd love to console game at something above 60fps
Indeed, especially as 2k and 4k textures have been widely used in games for a long time.
"4K textures" aren't a thing, unless you're literally saying every texture has a 4K resolution, which is silly.
Yes.
One important thing that screenshots don't show is the huge increase in motion stability. Having way less pixel shimmering with movement is a big deal and adds to an overall sharp and clean looking IQ.
It stuck out to me that Andrew House didn't use 4K as a selling point in his last statement about the Neo, but instead said that games will "play an awful lot prettier."
You are saying it hides blur with motion blur? ok.
But TSSAA does not cause blurring in motion. It can cause artifacting like dithering when in motion, but not blurring, and it certainly doesn't "hide it with motion blur"... there's no such thing as hiding something with motion blur. That particular statement has always bugged me. And again, 30FPS has lower temporal resolution; unless you use motion blur, the gaps between frames are so easily visible that I can point them out individually.
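To put rough numbers on those gaps between frames: the distance an object jumps from one frame to the next is just its on-screen speed divided by the frame rate. A trivial sketch (the 1920 px/s pan speed is a made-up illustration value, one screen-width per second):

```python
def per_frame_step_px(speed_px_per_s, fps):
    """How far an object jumps on screen between consecutive frames."""
    return speed_px_per_s / fps

# An object panning at 1920 px/s (one 1080p screen-width per second):
print(per_frame_step_px(1920, 30))  # 64.0 px jump per frame at 30fps
print(per_frame_step_px(1920, 60))  # 32.0 px at 60fps
```

At 30fps the object teleports 64 pixels per frame, which is why the individual steps are visible without motion blur to smear them.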
Indeed, especially as 2k and 4k textures have been widely used in games for a long time.
Resolution of textures is independent from the framebuffer size. (The good old "use a 4k texture for an arrow" case.)
Yes.
Funnily enough, with a 4k buffer one can get around 1080p worth of stable detail in a moving picture. (Smallest detail being 2 pixels in the 4k buffer; an estimate, but a lot better temporal stability than a 1-pixel smallest point.)
My hope is that some developers would try variable-rate shading with the MSAA trick or similar methods to get shading cost near 1080p and output similar to a 4k buffer, where the sharpest regions would have almost all the detail of native 4k.
Also, especially for games using HDR and a filmic look, I'd hope they use a proper gaussian resolve for AA to get the image as temporally stable as it can be. (Which reduces spatial resolution a bit, but would look a lot better than normal 1080p.)
Would also love to see a 'native' 4k rendered game which used virtual texturing and texture space shading to get really high quality results, especially if the game goes for a very static lighting environment, perhaps a dusty, low-specular world in an adventure game. (i.e. The Dig.)
With a slow-moving camera this could allow near-perfect image quality for HDR 4k. (A lot of shading could be reused for a long time without changes; the shading rate could get a lot lower than 1080p per frame.)
Consumer preference. Raising FPS is still more demanding than raising to 4k, especially at higher resolutions.
That image doesn't have to worry about things like aliasing. Because it's a photo of real life and real life doesn't have aliasing.
Aliasing comes in all forms. Something you need to take into account is the fact that real life also doesn't need to worry about LODs, mipmaps, etc. All of those trees, houses, etc in the distance are still perfectly detailed. If that was a video game, those distant objects would all be very complex for something so small. And with their maximum resolution textures, you'd get some really nasty shimmering or "pixel crawl." And the methods to reduce that are very expensive on the GPU (unless you want a blurry image). So if you want that sharp image, you either downsample from a larger resolution, or simply display the larger resolution natively.
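For context on why those distant objects shimmer: mipmapping exists precisely so that a far-away surface is sampled from a prefiltered, lower-resolution copy of the texture instead of its full-resolution version. A small sketch of the standard full-mip-chain length (the usual floor(log2(max dimension)) + 1 rule):

```python
import math

def mip_levels(width, height):
    """Number of levels in a full mip chain: halve the larger
    dimension repeatedly until it reaches 1x1."""
    return int(math.log2(max(width, height))) + 1

# A 4096x4096 texture carries a 13-level chain down to 1x1:
print(mip_levels(4096, 4096))  # 13
print(mip_levels(1024, 512))   # 11
```

When a distant tree covers only a handful of pixels, sampling from mip level 0 undersamples the texture and crawls with every camera move; the hardware instead picks a mip level whose texel density roughly matches the pixel density.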
It stuck out to me that Andrew House didn't use 4K as a selling point in his last statement about the Neo, but instead said that games will "play an awful lot prettier."
It's possible he just didn't want to unveil a marketing pitch early, but that might have also been the feedback they got from developers once they got their hands on the hardware.
One thing worth considering about sticking with 1080p is that 4K is exactly four times as many pixels, so the scaling is much, much cleaner than going from 720p -> 1080p or some other uneven multiple. I suspect that's why the TV industry jumped on this resolution as well, since they knew most of the content would be 1080p, but wanted it to still look clean and good on new sets.
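A quick way to see why the integer factor matters: at an exact 2x scale every source pixel maps to a whole 2x2 block, so even the crudest nearest-neighbour upscale introduces no blending at all, while a 1.5x factor (720p to 1080p) has no such whole-pixel mapping and forces filtering. A minimal illustrative sketch, not how a real TV scaler works:

```python
def upscale_nn(img, factor):
    """Nearest-neighbour upscale of a 2D grid by an integer factor:
    every source pixel becomes an exact factor-by-factor block."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in img for _ in range(factor)]

# 1080p -> 2160p is factor 2 on both axes, so pixels stay crisp blocks:
print(upscale_nn([[1, 2],
                  [3, 4]], 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
```

There is no integer `factor` that turns 1280 columns into 1920, which is why non-integer upscales always blend neighbouring pixels and soften the image.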
No, most definitely isn't. 1080p 60fps is easier on something as powerful as Scorpio than 4K 30fps. Far easier. I'm very happy to hear developers not falling into the 4K trap. This console's power will be far better utilized by not boxing your game into a 4K resolution requirement.
720p to 1080p is 2.25x the number of pixels (1.5x per axis).
And 1080p to 1440p is about 1.78x the pixels (1.33x per axis).
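For reference, the pixel-count ratios are just the per-axis scale factor squared:

```python
def pixels(w, h):
    """Total pixel count of a resolution."""
    return w * h

# 720p -> 1080p: 1.5x per axis, so 2.25x the pixels
print(pixels(1920, 1080) / pixels(1280, 720))   # 2.25
# 1080p -> 1440p: 1.33x per axis, so ~1.78x the pixels
print(pixels(2560, 1440) / pixels(1920, 1080))  # 1.777...
# 1080p -> 2160p: exactly 2x per axis, so exactly 4x the pixels
print(pixels(3840, 2160) / pixels(1920, 1080))  # 4.0
```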
Also, Andrew House absolutely knows that the PS Neo won't do 4k; it's an R9 280X / R9 380X equivalent console. It's a jump from a 750 Ti to a 280X, so 1080p/30fps to 1080p/60fps... it's definitely not a jump to 4k.
As someone running a 1440p/144hz screen... 1440p is taxing enough, and I honestly should have just bought a 1080p screen. A GTX 970 isn't really cutting it at this resolution, at least for things like The Witcher 3 and The Division, both of which I'm playing on medium settings and getting 60fps.
The Scorpio will be like a GTX 1070, but even then that will only get you 1440p/60 maxed, so be prepared for 4k games at medium/low settings.
It's going to take the GTX 1170, 1270 series to do 4k/60 maxed... that's still 1.5-3 years away, and at least one more console generation away. 2019 for proper 4k/60... by then people will be asking for 8k.
The Scorpio is the 480, which is about 40-50% slower than the 1070. It's a 1080p/60fps higher-settings card, or a 4k/30 card at mid/high.
Think of it this way, if it came down to it would you prefer these visuals at 1080p:
[image]
Or these visuals at 4K:
[image]
I think it's fairly clear that the first image is much more preferable, even if it sports a less sharp IQ.
it's not about still screenshots
high res makes things look much better in motion by making aliasing/flickering/etc. a lot less noticeable/distracting
Yeah. It's easy to get mixed up with AMD vs Nvidia flops. They use different methods to measure. Don't know why really.
Makes sense not to target 4K when most people have a 1080P TV, and those that do have a 4K TV aren't guaranteed to have a setup where they sit at a reasonable distance and have a large enough size to get the full benefit (though this equation is far less specific and distinct than those completely bullshit image comparisons that get posted).
As someone with a 75" 4K TV that I sit a moderate distance from this makes me sad though, people who haven't run modern games on a setup like this at native 4K really have no idea just how big of an impact that resolution jump can make to both the IQ and fine detail that is invisible at 1080p, it's not at all what you would think from simple screen shot comparisons when in motion.
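The "reasonable distance" part can be roughed out with the common 1-arcminute visual-acuity rule of thumb. This is a pure geometric sketch, not vision science, and the 75-inch diagonal below just mirrors the TV mentioned above:

```python
import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute: rough 20/20 acuity per pixel

def max_benefit_distance(diagonal_in, horizontal_px, aspect=16 / 9):
    """Farthest viewing distance (inches) at which a single pixel still
    subtends ~1 arcminute. Beyond this, pixels fall below the acuity
    limit and extra resolution is largely invisible."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / horizontal_px
    return pixel_in / math.tan(ARCMIN)

# 75" TV: 1080p pixels blend away beyond ~9.7 ft; 4K holds out to ~4.9 ft
print(max_benefit_distance(75, 1920) / 12)  # ~9.75 (feet)
print(max_benefit_distance(75, 3840) / 12)  # ~4.88
```

So on a 75" set you'd need to sit inside roughly ten feet before 4K starts paying off over 1080p by this crude rule, which matches the point about setups where the viewer sits too far away to get the full benefit.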
No right answer really, except to say that I think upping the resolution (of both effects and native rendering) is the easiest thing for developers to do when dealing with multiple SKUs that feature largely different GPU performance, so I expect that's what most games will feature, instead of being designed for 1080p with higher quality everything. The games will still have to run well on a 1.31 tflop Xbox One after all.
All this seems binary... why is everyone thinking yes or no, and why can't they imagine all the gradations possible between the two?
Sure, undoubtedly there's a difference; it's just that when you get 4x the power, do you want to spend it all on 4x the res? I would have thought that higher poly counts, higher-res textures, better AA and shaders would give a more preferable result.
For years we watched movies on DVD at a "poxy" 480/576p. They looked amazing (if not as obviously sharp as HD) and that was because of the realism, i.e. the shaders, the poly count meaning smoother surfaces, and all that jazz.
If you say to me "I'll give you the Witcher as it stands in 4k" or "I'll give you a photorealistic Witcher at 1080 (Or even 480)" I know what I'd go for.
AoS uses texture space shading. It's not very bandwidth efficient.
Interestingly they use it without virtual texturing and might have to do a lot of extra work due to it.
I agree
But my question was: why the need for a 4K *display*? I would rather the GPU be thrown at downsampling, better AA and a dozen other things, including frame rate, or some mix of these.
Simply using all the GPU to fill 4K at 60fps seems to miss an opportunity to increase quality for 95% of current TV sets.
This is a good way of thinking. I think the 4xMSAA 1080p-to-4K trick gives the best of both worlds; mix that up with a temporal component and you're smooth sailing with regards to IQ at 1080p or 4K.
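The arithmetic behind that trick: a 1080p render target with 4xMSAA stores exactly as many coverage samples as a native 4K target with no MSAA, which is why those samples can be resolved or reconstructed at 4K while the shading cost stays near one shade per 1080p pixel. A quick check:

```python
def samples(width, height, msaa):
    """Total coverage samples in a render target."""
    return width * height * msaa

# 1080p x 4xMSAA has the same sample grid as native 4K x 1:
print(samples(1920, 1080, 4))               # 8294400
print(samples(3840, 2160, 1))               # 8294400
print(samples(1920, 1080, 4) == samples(3840, 2160, 1))  # True
```

The sample positions differ from a plain 4K grid, of course, but the raw sample budget is identical, and shading runs at roughly a quarter of native-4K cost.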
The scorpio is the 480. Which is about 40-50% slower than the 1070. It's a 1080p/60fps higher settings card. Or a 4k/30 card at mid/high.
The 480 is 5.5 TFLOPs; Scorpio is 6.
I really struggle to take the pro 4K people seriously when they're almost certainly playing on a sample-and-hold display with atrocious motion resolution. Don't get me started on 2160p30.
From what we've seen from the 480 it's not a good OC card. I'm presuming it's not the 480.
GPU Tflops change depending on the clock speed. The base model of the 480 on PC is 1266MHz for instance, which means it's a 5.8 Tflop card non-OC'd. So the Scorpio IMO is likely using a 480-level GPU in its APU at ~1300MHz.
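The TFLOP figures being thrown around all follow from one simple formula: shader cores x 2 FP32 ops per clock (one fused multiply-add) x clock speed. A quick check against the RX 480's 2304 shaders (the ~1300 MHz Scorpio clock below is the poster's guess, not a confirmed spec):

```python
def tflops(shader_cores, clock_mhz):
    """Theoretical FP32 throughput: cores x 2 ops/clock (FMA) x clock."""
    return shader_cores * 2 * clock_mhz * 1e6 / 1e12

# RX 480: 2304 shaders at its 1266 MHz boost clock
print(tflops(2304, 1266))  # ~5.83
# The same 2304-shader GPU at ~1300 MHz lands near Scorpio's quoted 6 TFLOPs
print(tflops(2304, 1302))  # ~6.0
```

This is also why the earlier point about AMD vs Nvidia flops holds: the formula is identical for both vendors; only the real-world utilization of those theoretical ops differs.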
Well, they're really not different in reality. A TFLOP is a measure of theoretical computational performance, and AMD flops and Nvidia flops are the same. Nvidia cards, at least on DX11, simply performed much better, so Nvidia flops were considered better than AMD flops, again at least for DX11. At least from my understanding.