I wouldn't be posting things like that if I weren't working on getting as much information as possible and reporting it.

You are the journalist with connections and a site to bring this stuff out to the people, so make it happen!
follow the tweets and you find this...
oh my god..
Well, what we see now might be a mixture of both inferior hardware and inferior dev tools.
Let's say PS4 power is 100%, and Xbone is 40% now because of both factors.
If they fix the tools, Xbone power might rise to 65%.
That way, the claim that the next COD will be 900p might be true.
So those saying that ESRAM can be harnessed down the road are full of shit? If so, how do you know?
That makes sense now, does it?
Turn off the fish AI and add more dogs then bump the resolution.
I believe they are for the most part. Writing a game engine a certain way might help to get around some issues with the XB1 architecture, but at the end of the day you can't make something out of nothing. You can better utilize what you have, but you can't completely alter what you have. The PS4 has more to offer developers. Quite a bit more in fact, and I can assure you these launch games aren't utilizing everything they can on the PS4 either.
wait which version have the most graphics? I don't have time to read the whole thread.
Can you not post random twitter posts from random people who know fuck all about games consoles?
I can't even fathom the mental gymnastics you have to do to believe this
The more interesting question here - and something I don't think we know yet - is why the game runs at 720p on XB1 and 1080p on PS4. Now that's a question I want to find the answer to. Everyone's assuming the explanation is that IW had to sacrifice resolution to get the XB1 version performing well enough, and that might be the case, but there could be other factors here, too. Maybe they just got XB1 devtools late and didn't have enough time to acclimate.
I can't fathom how you're taking it seriously

These are dark times. People have posted things that are more insane while being 100% serious. I realize my fault though.
Always with the specs. Why not come up with a new reason?

PS4 has a better GPU.
PS4 has much faster RAM, and no ESRAM bottleneck.
This has been known for months.
Sorry. From Major Nelson's timeline. Found it... funny.
The whole thing is pretty funny if you ask me:
https://twitter.com/majornelson/status/395876512487854080
In BF3 the UI got smaller as the resolution went higher.
I think you mean games using deferred rendering are having issues with fitting the framebuffer within the 32MB ESRAM at 1080p?

This is almost certainly known right now. The XB1's ESRAM is a bottleneck on memory that the PS4 simply does not have. If a game engine utilizes a forward rendering solution (and many games do), that ESRAM only allows the framebuffer to be 32MB, and that makes hitting 1080p very difficult.
The exclusive XB1 games may find ways around this, by customizing the engine to the XB1's weaknesses, as we're seeing with Forza, which runs at 1080p/60fps. But most devs are not going to go out of their way to work around the XB1's problems, so we're getting the results we're getting.
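The 32MB budget claim above can be sanity-checked with some back-of-the-envelope arithmetic. The render-target counts and the 4-bytes-per-pixel format below are my own illustrative assumptions, not figures from any real engine:

```python
# Rough render-target sizes at 1080p (illustrative assumptions only:
# 4 bytes per pixel per target; forward = colour + depth; deferred =
# a 4-target G-buffer plus depth).

MB = 1024 * 1024

def target_bytes(width, height, bytes_per_pixel):
    """Size of one render target in bytes."""
    return width * height * bytes_per_pixel

# Forward: one colour target + one depth target.
forward = 2 * target_bytes(1920, 1080, 4)

# Deferred: four G-buffer targets + one depth target.
deferred = 5 * target_bytes(1920, 1080, 4)

print(f"forward:  {forward / MB:.1f} MB")   # ~15.8 MB, fits in 32 MB ESRAM
print(f"deferred: {deferred / MB:.1f} MB")  # ~39.6 MB, over the 32 MB budget
```

Under these assumptions a forward pipeline fits comfortably while a deferred G-buffer blows past the 32MB, which is why the deferred-vs-forward distinction keeps coming up in this thread.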
Thanks for the response. A few more questions: how do you know that this is the case? Do you have experience with ESRAM?
I think you mean games using deferred rendering are having issues with fitting the framebuffer within the 32MB ESRAM at 1080p?
These are dark times. People have posted things that are more insane while being 100% serious. I realize my fault though.

Oh it definitely can be hard at times to tell the difference. There's usually at least one line that gives it away though. In this case, having crushed blacks being an Xbox One exclusive feature was the main standout.
This is where it originated from.
GAF > Internet (reddit ?) > GAF
The cycle of life indeed.
Also, anyone else notice the lack of talk about Xbox Live compute from certain people these days? Maybe dem pixels can't be rendered in the clouds after all.
wait which version have the most graphics? I don't have time to read the whole thread.

The Mac version.
Nothing irks me more than ignorance.
Can't stand stupidity.
Hahaha justification for having lower native res. Scaler is inferior!
I assume you're looking into it. I'd be interested to get some answers.
Well, Xbox One has svogi version 1.4 where the PS4 has version 1.3. 1.4 added texticular occultation and narrowed the subpixel PWM depth rate from ~1.7Gf/s down to about ~1.2Gf/s (depending on which benchmark you believe). That along with the compressed vertex sparse tree matrix flyweight algorithm patented by MS and only in the Xbox One flavor of DX 11.2+ is going to give them a substantial bandwidth advantage once developers figure it all out. The November dev kits go out tomorrow so maybe too late for launch but E3 is going to be interesting
Hope you find something out, keep us posted.
wat
I'll just quote this again, and add that the whole idea that you can distinguish between software and hardware scaling on a modern GPU based on the quality of the output is bullshit anyways. You can write a software scaling algorithm which creates a higher quality output than any hardware scaler I know of.

I don't know if I'm LTTP, but I see this Xbox scaler secret sauce FUD being stated by a bunch of posters, and it's time for some facts:
The PS3, PS4, 360, One ALL have hardware scalers. These scalers are simply GPU fixed-function blocks in all of them and they don't cost anything to use. The scalers are not separate chips, they're just a section inside the GPU.
The PS3's RSX has a hardware scaler that's shitty and broken, as it cannot scale an image both vertically and horizontally. It can only scale horizontally. It's used in games like GT5 and Motorstorm Apocalypse (2D mode), where it stretches a 1280x1080 native render to 1920x1080, or in Wipeout, where the resolution is dynamic but only scaled horizontally between 960x1080 through 1280x1080, 1600x1080 etc., only along the horizontal axis.
This means that any game that's not rendering at somethingx1080 or native 720p is being upscaled via software on the PS3. This includes all the COD games, GTA4, Crysis 2,3, etc. This is done at a minor cost to performance and more importantly memory, which is very precious on the PS3. This also explains why most PS3 games output at 720p, as there is no way to scale 720p to 1080p in hardware in the PS3 and it leaves it up to your TV to scale it. ANY SOURCE that's not 1080p that you watch on a 1080p TV is scaled to 1080p by your TV no matter what the input is, unless you letterbox it and use only half your TV to view it, which nobody does.
The lack of a proper scaler that works in both dimensions in the PS3 is also the reason why some games render at 480p when you don't have the 720p output option checked in the preferences, like some people did 8 years ago when there were many TV's that supported 1080i (1080i is the same resolution as 1080p, it's just not progressive) but not 720p. Uncharted 2 will render at 960x1080 and use the RSX's scaler if you disable 720p and enable 1080. Towards the end of the generation, most games on PS3 just used a software scaling solution that cost very little performance, and more importantly some memory. Also even very little performance can make a difference when you need all of it, obviously.
The Xbox 360 did not have any of these problems because the AMD GPU inside can scale any resolution to 1080p just fine.
The PS4 and Xbox One have the same hardware scalers that exist in ALL AMD 7xxx series cards and can scale any output resolution to 1080p (and maybe beyond to 4K). Those scaling hardware blocks are GPU components just like the CU's and ROPS.
The reason for the PS4 scaler FUD was due to DF not really knowing what they're talking about and saying this in their BF4 face-off:
http://www.eurogamer.net/articles/digitalfoundry-battlefield-4-next-gen-vs-pc-face-off-preview
"This should surely be a home run for Sony's console, but what is likely to be a software-based upscale to 1080p delivers less-than-stellar returns, and for better or worse leaves the Xbox One with an often crisper looking, albeit much more aliased image."
This DF quote, ladies and gentlemen, proves beyond a reasonable doubt that DF DO NOT KNOW WHAT THEY'RE TALKING ABOUT. DF looked at the Xbox One's incorrect gamma and sharpening filter, the PS4's motion blur, and the FXAA solution in the PS4 version that slightly blurs textures, and immediately went "Xbox super awesome scaler > PS4", which is false. They simply projected the scaling issues from the PS3 onto the PS4 and incorrectly guessed that the PS4 is doing a software upscale. THERE IS NOTHING COMMON ABOUT SCALING BETWEEN THE PS3 AND PS4, THEY USE DIFFERENT GPU VENDORS! All 7xxx series cards and APUs based on them have hardware scalers, and developers have no reason not to use them as they're free.
As far as scaling games, both consoles have equivalent scalers. Only difference is MS can scale 3 different outputs, required for Game, UI, Snap, compared to 2 for PS4.
Sorry for the long read but this "DF software upscaler FUD" really pissed me off and I wanted to set things straight.
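The "you can write a software scaling algorithm" point above can be illustrated with a tiny sketch. This is my own minimal nearest-neighbour example in pure Python, purely to show that software scaling is straightforward; real software scalers use far better filters (bilinear, Lanczos, etc.) and run on the GPU:

```python
# Minimal software upscale sketch: nearest-neighbour, pure Python.
# Pixels are stored as a flat row-major list.

def upscale(pixels, src_w, src_h, dst_w, dst_h):
    """Nearest-neighbour upscale of a row-major pixel list."""
    out = []
    for y in range(dst_h):
        sy = y * src_h // dst_h   # source row for this destination row
        for x in range(dst_w):
            sx = x * src_w // dst_w   # source column for this destination column
            out.append(pixels[sy * src_w + sx])
    return out

# Upscale a 2x2 image to 4x4: each source pixel becomes a 2x2 block.
img = [1, 2,
       3, 4]
print(upscale(img, 2, 2, 4, 4))
# [1, 1, 2, 2, 1, 1, 2, 2, 3, 3, 4, 4, 3, 3, 4, 4]
```

The point of the sketch is that the algorithm is just index arithmetic; the quality question is entirely about which filter you use, not whether the work happens in hardware or software.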
People are forgetting that Microsoft has a deal with developers that no rival consoles may have more graphics than them. That means that the PS4 and XBox One version of Ghosts has the same amount of graphics. But keep in mind the PS4 is 1080p which means the graphics are stretched, while XBox One's 720p is the sweet spot and graphics are much closer together which gives a better-looking, more succinct experience.
In conclusion, if you run the same game on 1080p hardware and 720p hardware, the 720p will look way better. Now sprinkle that with some crushed blacks (XBox One exclusive feature) and the games look-not next-gen- but neo-gen (that's more than next-gen).
Hope this clears stuff up.
These types of people can't and won't get informed, sadly.
They can't possibly accept that their favourite brand of console is fucked up this time around. So they'll just stick to reading stuff like mrxmedia's rubbish, which tells you, for instance, that HANA 2 (lolz) can even double FPS at no cost, even if it goes against all logic and reason.
They're a bit like creationists.
This whole "joke" post/GIF/screenshot thing is getting really fucking tiresome.
Console-war in-jokes. Laugh at them, laugh at people who don't get them.
Grow up.
Always with the specs. Why not come up with a new reason?
Specs specs specs, BORING. It's all about the Ryse gameplay.
I'm not a developer at all, but I've read technical breakdowns for years on sites like Ars Technica, and Digital Foundry, to the point where I have a solid understanding of how this stuff works. Anyone who knows more than I do could certainly show up and knock down the things I'm saying, but I don't believe you'll see that happen.