
PES 2015 : 1080P 60 FPS on PS4 - 720P 60 FPS on X1

Biker19

Banned
That didn't happen the last time on the PS3, when all late 1st party titles ran at 720p and pushed the system further than any of the 3rd parties did. The trade-off was made elsewhere, rather than by lowering the resolution. That's why I don't think it'll happen this time either, at least when it comes to SCE studios.

We may not see it now, but once 3rd party developers start putting out newer, more advanced engines, the differences are definitely going to start showing between the two consoles.
 

Melchiah

Member
We may not see it now, but once 3rd party developers start putting out newer, more advanced engines, the differences are definitely going to start showing between the two consoles.

I wasn't saying the differences would begin to diminish, or comparing PS4 and XB1. I was saying PS4 games don't seem likely to go sub-1080p, as SCE's PS3 games ran at 720p until the very end.
 

onQ123

Member
I still don't buy the excuse. If KI was important to Microsoft, they'd make the resources available.

KI is only on Xbox One, so it's not going to lose them any sales because it's 720P. So there isn't a real push from MS to try to make the game run at a higher resolution.
 
2424232-6588996895-iydgE.gif

hehe, brilliant gif.

Anyway, it seems odd. Isn't Fox engine supposed to work well on multiple platforms, including PC?
 

twobear

sputum-flecked apoplexy
We may not see it now, but once 3rd party developers start putting out newer, more advanced engines, the differences are definitely going to start showing between the two consoles.

Well, except that 720p to 1080p is much, much larger than the 40% paper spec difference.
 

CLEEK

Member
Yeah, and you'll notice a commonality amongst them. They're launch games.

According to reports the SDK was pretty poor leading up to launch, and the Kinect was taking up 10% of the GPU time. Those two things have now been fixed, which is why 720p games have been so rare since launch.

Fox Engine uses deferred rendering. The 32MB eSRAM is the main issue here. Nothing to do with launch games, SDKs, GPU reservations, or any other technical bottleneck the Xbox One has compared to the PS4.

The more effects you want, the larger the bytes per pixel of the g-buffer gets. With the PS4, this isn't an issue, as you have as much of the available GDDR5 to play with as you want.

So you have the option to either reduce the resolution to one where the g-buffer fits into the 32MB eSRAM, or vastly reduce the effects to bring down the bytes per pixel of the g-buffer. The former gives a consistent look across formats (resolution aside); the latter requires far more work and effectively means having a separate engine for the Xbox One version. With a multi-format game, it would be crazy to go with the second option over the first.

There are deferred rendering games that run at 900p, or even 1080p, on the Xbox One. That means the size of the g-buffer for these games is much less than what Konami have decided on with MGS and PES. That's all.
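To put rough numbers on that, here's a quick back-of-the-envelope sketch (my own figures, assuming the entire 32MB were free for the g-buffer, which it never is in practice):

```cpp
// Upper bound on g-buffer bytes per pixel that fits in 32 MB of eSRAM.
// Illustrative only; real engines also keep depth and other targets there,
// so the practical budget is lower than these ceilings.
#include <cstdio>

int main() {
    const double esramBytes = 32.0 * 1024 * 1024;  // 32 MB
    const struct { const char* name; int w, h; } resolutions[] = {
        {"720p",  1280, 720},
        {"900p",  1600, 900},
        {"1080p", 1920, 1080},
    };
    for (const auto& r : resolutions) {
        const double maxBpp = esramBytes / (double(r.w) * r.h);
        std::printf("%-5s -> ~%.0f bytes/pixel max\n", r.name, maxBpp);
        // prints roughly: 720p ~36, 900p ~23, 1080p ~16
    }
    return 0;
}
```

So a deferred game shipping at 900p or 1080p on the Xbox One is necessarily working with a much leaner g-buffer than one that drops to 720p.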
 

Journey

Banned
Fox Engine uses deferred rendering. The 32MB eSRAM is the main issue here. Nothing to do with launch games, SDKs, GPU reservations, or any other technical bottleneck the Xbox One has compared to the PS4.

The more effects you want, the larger the bytes per pixel of the g-buffer gets. With the PS4, this isn't an issue, as you have as much of the available GDDR5 to play with as you want.

So you have the option to either reduce the resolution to one where the g-buffer fits into the 32MB eSRAM, or vastly reduce the effects to bring down the bytes per pixel of the g-buffer. The former gives a consistent look across formats (resolution aside); the latter requires far more work and effectively means having a separate engine for the Xbox One version. With a multi-format game, it would be crazy to go with the second option over the first.

There are deferred rendering games that run at 900p, or even 1080p, on the Xbox One. That means the size of the g-buffer for these games is much less than what Konami have decided on with MGS and PES. That's all.



Correct me if I'm wrong, but isn't deferred rendering used as a performance-saving measure as opposed to a more taxing forward renderer? At least that's what I thought I was reading in AMD's tech demo description way back when they launched the 7970.

AMD Radeon HD 7900 series DirectX 11 tech demo that demonstrates the rendering of complex lighting that would normally require a deferred rendering path for reasonable performance, in a forward renderer, thus maintaining universal hardware MSAA support and proper alpha blending support. This technique also supports one bounce global illumination effects by spawning virtual point light sources where light strikes a surface.

https://www.youtube.com/watch?v=4gIq-XD5uA8

edit: just a technical question that has nothing to do with the gap between X1 and PS4.
 

benny_a

extra source of jiggaflops
Correct me if I'm wrong, but isn't deferred rendering used as a performance-saving measure as opposed to a more taxing forward renderer? At least that's what I thought I was reading in AMD's tech demo description way back when they launched the 7970.

https://www.youtube.com/watch?v=4gIq-XD5uA8

edit: just a technical question that has nothing to do with the gap between X1 and PS4.
AMD's Leo demo is demonstrating Forward+. It's not just Forward.

It goes roughly like this: Forward -> Deferred -> Forward Plus.
(The Plus means it takes things that Deferred does and merges it with Forward.)
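If it helps, the difference is mostly in the shape of the shading loops. A stub-level sketch (my own illustration of the general techniques, not how Fox Engine or the Leo demo are actually written):

```cpp
// Schematic loop shapes only: Forward -> Deferred -> Forward+.
// Stub types; this shows control flow and memory cost, not a real renderer.
#include <cstddef>
#include <vector>

struct Object {};
struct Light {};
struct GBufferPixel {};  // normal, albedo, roughness... the bytes-per-pixel cost

// Forward: shade every object against every light in one pass.
void forwardRender(const std::vector<Object>& objs, const std::vector<Light>& lights) {
    for (const auto& o : objs)
        for (const auto& l : lights) { (void)o; (void)l; /* shade(o, l) */ }
}

// Deferred: write per-pixel material data to a g-buffer, then shade per pixel.
// The g-buffer is what has to live in fast memory (32MB eSRAM on Xbox One).
void deferredRender(const std::vector<Object>& objs, const std::vector<Light>& lights,
                    std::vector<GBufferPixel>& gbuffer) {
    for (const auto& o : objs) { (void)o; /* write g-buffer */ }
    for (auto& px : gbuffer)
        for (const auto& l : lights) { (void)px; (void)l; /* shade(px, l) */ }
}

// Forward+: cull lights into screen-space tiles first, then forward-shade each
// object against only its tile's lights -- many dynamic lights, no fat g-buffer.
void forwardPlusRender(const std::vector<Object>& objs, const std::vector<Light>& lights,
                       std::size_t tileCount) {
    std::vector<std::vector<const Light*>> tileLights(tileCount);
    for (const auto& l : lights) { (void)l; /* assign light to overlapping tiles */ }
    for (const auto& o : objs)  { (void)o; /* forward shade vs. tileLights[tileOf(o)] */ }
}

int main() { return 0; }
```

The tile-culling pre-pass is why the Leo demo is pitched as a DirectX 11 technique.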
 

Gestault

Member
I know. And the differences are going to wind up being in PS4's favor.

I think they were saying this isn't the best indicator of the underlying spec disparity so much as it is a particular challenge of Fox Engine, since the performance gap is greater than should result from the raw hardware stats.
 
I think they were saying this isn't the best indicator of the underlying spec disparity so much as it is a particular challenge of Fox Engine, since the performance gap is greater than should result from the raw hardware stats.

Yes. That is what I interpreted, too.
 
Killer Instinct Season 2's still going to be 720p60, no?

I guess it never occurred to you that Microsoft might view KI as a budget project, and when hiring a new developer for the DLC they weren't about to pay for them to overhaul the game engine? It seems rather obvious.
 

twobear

sputum-flecked apoplexy
I know. And the differences are going to wind up being in PS4's favor.

So you think that the difference is going to get bigger than 720p to 1080p for the same game?

Because that's a pretty huge difference, and much larger than the paper specs would suggest.
 

Compbros

Member
I guess it never occurred to you that Microsoft might view KI as a budget project, and when hiring a new developer for the DLC they weren't about to pay for them to overhaul the game engine? It seems rather obvious.



Do we have any reports of the MS engineers being contracted by the devs/pubs to help out their game on XBO? If not, then it's MS paying out of pocket so that a third-party game they'll see little money from on each sale looks/performs better on their console, for public perception or what have you. Seems like doing it for an in-house title, even a "budget project", would be the better play or, at least, worth whatever money to bump up the resolution.


Regardless, KI S2 has had a rather large overhaul as they've changed the UI, added some combat mechanics, developed new characters, and completely rebalanced the game on top of quite a few other changes.
 
Correct me if I'm wrong, but isn't deferred rendering used as a performance-saving measure as opposed to a more taxing forward renderer? At least that's what I thought I was reading in AMD's tech demo description way back when they launched the 7970.



https://www.youtube.com/watch?v=4gIq-XD5uA8

edit: just a technical question that has nothing to do with the gap between X1 and PS4.

I am far from an expert in graphics, but it is my understanding that deferred rendering trades memory for performance when compared to forward rendering. So if you have the memory to spare, you can get the performance increase. However, if memory is tight, then you won't be able to make use of deferred rendering unless you are willing to sacrifice image quality.
 
Do we have any reports of the MS engineers being contracted by the devs/pubs to help out their game on XBO? If not, then it's MS paying out of pocket so that a third-party game they'll see little money from on each sale looks/performs better on their console, for public perception or what have you. Seems like doing it for an in-house title, even a "budget project", would be the better play or, at least, worth whatever money to bump up the resolution.


Regardless, KI S2 has had a rather large overhaul as they've changed the UI, added some combat mechanics, developed new characters, and completely rebalanced the game on top of quite a few other changes.

Destiny and Killer Instinct. One is a massive high profile new IP, with millions of dollars in advertising being spent on it, and the other is a downloadable game that can be downloaded for free. There is your answer.
 

CLEEK

Member
So you think that the difference is going to get bigger than 720p to 1080p for the same game?

Because that's a pretty huge difference, and much larger than the paper specs would suggest.

Depends on the paper spec. The Xbone's biggest bottleneck is its 32MB fast RAM. The PS4 has ~5GB of fast RAM. That is a massive difference, and the root cause of the resolution differences.

It's basic sums. I mentioned this in a previous post. Infamous SS is a good example, as they provided the technical details of their engine. They had a g-buffer of 40 bytes per pixel and output at 1080p. This needs to be stored in fast RAM, which in the case of the Xbox, is the eSRAM.

1920x1080x40 = 82944000 bytes (79.1MB)

If they'd dropped the resolution to 720p, the g-buffer would have been:

1280x720x40 = 36864000 bytes (35.1MB)

For 720p, the g-buffer can't be bigger than 36 BPP to fit into the 32MB eSRAM. If they want a 1080p image, the g-buffer can be a maximum of 16 BPP. This vastly reduces what effects can be applied.
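The same sums as a trivial sanity check (my own sketch, reusing the 40 bytes per pixel figure above):

```cpp
// G-buffer footprint at 40 bytes/pixel vs. the Xbox One's 32 MB eSRAM.
#include <cstdio>

int main() {
    const long long esram   = 32LL * 1024 * 1024;        // 33,554,432 bytes
    const int       bpp     = 40;                        // Infamous SS g-buffer, bytes per pixel

    const long long at1080p = 1920LL * 1080 * bpp;       // 82,944,000 bytes
    const long long at720p  = 1280LL * 720  * bpp;       // 36,864,000 bytes

    std::printf("1080p: %lld bytes (%.1f MB)\n", at1080p, at1080p / (1024.0 * 1024.0));
    std::printf(" 720p: %lld bytes (%.1f MB)\n", at720p,  at720p  / (1024.0 * 1024.0));
    std::printf("eSRAM: %lld bytes (32.0 MB)\n", esram);
    // Neither fits: even at 720p, a 40 bytes/pixel g-buffer overshoots 32 MB.
    return 0;
}
```

Even the 720p figure overshoots the 32MB, which is why the g-buffer has to be slimmed down as well as, or instead of, the resolution.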
 

Compbros

Member
Destiny or Killer Instinct. One is a massive high profile new IP, with millions of dollars in advertising being spent on it, and the other is a downloadable game that can be downloaded for free.


Also Diablo 3, which saw a lot of bad press from the PC launch. And yes, it can be downloaded for free, but practically every character costs money: $5 per character that can never be returned or bought used, all of which goes to MS, versus Destiny, Diablo 3, and potentially PES. Making their own game look as good as possible seems like a better move than making Diablo 3 1080p "because".
 

CLEEK

Member
I was reading about The Order 1886. They started off with deferred rendering, but with all the material simulations and effects they wanted, the g-buffer became too big even for the PS4, so they reverted back to forward rendering. So you can have next gen games without using deferred. It just depends on the game and what the devs are wanting.
 

Alchemy

Member
Well, except that 720p to 1080p is much, much larger than the 40% paper spec difference.

It's not about the total spec difference, it is about a bottleneck. If the Fox Engine is heavily dependent on render targets, then it will be horribly bottlenecked by the eSRAM in the Xbone. GPU basically means nothing at that point; memory bandwidth controls what you can do entirely.

It basically means completely rewriting the engine to work better on the Xbone.
 
I never knew GAF had so many FOX engine experts til this thread.

Also Diablo 3, which saw a lot of bad press from the PC launch. And yes, it can be downloaded for free, but practically every character costs money: $5 per character that can never be returned or bought used, all of which goes to MS, versus Destiny, Diablo 3, and potentially PES. Making their own game look as good as possible seems like a better move than making Diablo 3 1080p "because".

Diablo 3 - has sold millions of copies on PC, and now sells for $60 on XB1.

Destiny - has sold millions of copies at $60 a pop.

Killer Instinct - is a free downloadable game that offers downloadable extras.

Do we really need to go over this again?
 

CLEEK

Member
I never knew GAF had so many FOX engine experts til this thread.

It's not about being experts on the Fox Engine (although there are technical articles that cover it in depth if you want to read about it). It's about knowing the difference between forward and deferred rendering. No different from going on a car forum and finding people know the difference between petrol and diesel engines.

I am far from an expert in graphics, but it is my understanding that deferred rendering trades memory for performance when compared to forward rendering. So if you have the memory to spare, you can get the performance increase. However, if memory is tight, then you won't be able to make use of deferred rendering unless you are willing to sacrifice image quality.

This explains the difference between deferred and forward. Fast memory is a requirement, which is the limiting factor in the Xbone.

Which to Pick?

The short answer is, if you are using many dynamic lights then you should use deferred rendering. However, there are some significant drawbacks:

This process requires a video card with multiple render targets. Old video cards don't have this, so it won't work on them. There is no workaround for this.

It requires high bandwidth. You're sending big buffers around and old video cards, again, might not be able to handle this. There is no workaround for this, either.

You can't use transparent objects. (Unless you combine deferred rendering with Forward Rendering for just those transparent objects; then you can work around this issue.)

There's no anti-aliasing. Well, some engines would have you believe that, but there are solutions to this problem: edge detection, FXAA.

Only one type of material is allowed, unless you use a modification of deferred rendering called Deferred Lighting.

Shadows are still dependent on the number of lights, and deferred rendering does not solve anything here.

If you don't have many lights or want to be able to run on older hardware, then you should stick with forward rendering and replace your many lights with static light maps. The results can still look amazing.
 

Compbros

Member
I never knew GAF had so many FOX engine experts til this thread.



Diablo 3 - has sold millions of copies on PC, and now sells for $60 on XB1.

Destiny - has sold millions of copies at $60 a pop.

Killer Instinct - is a free downloadable game that offers downloadable extras.

Do we really need to go over this again?



D3's audience is on PC, and it sells for $60, of which MS sees very little.

Destiny has sold millions and, again, MS sees little of it.

KI gives you a character or two out of 10; to buy everything in KI right now would be $60, and MS sees all of that.


I'm just not understanding why, for titles they get little money from, they're willing to send out engineers at their own expense for a 180p bump, but they don't feel the need to do the same for their own titles.
 
D3's audience is on PC, and it sells for $60, of which MS sees very little.

Destiny has sold millions and, again, MS sees little of it.

KI gives you a character or two out of 10; to buy everything in KI right now would be $60, and MS sees all of that.


I'm just not understanding why, for titles they get little money from, they're willing to send out engineers at their own expense for a 180p bump, but they don't feel the need to do the same for their own titles.

Because the Killer Instinct engine was completed more than a year ago, and then Microsoft had to find a new team to hand that engine over to for the new DLC. Asking a new team to take someone else's engine and rebuild the rendering pipeline, etc, is a big ask, especially when Microsoft doesn't see this game as a big money maker. This is a fan service, small time project, meant to garner good will and nostalgia from old school gamers.

A lot of people actually bought new consoles to play Destiny. Diablo 3 is a good selling franchise. They're far more important to Microsoft than KI.
 
I was reading about The Order 1886. They started off with deferred rendering, but with all the material simulations and effects they wanted, the g-buffer became too big even for the PS4, so they reverted back to forward rendering. So you can have next gen games without using deferred. It just depends on the game and what the devs are wanting.

Source on this?
I thought they were using Forward+ because they wanted to add msaa?
 

Compbros

Member
Because the Killer Instinct engine was completed more than a year ago, and then Microsoft had to find a new team to hand that engine over to for the new DLC. Asking a new team to take someone else's engine and rebuild the rendering pipeline, etc, is a big ask, especially when Microsoft doesn't see this game as a big money maker. This is a fan service, small time project, meant to garner good will and nostalgia from old school gamers.

A lot of people actually bought new consoles to play Destiny. Diablo 3 is a good selling franchise. They're far more important to Microsoft than KI.



But sending people who don't know anything about the engine to help bump up the resolution isn't a big ask? This new team has done so much more to the engine than just adding characters and, during that time, MS didn't see the need to send engineers to help them out?




Yes they did, and D3 is a good-selling franchise on PC; there's no telling for the new consoles. Even so, what about PES, if that's what's happening? A franchise that doesn't sell gangbusters being helped out by MS because of the negativity it has garnered. Is KI less important than PES?


Edit: Is it even about what's important to MS? Watch Dogs didn't get the bump even with all the press and pre-orders it got.
 

twobear

sputum-flecked apoplexy
Depends on the paper spec. The Xbone's biggest bottleneck is its 32MB fast RAM. The PS4 has ~5GB of fast RAM. That is a massive difference, and the root cause of the resolution differences.

It's basic sums. I mentioned this in a previous post. Infamous SS is a good example, as they provided the technical details of their engine. They had a g-buffer of 40 bytes per pixel and output at 1080p. This needs to be stored in fast RAM, which in the case of the Xbox, is the eSRAM.

1920x1080x40 = 82944000 bytes (79.1MB)

If they'd dropped the resolution to 720p, the g-buffer would have been:

1280x720x40 = 36864000 bytes (35.1MB)

For 720p, the g-buffer can't be bigger than 36 BPP to fit into the 32MB eSRAM. If they want a 1080p image, the g-buffer can be a maximum of 16 BPP. This vastly reduces what effects can be applied.

Right, but my point was more that if you have to drop the resolution to 720p, then aren't you just leaving a whole bunch of Xbone's hardware totally underutilised?

Also, unless I'm mistaken, does this mean that even at 720p, SS's g-buffer couldn't fit into the eSRAM? Does this mean that we might actually see sub-720p Xbone games towards the end of the generation?

Truly a spectacular fuck-up by MS. But I guess at this point it's really just reiterating a point made many, many times since we learned last year that they'd skimped out on hardware.
 
Right, but my point was more that if you have to drop the resolution to 720p, then aren't you just leaving a whole bunch of Xbone's hardware totally underutilised?

Also, unless I'm mistaken, does this mean that even at 720p, SS's g-buffer couldn't fit into the eSRAM? Does this mean that we might actually see sub-720p Xbone games towards the end of the generation?

Truly a spectacular fuck-up by MS. But I guess at this point it's really just reiterating a point made many, many times since we learned last year that they'd skimped out on hardware.

That is why they call it a bottleneck. :)
 

CLEEK

Member
Right, but my point was more that if you have to drop the resolution to 720p, then aren't you just leaving a whole bunch of Xbone's hardware totally underutilised?

Also, unless I'm mistaken, does this mean that even at 720p, SS's g-buffer couldn't fit into the eSRAM? Does this mean that we might actually see sub-720p Xbone games towards the end of the generation?

Truly a spectacular fuck-up by MS. But I guess at this point it's really just reiterating a point made many, many times since we learned last year that they'd skimped out on hardware.

Infamous was a PS4 exclusive, so the engine was tailored to the strengths of the console, one of the main ones being an abundance of high-bandwidth memory. DAT GDDR5 etc.

If you're developing a multi-platform game, you have to balance between the stronger and weaker hardware. Unless you have the capacity to develop two different engines, you have trade-offs on both formats: the PS4 won't be pushed as hard as it could be, and the XB1 will have downgrades somewhere.
 
I look forward to lambasting Digital Foundry's face-off when they blame the developers again instead of praising the strengths of the PS4's hardware, like they did with Bayonetta back in the day for the 360.
 
Infamous was a PS4 exclusive, so the engine was tailored to the strengths of the console, one of the main ones being an abundance of high-bandwidth memory. DAT GDDR5 etc.

If you're developing a multi-platform game, you have to balance between the stronger and weaker hardware. Unless you have the capacity to develop two different engines, you have trade-offs on both formats: the PS4 won't be pushed as hard as it could be, and the XB1 will have downgrades somewhere.

You don't need two separate engines. You just have two different render pipes. This is already the case for the Fox Engine. Adding proper eSRAM support is not another huge step.
 

Ateron

Member
This reminds me of that situation at the beginning of the gen where Madden (I believe that was the game) ran at double the framerate on the 360. It had nothing to do with the ps3 lacking power; the engine was more suited to the 360 and the cell made things more complicated. I think the problem was rectified in the following releases.

I downloaded the demo yesterday on the ps4 and enjoyed it a lot, but I really don't see anything so demanding that the x1 has to render at 720p. Probably tweaking the engine a bit would net them better results with the smaller bandwidth, but I don't think it's something that Konami can pull off overnight, so maybe MGSV TPP will suffer from the same problem, being the same engine and all.

There are definite differences between both consoles' hardware, but this is being blown out of proportion, and I say that as a ps fan. This is definitely an engine problem, not a raw power one. I'm not one to pull the lazy devs card often, but it's kinda pathetic to see a game like this run at 720p. Maybe they need to rewrite the engine to take advantage of the x1, but I don't see that happening right away.
 

Alchemy

Member
Use the API that Microsoft provides. If this is already the case, then make sure it's being used properly. Not everything should be allocated in eSRAM.

Changing the entire way you render an image is a pretty huge fucking task that completely changes how you optimize your game engine, the way data is stored and sorted, and the like. There is no magic Microsoft API that can fix that. And when you're using deferred rendering you have to stick your render targets in fast memory; on the Xbone, that leaves only the eSRAM.
 

Inuhanyou

Believes Dragon Quest is a franchise managed by Sony
There is a power gap between PS4 and XBox One but it is not that big. The engine is obviously not very optimized for the XOne.

That is specifically because of the architectural differences. The Xbox One's bandwidth situation does not leave room for playing nice with deferred renderers at 1080p; the PS4's does.

The power differential, in combination with the Xbox One's components bottlenecking the console in certain scenarios, would yield this result. It's a one-two punch.
 

twobear

sputum-flecked apoplexy
Infamous was a PS4 exclusive, so the engine was tailored to the strengths of the console, one of the main ones being an abundance of high-bandwidth memory. DAT GDDR5 etc.

If you're developing a multi-platform game, you have to balance between the stronger and weaker hardware. Unless you have the capacity to develop two different engines, you have trade-offs on both formats: the PS4 won't be pushed as hard as it could be, and the XB1 will have downgrades somewhere.

Seems like the opposite in this case, no? Xbone isn't being pushed as hard as it could be and has downgrades to boot.

But still, if SS is using more RAM for its g-buffer than you can feasibly use on Xbone even at 720p then that strongly suggests to me that Xbone is going to be destroyed later in the generation. I'd been of the belief that the gap wasn't going to get too massive but this has changed my opinion.
 
Changing the entire way you render an image is a pretty huge fucking task that completely changes how you optimize your game engine, the way data is stored and sorted, and the like. There is no magic Microsoft API that can fix that. And when you're using deferred rendering you have to stick your render targets in fast memory; on the Xbone, that leaves only the eSRAM.

Pretty huge ****Ing task? Not if the engine is designed properly. All of your drawing code should be independent of the rest of the engine.

This is obviously? already the case, as the engine is already using two different graphics APIs.

No one claimed it to be a magical API. It simply makes using the esram easy/easier. If it's being used properly is another question.

No reason to be rude
 

slapnuts

Junior Member
Oh sry, I forgot a dev is a robot that is not allowed to have his own opinion and favorites ... That surely disqualifies me.


You can not calculate it like that. It always depends on your engine. The raw power on PS4 is better. This is a simple fact. However, and I need to be very careful here with these kind of statements, but in its current state, the Azure providings will make some of you guys here on Gaf speechless in future. I know, I know ... "Dat cloud dat cloud". Well, it is still a long way to go and Gaf does not want to jump on the cloud train yet - and I can totally understand that because of lacking showings.
The "Cloud" comment has me scratching my head though, Azure is simply servers, heck "the cloud" is really nothing new neither other than putting it into more in-game use-age, correct me if i am wrong but my question is....if "the cloud" is something special and as you put may leave some of us here at Gaf speechless with future X1 games, wouldn't this tech be something Sony can also do? Again....i guess it all comes back to the same old question...what MS can do with software, optimizations, cloud,etc are all things Sony will also be doing to push the boundaries of its gaming console, in other words, as that old saying goes, a tooth for a tooth, an eye for an eye ...what ever MS can do to push for more advanced game titles..Sony can also do ....making everything come back full circle and we're back to square one...the only thing that really matters at the end of the day is hardware differences which is what it is and will always be....everything else is sorta moot because both MS and Sony will be pretty on par with optimizations and advancing software techniques in the coming years.

I guess I just don't buy this "Cloud" thing so much, because if it were that groundbreaking for gaming, PCs would have been doing it for PC gaming already. But like I said, correct me if I am wrong, because I honestly could be.
 
This reminds me of that situation at the beginning of the gen where Madden (I believe that was the game) ran at double the framerate on the 360. It had nothing to do with the ps3 lacking power; the engine was more suited to the 360 and the cell made things more complicated. I think the problem was rectified in the following releases.

I downloaded the demo yesterday on the ps4 and enjoyed it a lot, but I really don't see anything so demanding that the x1 has to render at 720p. Probably tweaking the engine a bit would net them better results with the smaller bandwidth, but I don't think it's something that Konami can pull off overnight, so maybe MGSV TPP will suffer from the same problem, being the same engine and all.

There are definite differences between both consoles' hardware, but this is being blown out of proportion, and I say that as a ps fan. This is definitely an engine problem, not a raw power one. I'm not one to pull the lazy devs card often, but it's kinda pathetic to see a game like this run at 720p. Maybe they need to rewrite the engine to take advantage of the x1, but I don't see that happening right away.


except ps3 was arguably more powerful than the 360, evident by the ps3 exclusives. so we got a lot of lazy devs arguments. gran turismo 6 even renders at 1280x1080, which says something when some 3rd party games can't even render at 720p. not to mention, the ps3 architecture was known to be hard to program for.

now we have ps4 which is the more powerful hardware, so that alone tells you it's possible that they can easily achieve better results on ps4 sdk. ps4 is also easier to program for. so that's double the advantage. xbone should be challenging but it shouldn't be harder than the ps3 given that the xbone is very similar to the 360 setup.

if it were so easy to achieve, how come sub-1080p on xbone is the norm? you can't just blame konami or pull the lazy dev argument when half of xbone games are sub-1080p and they're not even 60 fps.
 
People are actually surprised at this shit?

It's probably sub-720p on PS3 and 720p on Xbone, just like GZ... just like every fox engine game coming out probably. This isn't anything new, and it was to be expected.

On another topic, I would really like to see a GZ on ps4 in person. I can't just imagine the difference as I've never been able to compare 720p/1080p on a TV.
 

Ateron

Member
except ps3 was arguably more powerful than the 360, evident by the ps3 exclusives. so we got a lot of lazy devs arguments. gran turismo 6 even renders at 1280x1080, which says something when some 3rd party games can't even render at 720p. not to mention, the ps3 architecture was known to be hard to program for.

now we have ps4 which is the more powerful hardware, so that alone tells you it's possible that they can easily achieve better results on ps4 sdk. ps4 is also easier to program for. so that's double the advantage. xbone should be challenging but it shouldn't be harder than the ps3 given that the xbone is very similar to the 360 setup.

if it were so easy to achieve, how come sub-1080p on xbone is the norm? you can't just blame konami or pull the lazy dev argument when half of xbone games are sub-1080p and they're not even 60 fps.

True, but take a look at Fifa 15. It's 1080p/60 on both and I think it looks a bit better than Pes.

And I'm a PES fan since the ps1 days :)

Fox engine has its quirks and maybe they haven't had the time to address them yet. I think (and we know for a fact) that the ps4 is more powerful. If this discrepancy was only noticeable on MGS5, I would understand, but PES? It's really not a demanding game. So far other sports games have achieved res/fps parity (with the ps4 having the upper hand on consistency and effects), but still.
 

twobear

sputum-flecked apoplexy
People are actually surprised at this shit?

It's probably sub-720p on PS3 and 720p on Xbone, just like GZ... just like every fox engine game coming out probably. This isn't anything new, and it was to be expected.

On another topic, I would really like to see a GZ on ps4 in person. I can't just imagine the difference as I've never been able to compare 720p/1080p on a TV.

It's 720p/60fps on PS3.
 
True, but take a look at Fifa 15. It's 1080p/60 on both and I think it looks a bit better than Pes.

And I'm a Pes fan since the ps1 days :)


yeah but killer instinct, cod ghosts, bf4, dead rising are 720p/60, a lot of games are 900p/30 as well. so what if fifa was 1080p/60? just because they're of the same genre? no one calls out some 30fps fps (lol) just 'cause cod is 60fps.

xbox can't do anything. 32mb is the max they can put in and the box looks like a vcr system already.
 