
Inside PlayStation 4 Pro: How Sony made the first 4K games console

Wait, there's a base PS4 mode? I thought it was just going to automatically upgrade/upscale everything it can.
Games need a patch for that. Over 700 titles have already been released, and not all of them will get a patch. So the Pro adjusts so that an unpatched game essentially runs the same as it would on a normal PS4.
 
As a developer let me just say no.

NO.

A lot of engines tie logic to framerates. What you're asking for is a total dismantling and overhaul of several engines just for the purpose of...being able to take advantage of a console that only existed on paper when the PS4 launched? A game that will have been several years old by the time it releases?

Dynamic scaling is an entirely different thing altogether. Funny that you should applaud MS for doing dynamic scaling and prepping for the Scorpio considering Sony has also done the exact same thing with its own mid-gen titles with Pro patches.

But yeah, to mandate that game logic should not be tied to framerate is as crazy as mandating god-rays on all games, or not allowing chromatic aberration on all titles, etc. It would be absolutely insane to mandate. No no no no no.

Thank you, and it personally saddens me this post will be completely ignored.
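On the dynamic scaling mentioned above: it's usually just a small feedback loop on GPU frame time. A hypothetical sketch (the names, thresholds, and step size are invented for illustration, not taken from any real engine):

```python
TARGET_MS = 16.7                 # frame budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.5, 1.0  # render at 50-100% of output resolution

def adjust_scale(scale, last_frame_ms, step=0.05):
    """Toy dynamic-resolution controller: drop the render scale when the
    GPU misses the frame budget, creep back up when there is headroom."""
    if last_frame_ms > TARGET_MS:
        scale -= step
    elif last_frame_ms < TARGET_MS * 0.85:
        scale += step
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = adjust_scale(1.0, 22.0)  # a 22 ms frame blew the budget
print(scale)                     # 0.95: next frame renders at 95% resolution
```

Because game logic never sees the render scale, resolution can vary per frame without touching the simulation, which is why it's a much easier retrofit than decoupling logic from framerate.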
 

renzolama

Member
As a developer let me just say no.

NO.

A lot of engines tie logic to framerates. What you're asking for is a total dismantling and overhaul of several engines just for the purpose of...being able to take advantage of a console that only existed on paper when the PS4 launched? A game that will have been several years old by the time it releases?

Dynamic scaling is an entirely different thing altogether. Funny that you should applaud MS for doing dynamic scaling and prepping for the Scorpio considering Sony has also done the exact same thing with its own mid-gen titles with Pro patches.

But yeah, to mandate that game logic should not be tied to framerate is as crazy as mandating god-rays on all games, or not allowing chromatic aberration on all titles, etc. It would be absolutely insane to mandate. No no no no no.

As a developer, why in the world are you arguing in favor of bad design? Having framerate disconnected from game logic is absolutely nothing like mandating god-rays or chromatic aberration; one is a visual effect and the other is a core programming tenet. It's like arguing against object-oriented programming or modularity. There is no situation in which having framerate disconnected from game logic is detrimental, but a million situations in which the opposite is.

"for the purpose of...being able to take advantage of a console that only existed on paper when the PS4 launched"

This is called forwards compatibility, and it's something that pretty much everyone in the development world has agreed is a critical consideration for decades now. The fact that the console world is behind because those developers have been able to get away with avoiding modernization is a flaw, not a benefit that should be defended.
 

ethomaz

Banned
Finally read the full article.

The most interesting part, even for future PC development, is the ID tracking Sony is using... game developers can really do some magic with this kind of fast info... I hope AMD/NVIDIA come up with something similar for PC.

And now I understand why the checkerboarding works better than a "conventional" approach... the info (data) available to the checkerboard is about twice (and at fast speed) what most checkerboard techniques have... again thanks to this last-display-buffer ID tracking.

The different implementations are so interesting, with each developer doing something different... and some are really worth a closer look, like the way they are doing it in Spider-Man.

Before I forget... not that PC needs these checkerboard techniques, but the ID tracking could help improve the quality of temporal AA to nearly match better AA solutions without the performance trade-offs.

Pro leading overall GPU development again, like they did with PS4.
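For readers unfamiliar with the technique being discussed: checkerboard rendering shades only half the pixels each frame in an alternating pattern and fills the gaps with data carried from the previous frame, with the ID buffer telling the reconstruction which carried-over pixels are safe to reuse. A deliberately simplified toy model (real reconstructions also reproject and blend, which is omitted here):

```python
import numpy as np

def checkerboard_merge(prev_frame, new_samples, parity):
    """Toy checkerboard reconstruction: keep the freshly shaded half of the
    pixels and fill the other half with values carried from the last frame.
    parity alternates 0/1 each frame so two frames together cover every pixel."""
    ys, xs = np.indices(prev_frame.shape)
    fresh = (ys + xs) % 2 == parity      # this frame's half of the checkerboard
    out = prev_frame.copy()
    out[fresh] = new_samples[fresh]
    return out

prev = np.zeros((4, 4), dtype=int)       # last frame: all 0s
cur = np.full((4, 4), 7, dtype=int)      # newly shaded samples: all 7s
merged = checkerboard_merge(prev, cur, 0)
print(merged)  # 7s on half the cells in a checker pattern, 0s elsewhere
```

Each frame only shades half the pixel count, which is where the performance saving comes from; the ID/object tracking the article describes is what keeps the reused half from smearing on moving objects.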
 

Shin-Ra

Junior Member
It can output 4K, no other current console does.
PS3 outputs native 4K (3840x2160@24Hz) for a limited number of apps. PlayMemories displays native 3840x2160@30Hz for photos on the base PS4.

This part caught my eye:

"Because the PS4 Pro won't offer faster game load times (according to Cerny)"

What do the author and Cerny mean by that?
Read/write data throughput may be higher with the denser stock 1TB drive, but the gains may be dedicated to SHARE recording of video and screenshots at higher resolutions.

The chance of this ever happening is very slim.
Unless you have access to the system software team's feature roadmap, I don't know how you could possibly be so sure.
 

Unknown?

Member
Where does it say they didn't do anything to it?
Where does it say they did? So far they haven't announced any patch for it, so it's just as likely they didn't do anything to it as that they did. If it were a game they'd already announced a patch for, or even an interest in, then I could see it getting an early patch.
 

ethomaz

Banned
As a developer, why in the world are you arguing in favor of bad design? Having framerate disconnected from game logic is absolutely nothing like mandating god-rays or chromatic aberration; one is a visual effect and the other is a core programming tenet. It's like arguing against object-oriented programming or modularity. There is no situation in which having framerate disconnected from game logic is detrimental, but a million situations in which the opposite is.

"for the purpose of...being able to take advantage of a console that only existed on paper when the PS4 launched"

This is called forwards compatibility, and it's something that pretty much everyone in the development world has agreed is a critical consideration for decades now. The fact that the console world is behind because those developers have been able to get away with avoiding modernization is a flaw, not a benefit that should be defended.
What you say makes complete nonsense even on PC.

You can't code something to take advantage of hardware that hasn't launched yet... yes, games (and apps in general) need patches to support new processors, and that happens all the time... every new GPU launch comes with drivers/patches to make things stable on the new hardware. Software receives patches to improve performance or to use new features found in new CPUs.

That is how software development works and how it will always work.

You add/track compatibility for old hardware... not for future hardware you don't even know the shape of, lol.

That said, framerate is not really the issue when running the same code on stronger hardware... in fact the games would do just fine in framerate terms if you released the Pro's full power to all old games... but it creates other issues in game logic, where people can exploit new bugs or glitches that would never happen in a normal logic cycle... that is the biggest danger, because it can break microtransaction or DLC validation, and that costs companies a hell of a lot of money.

Trying to minimize everything that can go wrong in game logic on stronger hardware is absolutely the right call, and it is what you'd call good compatibility with the old hardware.

I've already seen plenty of examples of bugs in key PC apps caused by the CPU getting a big performance boost... avoiding that is a key to success, and it shows the Sony team is doing quality work here.
 

ethomaz

Banned
This part caught my eye:

"Because the PS4 Pro won't offer faster game load times (according to Cerny)"

What do the author and Cerny mean by that?
The HDD is basically the same, and 4K needs even more data read/write... so why would anybody expect better load times on the Pro to begin with?

SSDs on the Pro are another conversation... those will give fast load times... that is where the change begins, on Pro or even PC.
 

Shin-Ra

Junior Member
During certain camera switches in Knack, I noticed AA quality improve a split-second after the viewpoint switch, leading me to believe they implemented something like Resident Evil 5's dynamic MSAA.

For a significant reduction in moire, which covers a broader surface area, they'd need to have deliberately updated the game to take advantage of greater resolutions and/or filtering. Unless the moire is caused by densely packed geometry edges which MSAA tackles.
 

ethomaz

Banned
Hmm... so there are subtle improvements that don't require patching? Doesn't that contradict what they've been telling us?
Nope.

Polaris (GCN 1.4) does have improvements over GCN 1.1, and that is what I was saying in the older thread: you will get slightly better graphics processing results and a 1-3 fps better framerate in some games. It is impossible, even at the same clock, to make Polaris deliver the same results as older GCN 1.1.

Polaris gives you up to a 10% performance improvement over GCN 1.1 and better effects/graphics processing.
 

chippy13

Member
As a developer let me just say no.

NO.

A lot of engines tie logic to framerates. What you're asking for is a total dismantling and overhaul of several engines just for the purpose of...being able to take advantage of a console that only existed on paper when the PS4 launched? A game that will have been several years old by the time it releases?

Dynamic scaling is an entirely different thing altogether. Funny that you should applaud MS for doing dynamic scaling and prepping for the Scorpio considering Sony has also done the exact same thing with its own mid-gen titles with Pro patches.

But yeah, to mandate that game logic should not be tied to framerate is as crazy as mandating god-rays on all games, or not allowing chromatic aberration on all titles, etc. It would be absolutely insane to mandate. No no no no no.

Thank you for bringing in some common sense to this thread. Some of these comments have been ridiculous.
 

Shin-Ra

Junior Member
We know The Last of Us Remastered has a native 3840x2160 @30fps mode, but not what AA solution Naughty Dog's using.

I hope they have checkerboard 2160p @60fps with ID buffer assisted TAA, improved from Uncharted 4 (also helped by double framerate) and image quality comparable to Days Gone.

[Image: Days Gone 4K screenshot from the September event]
 

dr_rus

Member
Nope.

Polaris (GCN 1.4) does have improvements over GCN 1.1, and that is what I was saying in the older thread: you will get slightly better graphics processing results and a 1-3 fps better framerate in some games. It is impossible, even at the same clock, to make Polaris deliver the same results as older GCN 1.1.

Polaris gives you up to a 10% performance improvement over GCN 1.1 and better effects/graphics processing.

This is why I personally find all the talk on disabling half of CUs and downclocking to PS4 speeds really odd. It won't be the same regardless.
 

Humdinger

Member
And another quote about how native 4K doesn't matter all that much when the Pro gets "close enough".

Yup, that's Engadget, Digital Foundry, and CNET all saying pretty much the same thing. Very reassuring. Of course, this will all be ignored in the console wars to follow.

For $399, with all the constraints in check, this is as good as it gets. Cerny knows his stuff.

He sure does. I'm quite impressed (and quite confused).

I wouldn't be surprised if 1080p turns out to be the preferred setting for pristine image quality (even for those of us with 4K TVs).

That would be funnily ironic. I'm holding off buying a 4K TV, because I like my 1080p well enough and see no reason to junk it and spend several thousand on a new TV. So it's good to hear about the 1080p benefits.

At the end of this article they mention Knack looking better without doing anything to it.

Reason enough to buy a Pro, I think. ;)
 

ethomaz

Banned
This is why I personally find all the talk on disabling half of CUs and downclocking to PS4 speeds really odd. It won't be the same regardless.
If I had to guess, they chose the closest they could reach given hardware limitations.

They could have chosen some weird clock like 792MHz to try to match the performance of the PS4's GPU, but remember that not all Polaris blocks got performance improvements, and that could affect how games run on the Pro because of the lower clock of those unimproved blocks.

These are the blocks that were improved in Polaris:

[Image: Radeon Technologies Group 2016 presentation slide]

The Compute Engine, Scheduler, Rasterizer, and Render Backend did not get updated in Polaris... and some of those units running at 792MHz could cause more issues than running them at 800MHz like the PS4's GPU.
 

barit

Member
So no auto-enhanced games on Pro? Does Sony really want me to play something like Lords of the Fallen on Pro with the same screen tearing and frame drops as on the OG PS4? Are they fucking kidding me?
 

chippy13

Member
One of the things I'm still not fully understanding is how the Pro modes work with a 4K TV. I get that the Pro will recognize that I have a 4K TV, but will I be able to choose between the 4K mode and the 1080p-with-increased-effects mode, or will it always switch to 4K?
 

Shane89

Member
We know The Last of Us Remastered has a native 3840x2160 @30fps mode, but not what AA solution Naughty Dog's using.

I hope they have checkerboard 2160p @60fps with ID buffer assisted TAA, improved from Uncharted 4 (also helped by double framerate) and image quality comparable to Days Gone.

[Image: Days Gone 4K screenshot from the September event]

AA on a native 4K image? Are you kidding me? It's almost useless.
 

ethomaz

Banned
One of the things I'm still not fully understanding is how the Pro modes work with a 4K TV. I get that the Pro will recognize that I have a 4K TV, but will I be able to choose between the 4K mode and the 1080p-with-increased-effects mode, or will it always switch to 4K?
Actually, you can choose the output resolution in the PS4 settings... so yes, you can manually choose 1080p output even if your TV is 4K.

(Settings) > [Sound and Screen] > [Video Output Settings] > Resolution

[Image: PS4 video output settings screenshot]

There is a 4K option on the Pro.
 

wotta

Member
A quote from Games Industry.

Posted this in another thread, this one is probably more relevant though.

"Cerny also explains how Pro can effectively produce over 8 teraflops."

Interestingly, when looking at the architectural improvements that PS4 Pro offers, Cerny pointed out that the way the system handles computing enables it to provide more than the base 4.2 teraflops that have been advertised. "It's possible to perform two 16-bit operations at the same time, instead of one 32-bit operation. In other words, with full floats, PS4 Pro has 4.2 teraflops of computational power. With half floats, it now has double that -- which is to say, 8.4 teraflops of computational power. As I'm sure you understand, this has the potential to radically increase the performance of games," he commented.
 

ethomaz

Banned
A quote from Games Industry.

Posted this in another thread, this one is probably more relevant though.

"Cerny also explains how Pro can effectively produce over 8 teraflops."
I was trying to find how fast FP16 is on Polaris but couldn't find anything... it looks like Vega will have 2x the FP32 rate, like NVIDIA's P100, while Polaris could be in line with GP104, which runs FP16 at 1/64 the FP32 rate (fucking slow).

That is why nobody cares about FP16 in PC development.

I mean, how will devs handle it, when using FP16 can break your game on PC, and writing two code paths for PC and Pro is too expensive... maybe only exclusives will try to use FP16.

A better reference is how slow FP16 is on the GTX 1080: http://www.anandtech.com/show/10325/the-nvidia-geforce-gtx-1080-and-1070-founders-edition-review/5
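The "FP16 can break your game" point is easy to demonstrate: half precision keeps only a 10-bit mantissa (about 3 decimal digits), so code written with FP32 assumptions can silently lose values. A small illustration with NumPy (this shows the precision hazard only, not the double-rate throughput):

```python
import numpy as np

# FP16 spaces representable values 4 apart in the range [4096, 8192),
# so an integer like 4097 simply cannot be stored exactly.
print(np.float32(4097.0))  # 4097.0
print(np.float16(4097.0))  # 4096.0: rounded to the nearest representable value

# Accumulating small increments is worse: once the running total is large
# enough, adding 0.001 rounds away and the sum stalls far short of 10.0.
total = np.float16(0.0)
for _ in range(10000):
    total = np.float16(total + np.float16(0.001))
print(total)  # stalls well below the true sum of 10.0
```

This is why half floats are typically reserved for data that tolerates low precision (colors, normals, some lighting terms) while positions and timers stay in FP32.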
 

chippy13

Member
Actually, you can choose the output resolution in the PS4 settings... so yes, you can manually choose 1080p output even if your TV is 4K.

(Settings) > [Sound and Screen] > [Video Output Settings] > Resolution

[Image: PS4 video output settings screenshot]

There is a 4K option on the Pro.

So I assume that if you switch the resolution in settings to 1080p before starting a game, it will then run the higher-effects version of the game. Hopefully they'll have an in-game option so you don't have to keep going into settings to switch things around.
 

ethomaz

Banned
So I assume that if you switch the resolution in settings to 1080p before starting a game, it will then run the higher-effects version of the game. Hopefully they'll have an in-game option so you don't have to keep going into settings to switch things around.
Yes... exactly.

GT5 on PS3 had two render modes too... it is not new... you just need to change this same setting on PS3 between 720p and 1080p to see the differences.

FFXIV on PS4 has two render modes (720p and 1080p), but there the option is implemented in-game, so you can change it at any time with only a blink of the screen during the switch.
 
As a developer let me just say no.

NO.

A lot of engines tie logic to framerates. What you're asking for is a total dismantling and re-haul of several engines just for the purpose of...being able to take advantage of a console that only existed on paper when the PS4 launched? A game that will have been several years old by the time it releases?
I can assure you that's not the case in any modern game or engine.

That's not how engines tie logic to framerate these days.

Yes, it's still quite common for engines to run the update loop at a constant rate, because it makes physics calculations way simpler (mostly to model), but no one does it by locking to processor cycles or any of the old tricks that used to tie a game to a single architecture. Especially now, when even on consoles you can't count on having the same resources available 100% of the time.

Games and engines that tie logic to framerate do so by ensuring that the update loop is called exactly 30 or 60 times (or whatever the target is) per second. If your game never slows down, that's easy: just keep calling the update loop every fixed number of milliseconds and you're good to go. Games that do slow down usually counter the effect by reducing the wait time, calling one, two, three or however many extra update loops they need before the next draw call.

If they didn't, any time you saw a slowdown in any game the logic would go kaboom, because framerate can vary even on the console the game was designed for.

So unless 100% of PS4 games have a completely locked framerate, there's definitely improvement to be had with the Pro: not doubling the framerate by itself, but making sure no slowdown ever happens.

Which again is perfectly possible, and we're already seeing it on the Xbox One running 360 games, and on the One S running Xbox One games, without requiring any patch to accommodate the extra power.
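The catch-up behaviour described above is the classic fixed-timestep loop. A minimal sketch (names are illustrative; real engines also interpolate rendering between ticks):

```python
FIXED_DT = 1.0 / 60.0  # logic always advances in exact 1/60 s ticks

def step(game, frame_time, accumulator):
    """One rendered frame: run however many fixed logic ticks are owed.

    frame_time is how long the previous frame took; the accumulator carries
    leftover time so a slow frame is repaid with extra ticks instead of
    stretching the logic timestep."""
    accumulator += frame_time
    while accumulator >= FIXED_DT:
        game.update(FIXED_DT)      # logic tick: dt never varies
        accumulator -= FIXED_DT
    game.render()
    return accumulator

class DummyGame:                   # stand-in that just counts calls
    def __init__(self):
        self.updates = 0
        self.renders = 0
    def update(self, dt):
        self.updates += 1
    def render(self):
        self.renders += 1

game = DummyGame()
acc = step(game, 0.045, 0.0)       # one slow 45 ms frame...
print(game.updates, game.renders)  # 2 1: repaid with two logic ticks
```

A faster console just means frame_time shrinks and the inner loop rarely runs more than once per frame; the logic rate itself never changes, which is the point about why extra power doesn't break this style of engine.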

Remember when onQ was talking crazy about FP16?


Pepperidge Farm Remembers

Them name-dropping FP16 does not equal what you were saying. You said they could almost effectively double their processing power by using FP16, and no one is making such a claim. They're not even passing it off as a big deal; they're only saying that if your game already uses FP16 for whatever stuff, it will get increased throughput.

Your original claim that FP16 would make the Pro GPU punch above its weight is still largely unverified. There isn't a single title that makes you say "the Pro shouldn't be able to render this in native 4K, yet it does", while on the other hand there are games having trouble even doubling their resolution compared to the PS4 (as the article says, of the 9 games doing checkerboard rendering, only 4 managed to double the resolution). So in a sense, the paltry bandwidth increase is making the Pro GPU underperform, if anything.

I really liked hearing about the extra buffer for tracking polygons and objects; it seems to keep IQ very high at a low cost, which in turn reduces the impact of the upscaling.
 

kaching

"GAF's biggest wanker"
As a developer, why in the world are you arguing in favor of bad design? Having framerate disconnected from game logic is absolutely nothing like mandating god-rays or chromatic aberration; one is a visual effect and the other is a core programming tenet. It's like arguing against object-oriented programming or modularity. There is no situation in which having framerate disconnected from game logic is detrimental, but a million situations in which the opposite is.

"for the purpose of...being able to take advantage of a console that only existed on paper when the PS4 launched"

This is called forwards compatibility, and it's something that pretty much everyone in the development world has agreed is a critical consideration for decades now. The fact that the console world is behind because those developers have been able to get away with avoiding modernization is a flaw, not a benefit that should be defended.
It seems overly glib to simply call this bad design if pretty much everyone in the development world supposedly agrees it's bad design, yet a significant portion do it anyway. In other words, there are clearly advantages to engaging in the practice. The console world is not some separate, insular segment of developers who simply didn't get the memo; there's abundant overlap between console and PC development these days.
 

chippy13

Member
Yes... exactly.

GT5 on PS3 had two render modes too... it is not new... you just need to change this same settling on PS3 between 720p and 1080p to see the differences.

FFXIV on PS4 has two render modes (720p and 1080p) but instead the option is implemented ingame so you can change any time with only a blink in the screen during the change.

Awesome, thanks for the info.
 

vpance

Member
This is why I personally find all the talk on disabling half of CUs and downclocking to PS4 speeds really odd. It won't be the same regardless.

Frequency is the main problem, I think. That needs to be close to the OG's to ensure perfect compatibility. Not FPS.
 
I thought Cerny would be doing video interviews... hopefully he does more talking on the PS Blog or something. This is all pretty fascinating.
 

ethomaz

Banned
It seems overly glib to simply call this bad design if pretty much everyone in the development world supposedly agrees it's bad design, yet a significant portion do it anyway. In other words, there are clearly advantages to engaging in the practice. The console world is not some separate, insular segment of developers who simply didn't get the memo; there's abundant overlap between console and PC development these days.
Everything comes down to time and cost.

If you are working on a fixed platform, you will spend more money and need more time to deliver a game if you try to "predict" future upgraded hardware (something new to the industry).

Nobody does that... you get better results with simple, fast development focused on delivering the best quality for that single piece of hardware. If a new machine launches and there are benefits ($$$ return) to selling the game there, they will do a port to that new machine... that is basically what the Pro patch is.

While developing a game you can ask: will the device I'm developing for support feature X? No? Then don't waste time coding it (some programmers code unused new features in their free time as a hobby, but not for commercial purposes).
 

dr_rus

Member
Frequency is the main problem I think. That needs to be close to OG's to ensure perfect compatibility. Not FPS.

That would mean that they don't have any other constant tick in the system but the GPU's frequency, which is even more odd, as I struggle to think of any engine that would use the GPU's frequency as a clock reference for other parts of the engine. The CPU I can see as such a reference (still a bad idea in general), but not the GPU.
 

chubigans

y'all should be ashamed
As a developer, why in the world are you arguing in favor of bad design? Having framerate disconnected from game logic is absolutely nothing like mandating god-rays or chromatic aberration; one is a visual effect and the other is a core programming tenet. It's like arguing against object-oriented programming or modularity. There is no situation in which having framerate disconnected from game logic is detrimental, but a million situations in which the opposite is.

It is when your engine is completely built that way and would take months or even a year to redesign, just for the purpose of *maybe* being compatible with a future console that doesn't exist in any form at the time of production.

Hyper Light Drifter just got a 30fps to 60fps patch, and it took months to reprogram the game. What you're asking for is one hundred percent unreasonable.

"for the purpose of...being able to take advantage of a console that only existed on paper when the PS4 launched"

This is called forwards compatibility, and it's something that pretty much everyone in the development world has agreed is a critical consideration for decades now. The fact that the console world is behind because those developers have been able to get away with avoiding modernization is a flaw, not a benefit that should be defended.

Decades? Hell naw, that's not true at all for game development. What developer making a PS2 game back in the day was thinking, "hey, we should create some extra assets just in case this game gets an HD port in the future, at a resolution we have no way of knowing will be standard, with no idea how the game will perform on completely exotic architecture"?

No one thinks this way. Only recently do we have a general idea where games are heading, tech-wise, and that's only for the immediate 4K future. Will 120fps monitors skyrocket in sales before then? Will 8K be a thing in ten years? Or will it all be VR/AR based? Who the hell knows, but I'm not wasting even a day trying to future-proof for something no one will benefit from for years, if they ever do. Again, what you're asking for is madness, and no one is doing it except when working on consoles that are in production or due to be released very soon.

I can assure you that's not the case in any modern game or engine.

That's not how engines tie logic to framerate these days.

Ah, logic was the wrong word; I meant physics and such (the Bethesda engine that powers Fallout/Elder Scrolls, for instance).
 

ethomaz

Banned
That would mean that they don't have any other constant tick in the system but the GPU's frequency, which is even more odd, as I struggle to think of any engine that would use the GPU's frequency as a clock reference for other parts of the engine. The CPU I can see as such a reference (still a bad idea in general), but not the GPU.
It is not a part of the engine but a part of the GPU.

The ROPs have their own clock.
The SPs have their own clock.

This is only an example... the GTX 280 runs its ROPs at 602MHz and its SPs at 1296MHz... I don't know the clocks for the GTX 1080 because NVIDIA only shares the SP clock, and in AMD each part of the GPU has its own clock (most of the time they use the same base clock).

And as I showed, not all parts of Polaris got upgraded, and those parts could be slower than the PS4's GPU if run at lower clocks.

That is all guesswork too.
 

Marmelade

Member
I can assure you that's not the case in any modern game or engine.

That's not how engines tie logic to framerate these days.

Yes, it's still quite common for engines to run the update loop at a constant rate, because it makes physics calculations way simpler (mostly to model), but no one does it by locking to processor cycles or any of the old tricks that used to tie a game to a single architecture. Especially now, when even on consoles you can't count on having the same resources available 100% of the time.

Games and engines that tie logic to framerate do so by ensuring that the update loop is called exactly 30 or 60 times (or whatever the target is) per second. If your game never slows down, that's easy: just keep calling the update loop every fixed number of milliseconds and you're good to go. Games that do slow down usually counter the effect by reducing the wait time, calling one, two, three or however many extra update loops they need before the next draw call.

If they didn't, any time you saw a slowdown in any game the logic would go kaboom, because framerate can vary even on the console the game was designed for.

So unless 100% of PS4 games have a completely locked framerate, there's definitely improvement to be had with the Pro: not doubling the framerate by itself, but making sure no slowdown ever happens.

Which again is perfectly possible, and we're already seeing it on the Xbox One running 360 games, and on the One S running Xbox One games, without requiring any patch to accommodate the extra power.

Yeah, that's my line of reasoning too, and why, not being as tech-literate as some of the guys here, I'm trying to understand what could go wrong with a frequency bump.

I understand the whole "logic tied to framerate" thing, but nobody's expecting a capped 30 fps game to run at 60.
It's more along the lines of a PS4 game that sometimes drops to 25 because of a CPU or GPU bottleneck running at, or closer to, 30 on the Pro.
 

jobboy

Member
I'm a bit scared about some games having a worse framerate on Pro (CoD: IW, Mantis Burn Racing, Killing Floor 2...). I mean, pushing the resolution race too hard looks like it can hurt framerate, and nobody seems to care about it.
 

Marvel

could never
I'm a bit scared about some games having a worse framerate on Pro (CoD: IW, Mantis Burn Racing, Killing Floor 2...). I mean, pushing the resolution race too hard looks like it can hurt framerate, and nobody seems to care about it.
I'll take higher res and gfx settings over fps all day, every day.
 