
PS4K information (~2x GPU power w/ clock+, new CPU, price, tent. Q1 2017)

Anyone else worried that devs will make the PS4K their first priority over the PS4, and PS4 versions will end up getting less attention and having technical issues?

A lot of people, me included.

Let me take another crack at why this isn't much of a concern for me. Granted we are operating on a limited information set, and I will try and put things in as simple terms as I can, and will probably fail.

A 30 FPS game has literally twice the amount of time to render a frame as a 60 FPS game. It doesn't matter how powerful the machine you are using is: you can do more in 33.3 ms than you can in 16.6 ms. Meaning to get a game targeted at 30 FPS on one machine running at 60 FPS on another machine, you need double the performance, ignoring optimization and talking raw numbers.
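To put rough numbers on that (throwaway Python, nothing PS4-specific, just the reciprocals):

```python
# Frame-budget arithmetic: halving the frame time doubles the required
# throughput, no matter how fast the machine is.

def frame_budget_ms(fps):
    """Milliseconds available to render one frame at a given FPS target."""
    return 1000.0 / fps

budget_30 = frame_budget_ms(30)   # ~33.3 ms
budget_60 = frame_budget_ms(60)   # ~16.7 ms

# Moving a 30 FPS target to 60 FPS means doing the same work in half the
# time, i.e. ~2x the raw performance (ignoring optimization).
print(f"30 FPS budget: {budget_30:.1f} ms")
print(f"60 FPS budget: {budget_60:.1f} ms")
print(f"speedup needed: {budget_30 / budget_60:.1f}x")   # 2.0x
```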

Doubling your resource budget (again, we don't know specifics or CPU yet, but talking roughly here) means that any game that currently runs at 30 FPS will be capable of running at 60 FPS with little to no work required. Almost as if this was done intentionally... like some sort of systems architect thought these things through. Granted, this doesn't solve other problems like game systems being tied to framerate, but we are talking generally here.

It is true that a developer could be foolish enough to say FUCK IT we are using our entire resource budget, we're targeting 30 FPS on PS4K. Screw the PS4! We miiight get 20 FPS out of it if we really try, but who gives a shit WE HAVE MORE POWER MUWAHAHA. And also true this will become a thing at some point. I would estimate 3+ years after the PS4K launches when the install base is enough that it would make sense to do so. Around 6+ years after the PS4 launched. Just my own speculation of when these time tables will come into play, of course. PS4K could sell 100 gorillion units in the first year and everything I've said goes out the window... buuut probably not.

Any developer targeting 30 FPS on PS4K at launch or shortly after would be insane. It has an install base of 0, it is just a poor business decision. First party specifically trying to push the PS4K with exclusive content could be a thing, sure. I would suspect that would be very limited. Naughty Dog, Santa Monica, Guerrilla, etc. as a whole are not going to ignore 35 million PS4s out there, they just aren't.

99% of your video games are just going to run better on PS4K. 99% will run exactly as they did on the PS4 if the PS4K never existed for the standard lifespan of a console. You are concerning yourself with a fraction of a possibility.

edit: Quick addition. It is also true that developers could choose to throw the extra power at all sorts of things, frame rate is the easiest for people to wrap their head around, and you can bet your ass it is going to be incredibly common to throw it at the number one complaint when it comes to console power... frame rate. My 99% figure is obviously hyperbole, interchange it with incredibly common.
 

clintar

Member
You really don't seem to understand what you're talking about. This is not the first time it seems like you read a buzzword and latched on to it without fully understanding what you're reading. I ask again: tell me the difference between native rendering, up-rendering and upscaling. All you did was say it's up-rendered without explaining what the difference is between these three techniques. If you can't explain the difference on a basic technical level, how can you even be sure you understand what they mean? At least a few of us here have asked you to define the three things and how they differ. For starters, games in themselves aren't a native resolution. They are set to render at a native resolution. Just like PC games aren't defined by some set resolution.
Well, he probably figures it should be obvious. The only difference between native rendering and up-rendering is that up-rendering takes something trying to render at a certain resolution and translates it to a higher screen-space resolution at some point in the rendering process, while the program still thinks it is rendering at the original resolution. Technically, the process could be seen as natively rendering at a higher resolution. I could see that being a default option for enhancing games that weren't made to target the PS4K: up-render to some higher resolution, then upscale the final result to 4K, much like the way PS2 games run on PS4. Anyway, I'm guessing there will be some combination of techniques for PS4K games to get to 4K. I think it was an NX Gamer video that talks about temporal reprojection and that kind of thing helping, along with upscaling.
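Something like this, as a rough sketch (made-up names and numbers, not any real emulator's code):

```python
# Rough sketch of the up-rendering idea: the emulated game keeps issuing
# draws against its original resolution, and the emulation layer scales
# them into a larger framebuffer. Names/numbers are illustrative only.

GAME_RES = (512, 448)   # what the game code believes it is rendering at
SCALE = 4               # internal-resolution multiplier picked by the user
TARGET_RES = (GAME_RES[0] * SCALE, GAME_RES[1] * SCALE)   # (2048, 1792)

def intercept_vertex(x, y):
    """Translate a game-space position into the larger screen space."""
    return (x * SCALE, y * SCALE)

# The game "thinks" it put a vertex at (100, 50) in 512x448 space...
print(intercept_vertex(100, 50))   # ...it lands at (400, 200) in 2048x1792
print(TARGET_RES)
```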
 
Let me take another crack at why this isn't much of a concern for me. Granted we are operating on a limited information set, and I will try and put things in as simple terms as I can, and will probably fail.

A 30 FPS game has literally twice the amount of time to render a frame as a 60 FPS game. It doesn't matter how powerful the machine you are using is: you can do more in 33.3 ms than you can in 16.6 ms. Meaning to get a game targeted at 30 FPS on one machine running at 60 FPS on another machine, you need double the performance, ignoring optimization and talking raw numbers.

Doubling your resource budget (again, we don't know specifics or CPU yet, but talking roughly here) means that any game that currently runs at 30 FPS will be capable of running at 60 FPS with little to no work required. Almost as if this was done intentionally... like some sort of systems architect thought these things through. Granted, this doesn't solve other problems like game systems being tied to framerate, but we are talking generally here.

It is true that a developer could be foolish enough to say FUCK IT we are using our entire resource budget, we're targeting 30 FPS on PS4K. Screw the PS4! We miiight get 20 FPS out of it if we really try, but who gives a shit WE HAVE MORE POWER MUWAHAHA. And also true this will become a thing at some point. I would estimate 3+ years after the PS4K launches when the install base is enough that it would make sense to do so. Around 6+ years after the PS4 launched. Just my own speculation of when these time tables will come into play, of course. PS4K could sell 100 gorillion units in the first year and everything I've said goes out the window... buuut probably not.

Any developer targeting 30 FPS on PS4K at launch or shortly after would be insane. It has an install base of 0, it is just a poor business decision. First party specifically trying to push the PS4K with exclusive content could be a thing, sure. I would suspect that would be very limited. Naughty Dog, Santa Monica, Guerrilla, etc. as a whole are not going to ignore 35 million PS4s out there, they just aren't.

99% of your video games are just going to run better on PS4K. 99% will run exactly as they did on the PS4 if the PS4K never existed for the standard lifespan of a console. You are concerning yourself with a fraction of a possibility.
Given you are correct, and the PS4 can support 900p @ 60Hz which is then frame-doubled to 120 FPS, can the PS4 generate the second view from depth information using accelerators rather than the GPU?

Sony will have a newer version of the PS4 VR goggles with 1080P and 4G wireless that will need a more powerful PS4. Sony already purchased the company that has the patents and radio designs, so it's less than two years off, and this, I think, is the PS4K target.
 
Doubling your resource budget ...
It is true that a developer could be foolish enough to say FUCK IT we are using our entire resource budget, we're targeting 30 FPS on PS4K. Screw the PS4! We miiight get 20 FPS out of it if we really try, but who gives a shit WE HAVE MORE POWER MUWAHAHA. And also true this will become a thing at some point. I would estimate 3+ years after the PS4K launches when the install base is enough that it would make sense to do so. Around 6+ years after the PS4 launched.

Pretty much my way of thinking. If this is the case then I would imagine this to be about the time of the PS5's launch (3-4 years away). If the PS5 is based on the exact same architecture again - same x86/Radeon APU (albeit 2x faster than the PS4.5), same APIs (albeit updated) - then I'd imagine a similar scenario. Games will look better/run faster on PS5 but still play fine on PS4.5 until around 3 years after PS5's launch (6+ years for the PS4.5), funnily enough when hopefully a PS5.5 will provide the next expected speed boost. And so on.


99% of your video games are just going to run better on PS4K. 99% will run exactly as they did on the PS4 if the PS4K never existed for the standard lifespan of a console. You are concerning yourself with a fraction of a possibility.

Pretty much what it will hopefully boil down to.
 

onQ123

Member
You really don't seem to understand what you're talking about. This is not the first time it seems like you read a buzzword and latched on to it without fully understanding what you're reading. I ask again: tell me the difference between native rendering, up-rendering and upscaling. All you did was say it's up-rendered without explaining what the difference is between these three techniques. If you can't explain the difference on a basic technical level, how can you even be sure you understand what they mean? At least a few of us here have asked you to define the three things and how they differ. For starters, games in themselves aren't a native resolution. They are set to render at a native resolution. Just like PC games aren't defined by some set resolution.

You seem to be the person who doesn't understand, because I just explained it simply to you.


You know that console games have set rendering resolutions, & that if an emulator or more powerful console renders a game at a higher resolution than its set resolution, it's being up-rendered.

You also should know that if a console game has a set rendering resolution, & an emulator or more powerful console renders it at the same resolution but scales it to fit the higher-resolution screen, you upscaled it.


Lastly, you should also know that if you are playing a console game & it has a lower resolution than the output of the emulator or more powerful console, it will either be a smaller box in the middle of the screen or the screen's resolution will be lowered to fit the game. That is the native resolution.
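In pixel math terms, if that helps (example numbers, a 1080p game on a 4K screen - a rough sketch, not anyone's actual pipeline):

```python
# The three cases above as pixel math. What differs is how many unique
# pixels actually get shaded from the game's geometry.

def pixels(w, h):
    return w * h

base = (1920, 1080)      # the game's set rendering resolution
display = (3840, 2160)   # the output screen

native = pixels(*base)        # rendered & shown at the set resolution
upscaled = pixels(*base)      # rendered at the set res, stretched to fit:
                              # no new pixels shaded, only interpolated
uprendered = pixels(*display) # render target raised to the screen res:
                              # every output pixel shaded from geometry

print(native, upscaled, uprendered)   # 2073600 2073600 8294400
```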


You say that you're a game developer so why are you playing dumb?
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Why not? CPU guys should be anyone's by others with "x86 or bust" just as much as engine guys should be by "UE4 or bust" arguments ;).
Oh, I'm all for people's well-thought-out arguments, even if that's 'x86 or bust'. Unfortunately, we both know the prevailing lot of those statements come from sheer lack of cognitive effort.

You really do monitor every thread so that you can be there to defend Wii U's honor at every chance. Crazy you're still at it after all these years.
Dear sir, you largely overestimate my ability to monitor gaf for blatant lowbrow-isms. I was merely passing by.

It really is hilarious. You have developers literally saying how weak the CPU is compared to 360/PS3, then blu comes out with some test he did at home, or some theory or speculation of why it's as powerful or even more powerful. When developers under NDA comment on how weak the CPU is compared to last gen, then you know there is a big problem.
Developers under NDA? LOL. Let me introduce you to a concept in science known as 'experiment reproducibility'. All I've done has been done in public, from source code, to assembly output.

All it takes is a linux-enabled/jailbroken ps3 and a jailbroken wii/wiiU to see how the CPUs in those stack up against each other in whatever workload. Common folk have been measuring the performance of all gen 7 CPUs in various workloads (e.g. http://7-cpu.com/ - note how for that test the PE has the effective performance of a dual-core Cortex A7 @ 1GHz). Then again, it's much more convenient to regurgitate anonymous (dis)information, particularly when one has zero understanding of the subject, and the only way they could ride their soapbox is by references to anonymous authority, whether that has an ounce of credibility or not. How ninjablade of you.
 
I'm going to take the NX leaks with a grain of salt. I recall people were saying how much more powerful the WiiU was over the PS360 and while it had more ram and a slightly better gpu, its cpu was laughable compared to the Cell and Xenon.

Which explained why some of the PS360 ports to WiiU had some odd issues.

Well, the WiiU had its own community here whose only purpose in life was to talk the WiiU over the 1TF line. Same thing with the NX.
 

Hexa

Member
The only difference between native rendering and up-rendering is that up-rendering takes something trying to render at a certain resolution and translates it to a higher screen-space resolution at some point in the rendering process, while the program still thinks it is rendering at the original resolution.

I don't know what you mean here, because what you're describing sounds like upscaling, but in emulation, when you increase the internal resolution the program doesn't "think" it's rendering at the lower resolution; it thinks it's rendering at the resolution it's rendering at. Hence it is native rendering at that resolution.
 

Will0827

Member
So now that we've had a couple weeks to ponder these potentials, has there been a shift in the general consensus or are we still going to burn Sony? I hope a white model is released so I can stay with the same color, although after having the white PS4, looking at the black one, it is looking better.
 
So now that we've had a couple weeks to ponder these potentials, has there been a shift in the general consensus or are we still going to burn Sony? I hope a white model is released so I can stay with the same color, although after having the white PS4, looking at the black one, it is looking better.

I'm ready for the siege this weekend at HQ.
 

Hexa

Member
You seem to be the person who doesn't understand, because I just explained it simply to you.


You know that console games have set rendering resolutions, & that if an emulator or more powerful console renders a game at a higher resolution than its set resolution, it's being up-rendered.

You also should know that if a console game has a set rendering resolution, & an emulator or more powerful console renders it at the same resolution but scales it to fit the higher-resolution screen, you upscaled it.


Lastly, you should also know that if you are playing a console game & it has a lower resolution than the output of the emulator or more powerful console, it will either be a smaller box in the middle of the screen or the screen's resolution will be lowered to fit the game. That is the native resolution.


You say that you're a game developer so why are you playing dumb?

The thing though is that for the vast majority of games the native res is just what you're rendering at. It doesn't have any other technical meaning. If you want to use that term for what it runs at on a console and describe up-rendering as running above that, that's fine. But at a technical level it's exactly the same as increasing what "native rendering resolution" is generally used to describe.
 
Given you are correct, and the PS4 can support 900p @ 60Hz which is then frame-doubled to 120 FPS, can the PS4 generate the second view from depth information using accelerators rather than the GPU?

I'll be honest, I haven't followed PSVR tech closely - for example, what exactly the breakout box brings to the equation. I will say one thing that is often overlooked when it comes to the jump from 30 -> 60 -> 90 FPS, with 90 being what I would suspect gets targeted more often on consoles over 120, but who knows, we'll see. The jump from 30 FPS to 60 FPS is not the same as the jump from 60 to 90. I know that sounds incredibly unintuitive for those who haven't run the numbers, so let me lay it out.

30 FPS = 33.3ms per frame
60 FPS = 16.6ms per frame
90 FPS = 11.1ms per frame

Going from 30 to 60 means you need to save roughly 16.7 ms per frame. Either through intense optimization, throwing more power at the problem, or a combination of both.

Going from 60 to 90 means you have to save roughly 5.5 ms per frame. Not an easy feat when you're likely already pushing as hard as you can, but it isn't as big a leap as people intuitively think, because it feels like 30 frames are 30 frames regardless - and that isn't true.
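And as a tiny check if you want to run the numbers yourself (plain Python, just the reciprocals):

```python
# Equal FPS jumps are not equal time savings, because frame time is the
# reciprocal of frame rate.

def ms_saved(fps_from, fps_to):
    """Frame time you must shave off to move between two FPS targets."""
    return 1000.0 / fps_from - 1000.0 / fps_to

print(f"30 -> 60 FPS: save {ms_saved(30, 60):.1f} ms/frame")   # ~16.7 ms
print(f"60 -> 90 FPS: save {ms_saved(60, 90):.1f} ms/frame")   # ~5.6 ms
```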
 

clintar

Member
I don't know what you mean here, because what you're describing sounds like upscaling, but in emulation, when you increase the internal resolution the program doesn't "think" it's rendering at the lower resolution; it thinks it's rendering at the resolution it's rendering at. Hence it is native rendering at that resolution.
It's not taking the final result and upscaling. It's interrupting the rendering process and forcing a change in where in screen space the 3d element would normally have rendered.

Edit: I shouldn't say interrupt. I mean intercept.
 

warheat

Member
You know that console games have set rendering resolutions, & that if an emulator or more powerful console renders a game at a higher resolution than its set resolution, it's being up-rendered.

So going with this logic, and with you saying "up-rendered it to 4K" in your previous post.

Let's say the OG PS4 will render certain games at native 1080p and the PS4K up-renders them to 4K.

Are you saying that "up-rendered it to 4K" means that not only will the PS4K output a 4K signal, but it will also render at an internal resolution of 4K?
 

Hexa

Member
It's not taking the final result and upscaling. It's interrupting the rendering process and forcing a change in where in screen space the 3d element would normally have rendered.

Edit: I shouldn't say interrupt. I mean intercept.

That's not how increasing resolution in emulation works (at least on pcsx2 and dolphin but I would assume in general). The interrupt or intercept or whatever is done before rendering begins. So ultimately it's equivalent to just rendering at a higher native resolution.
 
This thread has turned into an internet fight over what constitutes upscaling vs uprendering, which means the shock is finally wearing off and people are moving on.
 
I don't know what you mean here, because what you're describing sounds like upscaling, but in emulation, when you increase the internal resolution the program doesn't "think" it's rendering at the lower resolution; it thinks it's rendering at the resolution it's rendering at. Hence it is native rendering at that resolution.
Some old games have the UI aligned to specific pixels of the screen, so if you just increase the resolution the UI ends up in a corner or at a smaller size (this is still very common in PC games). To avoid that, the way emulators increase the resolution is transparent to the game code. Still, rumours say Sony won't try to do something like that for old games.
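A toy example of the breakage (made-up coordinates, not from any specific game):

```python
# Toy example: a HUD anchored with absolute pixel coordinates chosen for
# the original resolution drifts when the resolution is naively raised.

ORIGINAL = (640, 480)
RAISED = (1280, 960)

HUD_ANCHOR = (540, 440)   # hardcoded: 100 px from right, 40 px from bottom

def relative_pos(anchor, screen):
    """Where the anchor sits as a fraction of the screen."""
    return (anchor[0] / screen[0], anchor[1] / screen[1])

print(relative_pos(HUD_ANCHOR, ORIGINAL))  # ~(0.84, 0.92) -> bottom-right
print(relative_pos(HUD_ANCHOR, RAISED))    # ~(0.42, 0.46) -> mid-screen!
# Hence emulators hide the resolution change from the game code entirely.
```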
 

Hexa

Member
Some old games have the UI aligned to specific pixels of the screen, so if you just increase the resolution the UI ends up in a corner or at a smaller size (this is still very common in PC games). To avoid that, the way emulators increase the resolution is transparent to the game code. Still, rumours say Sony won't try to do something like that for old games.

Ok. I think I get what you're getting at. I was only thinking about the 3D rendering. My bad.
 
I'll be honest, I haven't followed PSVR tech closely - for example, what exactly the breakout box brings to the equation. I will say one thing that is often overlooked when it comes to the jump from 30 -> 60 -> 90 FPS, with 90 being what I would suspect gets targeted more often on consoles over 120, but who knows, we'll see.

Naive question from a non-developer who also hasn't been following VR - but wouldn't you be rendering two displays at 60 FPS in parallel, rather than one display at 120 FPS in series, thus still having your 16.6 ms, not 8.3 ms?

On the FPS front, brute force accepted, unless there is a good reason why high FPS is needed (I'm thinking motion sickness from first-person or driving games), I tend to prefer 30 FPS and wish devs would lock their games there to stop sub-30 stutters. The more time the CPU has for AI, NPCs, physics, and any number of other world-building goodness, the better for me.

I'd be happy with 900p30 on PS4 and 1080p30 on PS4.5 with more AI, NPCs, physics, etc. - but I accept that would mean a lot more work for devs to build those extras in, rather than using the extra oomph to simply double the frame rate.
 
Naive question from a non-developer who also hasn't been following VR - but wouldn't you be rendering two displays at 60 FPS in parallel, rather than one display at 120 FPS in series, thus still having your 16.6 ms, not 8.3 ms?

On the FPS front, brute force accepted, unless there is a good reason why high FPS is needed (I'm thinking motion sickness from first-person or driving games), I tend to prefer 30 FPS and wish devs would lock their games there to stop sub-30 stutters. The more time the CPU has for AI, NPCs, physics, and any number of other world-building goodness, the better for me.

I'd be happy with 900p30 on PS4 and 1080p30 on PS4.5 with more AI, NPCs, physics, etc. - but I accept that would mean a lot more work for devs to build those extras in, rather than using the extra oomph to simply double the frame rate.

Good point; I am not sure if it is an approximation combining framerate per eye. If anyone has any technically oriented write-ups they could link to, I'd love to catch up.

But to follow that idea through, I would suspect 45 FPS per eye would be the target, assuming there isn't an issue with the refresh rate on the displays or similar problems. I could be way off base, of course - I need to wrap my head around the underlying tech better.
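For what it's worth, the rough budget math under each interpretation looks like this (pure speculation on my part, throwaway Python, not a description of how PSVR actually schedules work):

```python
# Rough per-frame budgets under each interpretation discussed above.

def budget_ms(rate_hz):
    return 1000.0 / rate_hz

print(f"one 120 Hz stream:        {budget_ms(120):.1f} ms/frame")  # ~8.3
print(f"both eyes per 60 Hz tick: {budget_ms(60):.1f} ms/frame")   # ~16.7
print(f"45 FPS per eye, serial:   {budget_ms(90):.1f} ms/view")    # ~11.1
```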
 
You seem to be the person who doesn't understand, because I just explained it simply to you.


You know that console games have set rendering resolutions, & that if an emulator or more powerful console renders a game at a higher resolution than its set resolution, it's being up-rendered.

You also should know that if a console game has a set rendering resolution, & an emulator or more powerful console renders it at the same resolution but scales it to fit the higher-resolution screen, you upscaled it.


Lastly, you should also know that if you are playing a console game & it has a lower resolution than the output of the emulator or more powerful console, it will either be a smaller box in the middle of the screen or the screen's resolution will be lowered to fit the game. That is the native resolution.


You say that you're a game developer so why are you playing dumb?

Yes, consoles can set the rendering to many resolutions, but whatever it draws the geometry at will be the native rendering. Simply intercepting the render call and saying render at 4K instead of 2K is still rendering at a native resolution of 4K. The image you posted showed pixel sampling to interpolate what pixels should be drawn, which is not natively rendering it. Native rendering wouldn't need to sample pixels to render at that resolution. You would only be sampling pixels for, say, AA or some effects, not for simply changing the resolution. Once you start sampling pixels to interpolate, that has shifted from native rendering to upscaling. For example, many games on the Xbox One render natively at 900p and then upscale to 1080p for their final output.

Here's another way to ask it: is rendering natively under your definition and up-rendering under your definition the same, or is one more expensive than the other?

I don't know what you mean here, because what you're describing sounds like upscaling, but in emulation, when you increase the internal resolution the program doesn't "think" it's rendering at the lower resolution; it thinks it's rendering at the resolution it's rendering at. Hence it is native rendering at that resolution.

That's not how increasing resolution in emulation works (at least on pcsx2 and dolphin but I would assume in general). The interrupt or intercept or whatever is done before rendering begins. So ultimately it's equivalent to just rendering at a higher native resolution.

Both of these are what I'm getting at.

Some old games have the UI aligned to specific pixels of the screen, so if you just increase the resolution the UI ends up in a corner or at a smaller size (this is still very common in PC games). To avoid that, the way emulators increase the resolution is transparent to the game code. Still, rumours say Sony won't try to do something like that for old games.

Well, there are different types of draw calls. What you're describing seems to be 2D rendering and not 3D rendering. You can handle the two different types differently before compositing them together. One would be scaling, the other one would be rendering it natively.
 

geordiemp

Member
Naive question from a non-developer who also hasn't been following VR - but wouldn't you be rendering two displays at 60 FPS in parallel, rather than one display at 120 FPS in series, thus still having your 16.6 ms, not 8.3 ms?

On the FPS front, brute force accepted, unless there is a good reason why high FPS is needed (I'm thinking motion sickness from first-person or driving games), I tend to prefer 30 FPS and wish devs would lock their games there to stop sub-30 stutters. The more time the CPU has for AI, NPCs, physics, and any number of other world-building goodness, the better for me.

I'd be happy with 900p30 on PS4 and 1080p30 on PS4.5 with more AI, NPCs, physics, etc. - but I accept that would mean a lot more work for devs to build those extras in, rather than using the extra oomph to simply double the frame rate.

Just from what I've read, but the PS4 must render at 60 FPS for PSVR; they have tech that doubles the frame rate to a 120 FPS headset effect. There is one 1080p display that's split.

Anyway, a solid 60 FPS means simpler PSVR games for PS4; No Man's Sky, if it goes VR, has not a chance of a stable 60 on PS4.

PS4.5 could enable games at a steady 60 FPS that are not possible on PS4; it would need high bandwidth and a really fast CPU. I can imagine many big games will struggle with a VR option on PS4 as they are often 30 FPS.

The breakout box takes the warped image and turns it back into a normal 60 FPS image for the TV.
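My loose mental model of the frame-doubling trick, as a toy sketch (definitely not Sony's actual implementation, just the shape of the idea):

```python
# Toy model of reprojection-style frame doubling: render at 60 Hz, and
# between real frames present a synthetic one warped by the newest head
# pose. Purely illustrative names and numbers.

from dataclasses import dataclass

@dataclass
class Frame:
    image_id: int   # stands in for the rendered image
    yaw: float      # head yaw the image is presented under

def reproject(frame, new_yaw):
    # A real system shifts the image by the pose delta; here we only
    # record that the old image is re-presented under a newer pose.
    return Frame(frame.image_id, new_yaw)

rendered = [Frame(i, yaw=float(i)) for i in range(3)]   # 60 Hz renders
presented = []
for f in rendered:
    presented.append(f)                          # the real frame
    presented.append(reproject(f, f.yaw + 0.5))  # the in-between frame
print(len(rendered), "->", len(presented))       # 3 -> 6: rate doubled
```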
 

clintar

Member
That's not how increasing resolution in emulation works (at least on pcsx2 and dolphin but I would assume in general). The interrupt or intercept or whatever is done before rendering begins. So ultimately it's equivalent to just rendering at a higher native resolution.
Huh? How does that happen before? You mean you set it before? The emulated program still runs the same as it originally did, attempting to run at its original resolution. The emulation layer just does the translation to allow the up-rendering.
 
On the FPS front, brute force accepted, unless there is a good reason why high FPS is needed (I'm thinking motion sickness from first-person or driving games), I tend to prefer 30 FPS and wish devs would lock their games there to stop sub-30 stutters. The more time the CPU has for AI, NPCs, physics, and any number of other world-building goodness, the better for me.

I'd be happy with 900p30 on PS4 and 1080p30 on PS4.5 with more AI, NPCs, physics, etc. - but I accept that would mean a lot more work for devs to build those extras in, rather than using the extra oomph to simply double the frame rate.

You're under a false assumption here about where the bottleneck is. Framerate is often GPU-bound and not CPU-bound. The AI, NPCs, and (depending on how they do it) the physics might be fine running on the CPU, with the CPU just waiting around for the GPU to finish. So dropping the framerate doesn't automatically mean you'll get better AI, NPCs, physics, etc.
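A toy model of what I mean (made-up millisecond numbers):

```python
# Toy model: CPU and GPU work on a frame in parallel, so frame time is
# set by the slower of the two. The numbers here are invented.

def frame_time_ms(cpu_ms, gpu_ms):
    return max(cpu_ms, gpu_ms)

# A GPU-bound frame: the CPU finishes AI/physics early and just waits.
print(frame_time_ms(cpu_ms=9.0, gpu_ms=15.0))   # 15.0 -> GPU is the limit
# Relaxing the framerate target buys the CPU nothing here; it already had
# ~6 ms of idle time per frame it could have spent on AI, NPCs, physics.
```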
 
All it takes is a linux-enabled/jailbroken ps3 and a jailbroken wii/wiiU to see how the CPUs in those stack up against each other in whatever workload. Common folk have been measuring the performance of all gen 7 CPUs in various workloads (e.g. http://7-cpu.com/ - note how for that test the PE has the effective performance of a dual-core Cortex A7 @ 1GHz). Then again, it's much more convenient to regurgitate anonymous (dis)information, particularly when one has zero understanding of the subject, and the only way they could ride their soapbox is by references to anonymous authority, whether that has an ounce of credibility or not. How ninjablade of you.
Interesting link.

So, 7 Jaguar cores @ 1.6 GHz are a lot faster than the single core, dual threaded Cell PPE @ 3.2 GHz, right?

I'm talking about MIPS, not flops. SPUs are obsolete in the GPGPU era.

There's a reason Uncharted 4 has way more intricate/advanced AI than the previous ones. ;)

Hopefully this will stop the "Cell is better than Jaguar" nonsense... there's no way you could run Uncharted 4 AI on the Cell PPE.
 

Hexa

Member
Huh? How does that happen before? You mean you set it before? The emulated program still runs the same as it originally did, attempting to run at its original resolution. The emulation layer just does the translation to allow the up-rendering.

I meant just for 3D rendering. My bad. The resolution for that is set before it starts rendering and is equivalent to rendering at a higher native resolution, with nothing to do with the original resolution. I don't know much about the rest, so I'll take your word for it.

So up-resing is rendering the 3D portions at a higher native resolution, having everything else in the code render at a lower resolution, and then upscaling that before frame composition? How does that work with post-processing effects?

Your use of the word translation is confusing me though, because that just means moving, not changing the scale or anything.

You're under a false assumption here about where the bottleneck is. Framerate is often GPU-bound and not CPU-bound. The AI, NPCs, and (depending on how they do it) the physics might be fine running on the CPU, with the CPU just waiting around for the GPU to finish. So dropping the framerate doesn't automatically mean you'll get better AI, NPCs, physics, etc.

The PS4's CPU is a lot weaker than its GPU, so it's a lot more likely to be bottlenecked by the CPU than the GPU. That's also what developers have been saying.
http://hothardware.com/news/ubisoft...e-a-limiting-factor-in-assassins-creed-unity-
 

DeepEnigma

Gold Member
Let me take another crack at why this isn't much of a concern for me. Granted we are operating on a limited information set, and I will try and put things in as simple terms as I can, and will probably fail.

A 30 FPS game has literally twice the amount of time to render a frame as a 60 FPS game. It doesn't matter how powerful the machine you are using is: you can do more in 33.3 ms than you can in 16.6 ms. Meaning to get a game targeted at 30 FPS on one machine running at 60 FPS on another machine, you need double the performance, ignoring optimization and talking raw numbers.

Doubling your resource budget (again, we don't know specifics or CPU yet, but talking roughly here) means that any game that currently runs at 30 FPS will be capable of running at 60 FPS with little to no work required. Almost as if this was done intentionally... like some sort of systems architect thought these things through. Granted, this doesn't solve other problems like game systems being tied to framerate, but we are talking generally here.

It is true that a developer could be foolish enough to say FUCK IT we are using our entire resource budget, we're targeting 30 FPS on PS4K. Screw the PS4! We miiight get 20 FPS out of it if we really try, but who gives a shit WE HAVE MORE POWER MUWAHAHA. And also true this will become a thing at some point. I would estimate 3+ years after the PS4K launches when the install base is enough that it would make sense to do so. Around 6+ years after the PS4 launched. Just my own speculation of when these time tables will come into play, of course. PS4K could sell 100 gorillion units in the first year and everything I've said goes out the window... buuut probably not.

Any developer targeting 30 FPS on PS4K at launch or shortly after would be insane. It has an install base of 0, it is just a poor business decision. First party specifically trying to push the PS4K with exclusive content could be a thing, sure. I would suspect that would be very limited. Naughty Dog, Santa Monica, Guerrilla, etc. as a whole are not going to ignore 35 million PS4s out there, they just aren't.

99% of your video games are just going to run better on PS4K. 99% will run exactly as they did on the PS4 if the PS4K never existed for the standard lifespan of a console. You are concerning yourself with a fraction of a possibility.

This.

This thread has turned into an internet fight over what constitutes upscaling vs uprendering, which means the shock is finally wearing off and people are moving on.

Lol, so true. Test phase one complete?
 
The PS4's CPU is a lot weaker than its GPU, so it's a lot more likely to be bottlenecked by the CPU than the GPU. That's also what developers have been saying.
http://hothardware.com/news/ubisoft...e-a-limiting-factor-in-assassins-creed-unity-

You're right that the CPU is a bit weak. I guess what I was trying to get at wasn't clear. I was just trying to say that things are often GPU-bound and not CPU-bound in performance, but this is also very game-dependent. So simply lowering the framerate doesn't always mean it's holding those things back. I can definitely see how, in a game like Assassin's Creed where they're doing things like complex crowd interaction, that might be pushing the CPU on the PS4. However, not everything needs to be updated at the same framerate as the rendering.
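For example, a common pattern is ticking the AI on a coarser clock than the renderer - a trivial sketch with illustrative rates, not anything from a real engine:

```python
# Decoupled update rates: render every frame at 60 Hz, but tick the AI
# at 10 Hz (every 6th frame). Rates are illustrative only.

RENDER_HZ = 60
AI_HZ = 10
FRAMES_PER_AI_TICK = RENDER_HZ // AI_HZ   # 6

ai_ticks = 0
for frame in range(RENDER_HZ):            # one simulated second of frames
    if frame % FRAMES_PER_AI_TICK == 0:   # AI runs on its coarser clock
        ai_ticks += 1
    # ...rendering would happen every iteration here...
print(f"{ai_ticks} AI ticks across {RENDER_HZ} rendered frames")   # 10
```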
 
The PS4's CPU is a lot weaker than its GPU, so it's a lot more likely to be bottlenecked by the CPU than the GPU. That's also what developers have been saying.
http://hothardware.com/news/ubisoft...e-a-limiting-factor-in-assassins-creed-unity-
I call BS.

The 900p cap on both consoles has nothing to do with the CPU.

Also, Unity is coded against a high-level API (DX11, GNMX) instead of a low-level one (DX12, GNM) that would allow more draw calls (lots of NPCs) with less CPU usage and better multithreading.

Ubisoft is full of shit.
 

blu

Wants the largest console games publisher to avoid Nintendo's platforms.
Interesting link.

So, 7 Jaguar cores @ 1.6 GHz are a lot faster than the single core, dual threaded Cell PPE @ 3.2 GHz, right?
In multi-threaded MIPS - by far. Sony are not idiots ; )

Hopefully this will stop the "Cell is better than Jaguar" nonsense... there's no way you could run Uncharted 4 AI on the Cell PPE.
Dunno about that. The only thing that could stop people from posting ridiculous statements on message boards is thinking/researching before posting, and it's always easier to be ignorant. Blessed are the ignorant, or something.
 
You're under a false assumption here about where the bottleneck is. Framerate is often GPU-bound and not CPU-bound. The AI, NPCs, and (depending on how they do it) the physics might be fine running on the CPU, with the CPU just waiting around for the GPU to finish. So dropping the framerate doesn't automatically mean you'll get better AI, NPCs, physics, etc.

Interesting to know. Must admit I wasn't thinking about bottlenecks, just logical thinking. 16 ms seems like no time at all. But then I haven't thought about how many instructions a CPU is going to blitz through in that time either, so I suppose it's hard, being a long-time lapsed programmer, not having a meaningful frame of reference. My thought process was simply: regardless of what the GPU is up to, if you can give the CPU extra time to do some work between frames, that's got to be a good thing?
 

onQ123

Member
So going with this logic, and with you saying "up-rendered it to 4K" in your previous post.

Let's say the OG PS4 will render certain games at native 1080p and the PS4K up-renders them to 4K.

Are you saying that "up-rendered it to 4K" means that not only will the PS4K output a 4K signal, but it will also render at an internal resolution of 4K?

When the pixel counters go to count the pixels it will be 8,294,400 pixels & not 2,073,600 pixels scaled across 8,294,400 pixels.
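The arithmetic, if anyone wants to check it:

```python
# Pixel counts behind those numbers:
native_4k = 3840 * 2160      # 8,294,400 pixels actually rendered
native_1080p = 1920 * 1080   # 2,073,600 pixels rendered, then stretched
print(native_4k, native_1080p, native_4k // native_1080p)
# 8294400 2073600 4 -- a 4K frame has 4x the pixels of a 1080p frame
```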
 

onQ123

Member
So I ask again, what's the performance difference between native rendering 4K and up-rendering 4K?

I guess you will have to wait & see what Sony is using to achieve this & what corners they cut to get to 4K in a small console that's under $500.

Just know that if the 4K was up-scaled we wouldn't even be hearing about it because that can already be done with your TV.
 
I guess you will have to wait & see what Sony is using to achieve this & what corners they cut to get to 4K in a small console that's under $500.

Just know that if the 4K was up-scaled we wouldn't even be hearing about it because that can already be done with your TV.

So you don't know what you're talking about is what you're saying, and you have no idea what the real difference is between native rendering and up-rendering. You just latched on to a buzzword without any real understanding of what it really means in the end.
 

Caayn

Member
I guess you will have to wait & see what Sony is using to achieve this & what corners they cut to get to 4K in a small console that's under $500.

Just know that if the 4K was up-scaled we wouldn't even be hearing about it because that can already be done with your TV.
So you're basing this "up-rendering" on the feeling that you wouldn't be hearing about 4K if it was just upscaled? Thanks for proving that you took a buzzword and ran with it without knowing what it actually meant.

For all we know the PS4K codename (if that's the real codename) just refers to its ability to output 4K video (i.e. have an HDMI 2.0 port, a UHD Blu-ray drive and the necessary hardware to decode the required codecs in hardware).

Codenames tell you nothing about the workings of the end product that we as consumers will see.
 

The God

Member
Co-founder of Bioware, Greg Zeschuk, had some things to say about this

"I'd say that'd be a gigantic pain in the ass that flies in the face of the purpose of consoles," he said. "It's funny, there's actually some stories behind that. For example, the original Xbox...Microsoft actually had multiple different DVD drives. They didn't tell anyone that, but as a developer you discovered that you have different performance and sometimes you'd have these boxes of refurbished drives and different brands and different equipment. It caused incredible variability."
Zeschuk went on to say the benefit of having locked system specs as consoles currently do is that it's clear to developers what they are working with.

"The whole purpose of consoles is the set of requirements that you work against from a hardware perspective," he said. "To change that is complete lunacy."
"I just think it's bad," he said. "I think, 'lock it' and let developers do their thing. But at the end of the day, if you can focus your development effort on one set of hardware requirements and target, you are going to get a better result. It's easier than having to split it, adding more people, having to port things across."

"It's like dipping your toe back into the PC pool where you have to consider all these things. It was nice on console not having to consider like performance sliders. But it's just crazy. I guess maybe [Microsoft and Sony] feel the need to."

From a consumer perspective, if Microsoft or Sony were to release a new console in the middle of an existing cycle, that would be "really irritating," Zeschuk said. While Apple may be able to get millions of people to buy new phones every year, and mid-cycle hardware upgrades already have a history at Nintendo with the DS, console upgrades on a short interval may be a long shot.

"I don't think they can pull an Apple and get you to upgrade mid-cycle."

more at the link http://www.gamespot.com/articles/bioware-founder-on-ps4xbox-one-upgrades-itd-be-a-g/1100-6438664/
 

onQ123

Member
So you don't know what you're talking about is what you're saying, and you have no idea what the real difference is between native rendering and up-rendering. You just latched on to a buzzword without any real understanding of what it really means in the end.

I explained to you what it means & it's not a buzzword. Why would it even be a buzzword? Why would someone want to claim up-rendering when they could claim that the game is native 4K?
 
I explained to you what it means & it's not a buzzword. Why would it even be a buzzword? Why would someone want to claim up-rendering when they could claim that the game is native 4K?

You explained that they simply render it at a higher resolution, which doesn't explain any difference between native rendering at 4K and up-rendering at 4K. You can't even explain how performance differs between the two techniques. So yes, you ran with a buzzword without any understanding of what it actually means or how it works, what the costs are, let alone what the trade-offs are. You're simply saying it just is, without telling us why it is. Now your only justification to defend it is simply: why would someone say up-rendered 4K?
 

geordiemp

Member
Sounds like a posh way of saying upscaling.

Sure, there are a few ways to upscale more effectively, but it is what it is, and I would think no single upscale algorithm would be best across different genres (3D racing game vs 2D game vs etc... you get the idea).
 

Tyl3n0L85

Neo Member
That up-rendering debate is funny to watch and interesting. I don't know anything about it; however, it would be nice to have someone knowledgeable come here and properly explain the differences between native, upscaling and up-rendering, because I've never heard of up-rendering before and it seems this discussion isn't going anywhere.
 

geordiemp

Member
Co-founder of Bioware, Greg Zeschuk, had some things to say about this

more at the link http://www.gamespot.com/articles/bioware-founder-on-ps4xbox-one-upgrades-itd-be-a-g/1100-6438664/

It depends on what the upgrade is as to whether it will fly. If it's a 14 nm APU with

Zen = double power CPU
GDDR5x = double bandwidth
GPU = double cores

Then if it's optimised well at 14 nm it should be good for 60 FPS AAA gaming - I am sure that would sell well, developers allowing the option of course. As many PC versions get 60, I can't see why not.

If they half-ass it or leave any one of the upgrades on the table, then there is no point imo, as that will be the bottleneck.

Bioware's last game, Dragon Age, had a lot of dips at 30 FPS using Frostbite. At 60 this would be sweet.
 

Hexa

Member
That up-rendering debate is funny to watch and interesting. I don't know anything about it; however, it would be nice to have someone knowledgeable come here and properly explain the differences between native, upscaling and up-rendering, because I've never heard of up-rendering before and it seems this discussion isn't going anywhere.

Yeah. I'm just all around confused, because the image on the previous page is just upscaling and that's what it started out as. But now we're talking about rendering at a higher resolution with some hacks. WTF lol.
 
Co-founder of Bioware, Greg Zeschuk, had some things to say about this

more at the link http://www.gamespot.com/articles/bioware-founder-on-ps4xbox-one-upgrades-itd-be-a-g/1100-6438664/

With all respect to the doc, Microsoft using different optical drives with different performance and not telling anyone is the exact opposite of explicitly using hardware with different performance and telling everyone, particularly developers, about it.

The rest of what he addresses we have already discussed ad nauseam, but I respectfully disagree that it is as difficult an engineering problem as he suggests. Releasing on a completely different platform, even with similar specs, is much more of a pain in the ass. That hasn't stopped games coming out on Xbox One.


edit: it isn't painless, but come on. On a 10-point pain scale we are talking like a 2, tops. And that is if your engine is a goddamn mess.
We gotta use doctor lingo now, you know, show some respect.
 
Great comments and backs up what some of us have been saying. Maybe it'll actually sink in for those who think it'll be painless for devs.

Nah, clearly anyone who isn't positive towards the idea is incorrect and doesn't have any validity. All you do is configure an ini file and you're done!

Yeah. I'm just all around confused, because the image on the previous page is just upscaling and that's what it started out as. But now we're talking about rendering at a higher resolution with some hacks. WTF lol.

Ya, and he can't explain it in any detail past the 20-foot-pole explanation. The worst part is, like you said, the image was showing a form of what looks like upscaling, because it's sampling pixels to interpolate how to draw new pixels.
 