
PS4K information (~2x GPU power w/ clock+, new CPU, price, tent. Q1 2017)

Panajev2001a

GAF's Pleasant Genius
3-5 times the TDP? I just posted in another thread, but a small form factor PC for just over $440, capable of outperforming the PS4 on nearly a 2:1 ratio, only has a power requirement of around 220W. The PS4 afaik gets up to ~150W during play.

Ok, so a recent small form factor PC costing almost $100 more and consuming at least 70 watts more can in some cases outperform almost 3-year-old hardware built on a comparatively outdated manufacturing process... and what is your point exactly? That PS4 is short-changing users because?
 

onQ123

Member
Just got through looking over the "uprendering" patent. I need to point out a few things to you, starting with some basic clarification.

1. Uprendering is a process where an original image source (say 1080p) is rendered 4 times with subpixel shifts, then these 4 images are combined to create a single higher-resolution output (say 4K).

2. The reason this was implemented for Sony's PS4 emulation of PS2 games is because some PS2 games were coded in such a manner that they needed to retain a frame of the original source with the correct image size for post processing.

3. This process would be inherently more strenuous on memory and processing than rendering natively at 4K, due to all five frames being held in memory during the creation of the final output frame and the processing needed to perform that task (rendering the same number of total pixels, performing subpixel shifts, and recombining them).

It's a great tool for Sony to use to render games at higher resolution requiring minimal patching in certain situations where it is required for the game to render properly, but it is more costly in terms of power required than simply changing the native rendering resolution.

I don't believe recent games would require this particular method for a patchless higher-resolution render. Even so, it or some variation of it may be used.

Bottom line: It's a more expensive way to increase the native rendering resolution, but could potentially provide a patchless method to do so in cases where simply increasing the native rendering resolution would cause errors. It should be avoided if possible.
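For the curious, the 2x2 case described in point 1 can be sketched in a few lines of NumPy. This is only an illustration of the subpixel-shift-and-interleave idea, not Sony's actual implementation; the `render` function is a made-up stand-in for a real renderer:

```python
import numpy as np

def render(width, height, dx, dy):
    # Stand-in for a real renderer: sample a continuous "scene"
    # (here a smooth trig gradient) at pixel centers shifted by (dx, dy).
    ys, xs = np.mgrid[0:height, 0:width].astype(float)
    return np.sin((xs + dx) * 0.1) + np.cos((ys + dy) * 0.1)

def uprender_2x2(width, height):
    # Render the same scene four times with half-pixel shifts...
    shifts = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (0.5, 0.5)]
    frames = [render(width, height, dx, dy) for dx, dy in shifts]
    # ...then interleave the four frames into one double-resolution image.
    # Note that all five buffers exist at once, which is the memory cost
    # point 3 above is talking about.
    out = np.empty((height * 2, width * 2))
    out[0::2, 0::2] = frames[0]   # (0.0, 0.0) samples
    out[0::2, 1::2] = frames[1]   # (0.5, 0.0) samples
    out[1::2, 0::2] = frames[2]   # (0.0, 0.5) samples
    out[1::2, 1::2] = frames[3]   # (0.5, 0.5) samples
    return out

hi = uprender_2x2(960, 540)
print(hi.shape)  # (1080, 1920)
```

Each of the four passes is a full render, which is why the total work is at least that of rendering the high-res frame directly.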

You don't have to point anything out to me, & this is just one form of up-rendering; it's not the only way it can be done.
 

Elandyll

Banned
Where are you getting this second GPU stuff from?
An insider hinted that the chosen solution was SLI-based, which implies a doubling of the current card, bridged.

Also makes sense, as it implies the devs could choose to make a non-PS4K-optimized game (PS4 mode) or choose to use SLI.

It also falls in line with insiders hinting that regular PS4 games would receive little to no upgrade afaik (beside the 4K scaling for 4K TVs)
 

big_z

Member
Where are you getting this second GPU stuff from?

[image]



It's a stacked GPU

are you misterxmedia's Sony cousin?
 
It also falls in line with insiders hinting that regular PS4 games would receive little to no upgrade afaik (beside the 4K scaling for 4K TVs)

Silly question from a clueless user - If a game is running at 900p on the PS4, and (forget second GPUs for a moment) a PS4.5 had a faster GPU, would it automagically start rendering at 1080p because the new GPU can handle it - or would the game need patching to render at the new resolution?

I thought, with the possible exception of frame rate improvements, it stood to reason a game wouldn't take advantage of the new hardware without patching? A game's not gonna magically do something its code isn't telling it to do.
 
I'm guessing 4K is the new thing, but it's a fair point that pushing 4K for this unit is strange, considering games are definitely not going to be 4K a majority of the time, if at all, even for direct up-ports from PS4... I expect 1440p at the very most in regards to that kind of resolution increase, and maybe a performance upgrade if we're talking about a CPU upgrade as well

Still, to be fair, very few games on PS3 were 1080p, yet it was still the console of the 1080p era.

It's the trojan horse for Sony's 4K media platform as well.

4K media delivery is extremely uncommon at the moment, but it's also the future, very near future at that. If Sony wants to be a part of it, a big part of it, they should and will start now.
 

QaaQer

Member
I think the most likely scenario is this.

There won't be a more powerful PS4K console. The next revision of PS4 will be able to play Ultra HD Blu-ray movies, but will have the exact same core specs as PS4. Will not split the market. Will not piss anyone off.

This is what I originally thought until Matt chimed in. He really is the only gaffer I have 100% confidence in. Although, I may be reading too much into what he actually wrote.

PS5 will arrive in Fall 2020 (using a 10nm node for its APU) and will be a full generational leap over PS4. Compared with PS4's 8-core Jaguar CPU @ 1.6 GHz, PS5 will have 8-12 Zen+ or Zen 2 cores @ 2.8~3.2 GHz, coupled with a GPU based on Navi (the GPU architecture that comes after Polaris and Vega) and whatever AMD is calling "NextGen" memory, which I guess is something beyond the HBM2 standard (HBM3 or something else). 1.5 to 2 TB/sec bandwidth

IDK if we are ever going to get a real generational leap again. According to The Economist, cost per transistor stopped going down in late 2012. So, going forward, using smaller nodes will not yield cost savings. Yes, you can jam more transistors into each mm^2, but it will cost more than an equal number on a larger process. So expecting a PS5 to be 18TF @ $399 in 2013 USD, unsubsidized, is not going to happen. This is also the reason those expecting the NX to be more powerful for less money are going to be let down.

We will have better designed chips, better memory configurations, and better software; but this will not yield a cheap 18TF machine in the next 3-4 years.

We are in the era of stagnation.
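The cost-per-transistor argument above can be made concrete with some rough arithmetic. These numbers are invented purely for illustration (real foundry pricing isn't public); the point is only the shape of the math:

```python
# Hypothetical: a node shrink doubles transistor density, but the
# fabrication cost per mm^2 also roughly doubles, so the shrink no
# longer buys cheaper transistors the way it did historically.
old_density = 10e6    # transistors per mm^2 (made up)
old_cost_mm2 = 0.10   # USD per mm^2 (made up)

new_density = old_density * 2.0
new_cost_mm2 = old_cost_mm2 * 2.0

old_cpt = old_cost_mm2 / old_density   # cost per transistor, old node
new_cpt = new_cost_mm2 / new_density   # cost per transistor, new node
print(new_cpt / old_cpt)  # 1.0 -> same cost per transistor, no savings
```

Under those assumptions, a chip with twice the transistors costs twice as much, which is the "no cheap 18TF machine" conclusion in a nutshell.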

2-4 TB HDD.

Do you have any reason to think non-SSD HDD prices will be coming down? It seems like they have been stuck since 2012.

A nice big palette of resources for devs to build the experiences they want for the early and mid 2020s, be it 4K games, 1080p games or VR games.

If the mass market sweet spot (price/performance) is 5-8TF from 2020 to 2030, is that enough for 4K games and substantial VR?
 
Do you have any reason to think non-SSD HDD prices will be coming down? It seems like they have been stuck since 2012.

Hm?

The sizes available are much better than in 2012, at the same prices as the previously smaller sizes.

Maybe since 2015 prices have not changed so much, but that's only a year ago.

I bought a 2 TB laptop-size drive for 90 USD just a few months ago; no way such a price existed in 2012. 1 TB external drives in 2012 were going for $100. Now those are 50 and 60 bucks for 1 TB.
 
Ok, so a recent small form factor PC costing almost $100 more and consuming at least 70 watts more can in some cases outperform almost 3-year-old hardware built on a comparatively outdated manufacturing process... and what is your point exactly? That PS4 is short-changing users because?

No, you were the one to make a statement that wasn't entirely accurate; I was simply providing more information. Posts like yours only serve to cause more misunderstanding about PCs.
 

onQ123

Member
Silly question from a clueless user - If a game is running at 900p on the PS4, and (forget second GPUs for a moment) a PS4.5 had a faster GPU, would it automagically start rendering at 1080p because the new GPU can handle it - or would the game need patching to render at the new resolution?

I thought, with the possible exception of frame rate improvements, it stood to reason a game wouldn't take advantage of the new hardware without patching? A game's not gonna magically do something its code isn't telling it to do.

If the 2x2 up-rendering algorithm is system-level, it will probably take it to 3200 x 1800 & upscale it to 4K or downscale it to 1080p.
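To spell out the arithmetic in that 2x2 scenario (assuming a 1600x900 base frame, i.e. the 900p case being discussed):

```python
# Hypothetical 900p base frame put through a 2x2 up-render.
base_w, base_h = 1600, 900
up_w, up_h = base_w * 2, base_h * 2
print((up_w, up_h))              # (3200, 1800)

# The hardware scaler then bridges the remaining gap in either direction:
print(3840 / up_w, 2160 / up_h)  # 1.2 1.2 -> mild upscale to 4K
print(1920 / up_w, 1080 / up_h)  # 0.6 0.6 -> downscale to 1080p
```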
 

QaaQer

Member
Hm?

The sizes available are much better than in 2012, at the same prices as the previously smaller sizes.

Maybe since 2015 prices have not changed so much, but that's only a year ago.

I bought a 2 TB laptop-size drive for 90 USD just a few months ago; no way such a price existed in 2012. 1 TB external drives in 2012 were going for $100. Now those are 50 and 60 bucks for 1 TB.

I only know Western Digital with the full 5-year warranty, and only desktop drives, and those prices haven't moved much at all, so I'm probably wrong to extend that to other drives.
 
I only know Western Digital with the full 5-year warranty, and only desktop drives, and those prices haven't moved much at all, so I'm probably wrong to extend that to other drives.

WD drives are 1 TB for 50 to 60 USD in the US. Portable drives though.

They also have many different lines of drives, both higher-performance and eco drives.
 
Most only have 2-3 year warranties, like I said. And I'm in Canada, so there is also that.

Here is a chart from http://www.mkomo.com/cost-per-gigabyte-update

[chart: HDD cost per gigabyte over time, USD]


So they have come down, but much slower than is the historical norm.

I'm from Canada too, but I travel to the US (bought my stuff there). You have to consider that our dollar dropped to 70 cents US from 1.10 over 2011-12, not coincidentally the period you are comparing to.

So you're mixing up two phenomena. Pricing has dropped considerably overall. So has the CAD. That's why the drop isn't apparent in Canada.

Don't forget you're on an international forum. Location and currency means a lot for pricing.

Also consider why that slope looks like an S: it has to do with the type of chart you are looking at (logarithmic). Per-gigabyte storage price does not look like that on a linear chart.

Sizes have increased by full terabytes in the last few years, compared to only tens of GB previously, so you should know that a linear chart would not look like this.

On a linear chart, per-gigabyte prices look like they are dropping exponentially every year, but that exponential curve gets flattened out on a logarithmic scale, so it isn't apparent there.
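The log-scale point is easy to demonstrate numerically: a price that halves on a fixed schedule plots as a straight line on a log axis, and the flattening "S" tail on that chart means the decline has slowed relative to that trend. A quick sketch with made-up illustrative prices (not real market data):

```python
import math

# Hypothetical price per GB halving every 2 years from 2000 to 2012,
# then declining only ~10%/year afterwards (mimicking the chart's tail).
years = list(range(2000, 2017))
prices = [10.0 * 0.5 ** ((y - 2000) / 2) if y <= 2012
          else 10.0 * 0.5 ** 6 * 0.9 ** (y - 2012)
          for y in years]

# During the steady-halving era, log(price) falls by a constant amount
# per year -> a straight line on a log chart:
logs = [math.log(p) for p in prices[:13]]
diffs = [logs[i + 1] - logs[i] for i in range(len(logs) - 1)]
print(all(abs(d - diffs[0]) < 1e-9 for d in diffs))  # True: constant slope

# After 2012 prices still drop on a linear scale, just far more slowly,
# which shows up as the flattening tail on the log chart.
print(prices[13] < prices[12])  # True
```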
 

onQ123

Member
Osiris is just watching the thread burn.

Something is up. Everyone with inside info is holding the new information back for some reason. They step in, mention that there is new information, then never post it.


it's as if someone is telling them not to let this information out.


I think it's an early release date
 

poodpick

Member
It would be great if the PS4K had a second HDMI output strictly for audio so that you could get lossless audio without upgrading your receiver.
 

THE:MILKMAN

Member
Something is up. Everyone with inside info is holding the new information back for some reason. They step in, mention that there is new information, then never post it.


it's as if someone is telling them not to let this information out.


I think it's an early release date

The curious thing for me is the complete lack of other sources. With PS4 we had the controller/dev kit leak in January '13, way before any retail leaks, and apart from that there were multiple sources/many insiders/journalists talking.

The NX controller leak really does make me wonder about all this.
 

Metfanant

Member
I don't think any of this is true. Yes, the original image posted shows an upscaling process. But if you actually read the patent, that's just a part of it. The important thing appears to be when that step is applied.

Unlike traditional upscaling, they're not (always) taking the final frame buffer and blowing it up. Instead they could be intercepting an initial or intermediate render target and upscaling that, then finishing the rendering process with further passes on this scaled target.

This is not exactly like upscaling, because the blowup isn't happening to the final frame. It's more similar to native rendering at a higher resolution, since there's a buffer at high res having effects applied. But it's not exactly that either, because the source of that high-res buffer is an upscale of a low-res buffer.

So it's a combination of upscaling and native rendering...hence "uprendering". It's not an elegant name, but neither does it seem meaningless like some folks believe.

something cannot be "exactly like upscaling" because there are eleventy different ways to upscale an image...this is one of them...period...advanced? yes...interesting? yes...a different process altogether? no...

You realize that things have been explained to them clear as day & they revert right back to acting as if I'm just making up words? This is a lost cause; all the information is right there in their face. In the next year or so they will be acting like they are experts on the subject.

The posts from a few pages back still apply.

you need to look in the mirror, fella... it's the other way around...

See my post from earlier? I think the second GPU or acceleration hardware is going to be used to push rendering to 4K.

second GPU...ok man...come on now...

It's not - the mathematical equivalent is accumulation AA, but instead of AA they resolve the samples into a higher-resolution image.
Of course, that also makes it mathematically equivalent to supersampling (so sure, people comparing it to "render in higher resolution" aren't completely wrong), however there are two notable differences:
1) There are no ordered-grid limitations like with supersampling - you can use non-regular sample distributions, which usually leads to visually better results than plain resolution scaling
2) At some point in your render pipeline, processing is split to process the scene multiple times. It's pretty much a guarantee that this is more expensive than normal rendering at a higher resolution would be (at best, you could hope for the cost to be the same).
I.e. this is not a shortcut to higher-resolution rendering for cheaper - it's actually more expensive, but the tradeoff in the case of emulation is higher compatibility, and as mentioned, likely higher quality for the same pixel count.
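The accumulation idea in that quote can be sketched in NumPy. This is a minimal illustration, not anyone's actual pipeline; `sample_scene` is a made-up stand-in for a full rendering pass, and the jitter shows the non-regular sample distributions point 1 refers to:

```python
import random
import numpy as np

def sample_scene(xs, ys):
    # Stand-in for one full rendering pass, evaluated at arbitrary
    # sub-pixel positions (a real engine would rasterize the scene).
    return np.sin(xs * 0.1) + np.cos(ys * 0.1)

def accumulation_resolve(width, height, passes=4, seed=0):
    # Render the scene `passes` times at randomly jittered sub-pixel
    # offsets (a non-regular distribution, unlike ordered-grid SSAA),
    # then resolve by averaging. Resolving the same samples into a 2x2
    # higher-resolution grid instead would give the "uprendered" image
    # the quote describes.
    rng = random.Random(seed)
    ys, xs = np.mgrid[0:height, 0:width].astype(float)
    acc = np.zeros((height, width))
    for _ in range(passes):
        dx, dy = rng.random(), rng.random()
        acc += sample_scene(xs + dx, ys + dy)
    return acc / passes

img = accumulation_resolve(64, 64)
print(img.shape)  # (64, 64)
```

Each pass is a full render of the scene, which is why the quote says this is guaranteed to cost at least as much as rendering once at the higher resolution.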


It's nothing like upscaling, see above.

this makes no sense... no sense at all... if it's MORE taxing from a hardware standpoint, then why not just render in 4K to begin with??...

you CANNOT... I repeat, CANNOT match the results of a native render with any sort of upscaling technique. It is absolutely impossible. You are using algorithms to approximate details that are not actually present in the image...

because this is the case, why on EARTH would you go with a more taxing process when you could just render the game natively at 4K and have the REAL details present?

I didn't say they weren't, I'm just saying they don't need to. It's more so fluff in my eyes. Don't give me a TV with a greater response time because you're wasting time improving an image which scales perfectly.

Well, correct me if I'm wrong, but I can't think of any other game/thing which does the same as QB. KZ was similar but was interlaced, not the same.

I don't see how it's different. You could take the same single picture of the same frame and upscale it, or use a panorama to create the same image, but stitched together at a higher resolution. I feel like it's the best analogy to use in this sense.

ok, I think you're confusing some things... there are lots of ways to make a panoramic image... but when you're talking about stitching images together...

here is a panoramic image that is "stitched" together... I used this one because you can clearly see the breaks where the image is put together... yes, the resulting image would be of a higher (horizontal) resolution... but this isn't what we're talking about at all...
[image: stitched panorama of downtown Philadelphia, 1913]




THIS...is taking the same frame at two different resolutions (obviously exaggerated for effect)

[image: the same frame captured at two different resolutions]




Just got through looking over the "uprendering" patent. I need to point out a few things to you, starting with some basic clarification.

1. Uprendering is a process where an original image source (say 1080p) is rendered 4 times with subpixel shifts, then these 4 images are combined to create a single higher-resolution output (say 4K).

2. The reason this was implemented for Sony's PS4 emulation of PS2 games is because some PS2 games were coded in such a manner that they needed to retain a frame of the original source with the correct image size for post processing.

3. This process would be inherently more strenuous on memory and processing than rendering natively at 4K, due to all five frames being held in memory during the creation of the final output frame and the processing needed to perform that task (rendering the same number of total pixels, performing subpixel shifts, and recombining them).

It's a great tool for Sony to use to render games at higher resolution requiring minimal patching in certain situations where it is required for the game to render properly, but it is more costly in terms of power required than simply changing the native rendering resolution.

I don't believe recent games would require this particular method for a patchless higher-resolution render. Even so, it or some variation of it may be used.

Bottom line: It's a more expensive way to increase the native rendering resolution, but could potentially provide a patchless method to do so in cases where simply increasing the native rendering resolution would cause errors. It should be avoided if possible.

like I said before... if this is a more taxing process... it makes no sense for the PS4K... because it's not even going to be able to render most games at 4K natively... so no way it's got the horsepower to do something even more taxing...

If the 2x2 up-rendering algorithm is system-level, it will probably take it to 3200 x 1800 & upscale it to 4K or downscale it to 1080p.

the PS4K WILL NOT have the horsepower to do this...

Stayed out of here for a while; now I'm trying to catch up and see if anyone else dropped the other pieces.
do tell....
 

onQ123

Member
The curious thing for me is the complete lack of other sources. With PS4 we had the controller/dev kit leak in January '13, way before any retail leaks, and apart from that there were multiple sources/many insiders/journalists talking.

The NX controller leak really does make me wonder about all this.


I think because it's like a PS4 slim & already has games that can be played on the system, they don't have to send out devkits to all the devs so early.
 

THE:MILKMAN

Member
I think because it's like a PS4 slim & already has games that can be played on the system, they don't have to send out devkits to all the devs so early.

Sorry onQ, not following? The original Kotaku story stated new dev kits were in dev hands. If new kits were needed, I'd think the changes are substantial.
 
You realize that things have been explained to them clear as day...
I don't agree. Your posts have been vague and handwavy, and furthermore I strongly doubt the technique described in the patent has anything to do with PS4K. It's specifically meant to address issues with upscaling old emulated games.

It's not - the mathematical equivalent is accumulation AA, but instead of AA they resolve the samples into a higher-resolution image.

...It's nothing like upscaling, see above.
The process as a whole is very different, yes, and more like native rendering. I said that in my post. But one of the steps is very much like upscaling. In your words: "they resolve the samples into a higher-resolution image."

I guess we could go with current trends and refer to this as "reprojection" rather than "upscaling", but I personally think that term may be overapplied. The accumulation methods used here and in Rainbow 6, Quantum Break, FF XV, etc. seem to differ fundamentally from what was used in Killzone: SF multiplayer. As I understand it, only that game uses motion vector prediction, and only that game includes natively rendered pixels in every frame. The results at least seem to be qualitatively sharper.

Wait a freaking moment. Don't many games already do it? How did Sony patent this?
Because it explicitly applies only to emulation.
 

THE:MILKMAN

Member
With only a few devs with the devkits on a console that might release in a few months?

Sure. If the current PS4 is still the base system and the PS4K is the "premium" model for, say, the first year, I could see only WWS and select big third parties getting/wanting new kits (EA, Activision, Ubisoft, etc.)

Also, wasn't it said by those in the press that they'd heard whispers of the PS4K for some time? So by the time this launches, we could be talking about 18+ months of these new kits being in first-party dev hands.
 
Woah woah woah.

None of this has to do with reprojection. We should erase that term from the discussion of upscaling/up-rendering/native resolution.

Reprojection is just using the same frame two or more times successively to emulate the appearance of motion with VR head movement. That's not anywhere close to the discussion of resolution.

At least as far as I understand.
 

Metfanant

Member
@Metfanant OK "Smart Guy" tell me something:


How is it upscaling when the original pixels remain the same size?

It's actually quite simple really...

You start with an image... apply a set of algorithms... the result is an image with a higher resolution than the original image...

Upscaling...

You're still dealing with a large number of "fake" pixels..
 
Been reading the last 2-3 pages and people are getting on onQ123, even though he used the correct word.
But a picture is worth a thousand words:

Upscaled:
[image]


Uprendered/Upresed:
[image]



Upscaled:
[image]


Uprendered/Upresed:
[image]

Upscaling is basically taking the final output image and displaying it at your display resolution, which results in a quality loss. Upres/uprendering is basically rendering the 3D scene/image at a higher resolution and then displaying the final output image at native res.
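That distinction can be shown in code terms with a toy example. The `render` function below is a made-up stand-in (a continuous function sampled per pixel, so it's resolution-independent), and nearest-neighbour is chosen only to keep the upscaler obvious:

```python
import numpy as np

def render(width, height):
    # Toy "3D scene": a continuous function sampled once per pixel,
    # at resolution-independent (normalized) coordinates.
    ys, xs = np.mgrid[0:height, 0:width].astype(float)
    return np.sin(2 * np.pi * xs / width) * np.cos(2 * np.pi * ys / height)

def upscale_nearest(img, factor):
    # Upscaling: stretch the *finished* low-res frame. No new scene
    # information; each source pixel is repeated factor x factor times.
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

low = render(8, 8)
upscaled = upscale_nearest(low, 2)   # 16x16 frame, but only 64 real samples
uprendered = render(16, 16)          # 16x16 frame, 256 real samples

print(upscaled.shape == uprendered.shape)       # True: same pixel count
print(bool(np.allclose(upscaled, uprendered)))  # False: detail differs
```

Both outputs are 16x16, but only the uprendered one actually sampled the scene at the new pixel positions; the upscaled one just repeats the old samples.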
 

Metfanant

Member
Been reading the last 2-3 pages and people are getting on onQ123, even though he used the correct word.
But a picture is worth a thousand words:



Upscaling is basically taking the final output image and displaying it at your display resolution, which results in a quality loss. Upres/uprendering is basically rendering the 3D scene/image at a higher resolution and then displaying the final output image at native res.

No... there are a million different ways to upscale an image...

If you're starting with a lower-resolution image and creating a larger-resolution image, whether you call it upscaling, up-rendering, or upressing, there will ALWAYS be a loss of quality...

Unless you're moving between two resolutions that divide equally, and the screen size remains the same (i.e. pixel count quadruples going from 1080p --> 4K)...

Then you can achieve identical quality, but none of the additional image detail you would get from a native 4K image

There is a big difference between actually RENDERING something at a higher resolution (like you can do with PC games)... and taking a series of 1080p frames, smashing them together with a fancy algorithm, and spitting out a fabricated 4K image
 