
Digital Foundry: Unreal Engine 4 PS4 vs. PC

Durante

Member
You have no idea what you're going on about, do you?
I have published papers at international conferences dealing with performance analysis on GPUs. But maybe the peer reviewers failed to recognize my total cluelessness.

Where do you think they get the initial flop ratings for their ALUs? Out of thin air?
No, they get them from how the hardware is designed. If one ALU is designed to retire N floating point operations per cycle, and there are M of them, then you get N*M theoretical FLOPs per cycle. It's not rocket science.
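
For illustration, here is that arithmetic with the widely reported PS4 figures plugged in (18 CUs of 64 lanes at 800 MHz, counting a fused multiply-add as two FLOPs; these inputs are press reports, not official specs):

#include <cstdio>

int main() {
    const double alus          = 18 * 64;  // M: ALUs (18 CUs x 64 lanes, as reported)
    const double flops_per_clk = 2.0;      // N: FLOPs each ALU retires per cycle (one FMA)
    const double clock_hz      = 800e6;    // assumed 800 MHz core clock

    // N * M * clock = theoretical peak throughput
    const double tflops = alus * flops_per_clk * clock_hz / 1e12;
    std::printf("Theoretical peak: %.3f TFLOPS\n", tflops);  // ~1.843 TFLOPS
}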

And Windows overhead is at the thread/process level. You are bordering on cluelessness.
Please enlighten me as to the impact of this thread/process level overhead. I currently have a student investigating the overhead of thread and process level synchronization primitives on Windows, I'm sure he'd be thrilled to compare notes with you.
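
If it helps, here is a minimal sketch of the kind of micro-benchmark such a study starts from: timing an uncontended lock/unlock pair on a standard mutex (which Windows implementations typically build on a critical section or SRW lock). The framing is illustrative only, not the student's actual methodology:

#include <chrono>
#include <cstdio>
#include <mutex>

int main() {
    std::mutex m;
    const int iters = 1000000;

    const auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < iters; ++i) {
        m.lock();    // never contended here, so this measures pure overhead
        m.unlock();
    }
    const auto t1 = std::chrono::steady_clock::now();

    const double ns = std::chrono::duration<double, std::nano>(t1 - t0).count();
    std::printf("~%.1f ns per uncontended lock/unlock pair\n", ns / iters);
}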
 

Triple U

Banned
Well, really, you don't "code to the metal"; it's just a much lower-level API on consoles. Coding to a low-level API and a single spec together really makes all the difference.


http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/ <- really, the whole article is a good read

But John Carmack is really the one who has been talking for years about how bad the APIs on PC are...

http://www.pcper.com/reviews/Editor...-Graphics-Ray-Tracing-Voxels-and-more/Intervi

Just think: in PC gaming, a driver update alone can give you a 30% performance improvement in a game.

The 680 is more than double the performance of the GPU in the PS4. Epic's engine is also DX-based while the PS4 API is not. I wouldn't be shocked to see this demo run better on X720 hardware just based on that, at least until Epic has more time to port the engine over. Like Epic said, their engines aren't ready until they release a game on that platform.
Idk how many times these interviews have been posted but thank you for posting them again. I cringe every time someone writes coding to the metal off as a mere myth.
 
I want the new consoles to come out so we can all see that they're a huge improvement over current hardware, despite not being as powerful as the best PC gaming hardware, so that we can all get back to talking about games.

There is emotional investment on both sides

Yes but one side has reality to back it up.
 

KageMaru

Member
I'd really like to see what you've read.

There have been quotes and threads at B3D that have covered the topic. I've linked to them before and can track them down if I find the time while I'm at work.

You're free to believe what you want, but "coding to the metal" isn't what most think it is and it's not done as often as most think it is.

That's patently and fundamentally false. If it's been mentioned so much, I'd love to see some posts that challenge it. I'm gonna just go with: very few people here and elsewhere know exactly what "coding to the metal" is, and even fewer understand its implications for performance.

Funny thing is, the bolded applies to you very well. More often than not, actual coding to the metal is more time-consuming than the performance gain is worth. Otherwise we would see entire PS360 games written in assembly instead of using DX or LibGCM on the 360 and PS3 respectively (not that that would really be feasible anyway).
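
To make that tradeoff concrete, here's a hedged illustration (hypothetical code, not from any shipped game): the same loop written portably, where the compiler targets the hardware for you, and hand-written with x86 SSE intrinsics, which is more work, harder to maintain, and tied to one architecture:

#include <cstdio>
#include <immintrin.h>  // SSE intrinsics (x86 only)

// Portable C++: the compiler decides how this maps to the hardware.
float dot_portable(const float* a, const float* b, int n) {
    float acc = 0.0f;
    for (int i = 0; i < n; ++i) acc += a[i] * b[i];
    return acc;
}

// "Closer to the metal": hand-written SSE, four lanes per iteration.
// More effort for the same result, and it only runs on x86.
float dot_sse(const float* a, const float* b, int n) {
    __m128 acc = _mm_setzero_ps();
    int i = 0;
    for (; i + 4 <= n; i += 4)
        acc = _mm_add_ps(acc, _mm_mul_ps(_mm_loadu_ps(a + i), _mm_loadu_ps(b + i)));
    float out[4];
    _mm_storeu_ps(out, acc);
    float r = out[0] + out[1] + out[2] + out[3];
    for (; i < n; ++i) r += a[i] * b[i];  // scalar tail
    return r;
}

int main() {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {2, 2, 2, 2, 2, 2, 2, 2};
    std::printf("%.1f %.1f\n", dot_portable(a, b, 8), dot_sse(a, b, 8));  // 72.0 72.0
}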

Why is the PS4 getting all the blame for the removal of SVOGI and the overall degradation? I'm sure the rumored 1.2TF GPU in the Durango is going to match 1-to-1 with the PC demo.

I believe both systems were blamed, or at least that's the impression I got.
 

Bitmap Frogs

Mr. Community
It's not like UE4 is the only thing happening on PS4. Doesn't Enlighten Engine support dynamic radiosity on PS4?

PlayStation®4 promises a dramatic increase in memory and compute resource, which will free developers to unleash the full creative power of dynamic lighting and to finally extend cinematographic film practices to dynamic immersive worlds.

Sure, it also offers the option to go with prebaked lightmaps (or a combo), but I definitely believe we will see some kind of realtime GI on PS4 (most likely in the form of a prebaked/dynamic-in-certain-areas combo).

I just looked at the demo and it was sorta meh.
 

Triple U

Banned
I have published papers at international conferences dealing with performance analysis on GPUs. But maybe the peer reviewers failed to recognize my total cluelessness.

Maybe? IDK. You've proven nothing of your knowledge to me, so my assumptions are what they are *shrugs*

No, they get them from how the hardware is designed. If one ALU is designed to retire N floating point operations per cycle, and there are M of them, then you get N*M theoretical FLOPs per cycle. It's not rocket science.


Of course it's how the hardware is designed. Again, it doesn't come out of thin air. If you design an ALU to perform, say, a multiply-add, do you think they just take a leap of faith and say "welp, it's designed that way, it should work, next"? Or do you think they go through an extensive testing regime to make sure this ALU can do these three flops within the clock at an acceptable rate?

Please enlighten me as to the impact of this thread/process level overhead. I currently have a student investigating the overhead of thread and process level synchronization primitives on Windows, I'm sure he'd be thrilled to compare notes with you.

Send him my way :). We could discuss thread-level parallelism and some of Windows' pitfalls in that area: scheduling deficiencies, execution delays, or overall thread management. There's a whole world to explore.

KageMaru said:
Funny thing is, the bolded applies to you very well. More often than not, actual coding to the metal is more time-consuming than the performance gain is worth. Otherwise we would see entire PS360 games written in assembly instead of using DX or LibGCM on the 360 and PS3 respectively (not that that would really be feasible anyway).

Yeah, no. And as far as I know, only Sony's first party has done any meaningful work "to the metal" this generation. And they released a lot of their work to share.
 

BibiMaghoo

Member
I have no idea why Epic's engines are so popular.

The price they ask for their use on an AAA game is astronomical, unless of course they have deals for publishers outside their standard pricing, which may be the case.

For those that don't know, they want 25% of everything you make over $50k.
On a game that sells 3 or 4 million copies, that is a HUGE amount of money.
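
Back-of-envelope, under the terms quoted above (the $30 net per copy is purely an assumed figure for illustration):

#include <cstdio>

int main() {
    const double copies    = 3e6;   // a "3 or 4 million" seller
    const double net_price = 30.0;  // assumed net revenue per copy (illustration only)
    const double revenue   = copies * net_price;       // $90M
    const double royalty   = 0.25 * (revenue - 50e3);  // 25% of everything past $50k
    std::printf("Royalty owed: ~$%.1f million\n", royalty / 1e6);  // ~$22.5M
}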

On topic, I'm surprised it stacks up against the PC so well; I'm not disappointed that it doesn't match an expensive PC in performance.
 

fart town usa

Gold Member
Do the PC gamers who claim superiority over consoles *due to graphics* realize that PCs and consoles don't share the same software libraries?
 

RoboPlato

I'd be in the dick
Do we know how much faster Lightmass is going to be in UE4 than it was in UE3? I know it could take hours to bake in UE3, and Enlighten bakes in 20 minutes, so I'm hoping it's substantially faster. The SVOGI solution would have really sped things up.
 

Drazgul

Member
Also, Bethesda will use id Tech 5 for its future titles.

Source? I mean sure they own it, but it might not be the best choice for the types of games they make (and I'd rather take their Creation Engine over Tech 5 if it means better modding support).
 

Stallion Free

Cock Encumbered
I have no idea why Epic's engines are so popular.

The price they ask for their use on an AAA game is astronomical, unless of course they have deals for publishers outside their standard pricing, which may be the case.

For those that don't know, they want 25% of everything you make over $50k.
On a game that sells 3 or 4 million copies, that is a HUGE amount of money.

On topic, I'm surprised it stacks up against the PC so well; I'm not disappointed that it doesn't match an expensive PC in performance.
The engine is well documented and many devs have already been trained on it.

And you are looking at their indie licensing deal, not the actual publisher-level one.
 

BibiMaghoo

Member
The engine is well documented and many devs have already been trained on it.

And you are looking at their indie licensing deal, not the actual publisher-level one.

Sure, I guess they get a better deal then.

Do you know how a publisher pays for it in comparison, then? Is it a flat fee for a dev to use the engine? A per-game basis?

I thought that 25% was pretty fucking disgusting to be honest, though I'm sure many will disagree.
 
Yeah, no. And as far as I know, only Sony's first party has done any meaningful work "to the metal" this generation. And they released a lot of their work to share.
They had no choice. The PS3 is/was a development nightmare. Epic Games and 343 did awesome work on the 360 late in its life as well.

Anyways, the UE4 demo on PS4 may be unoptimized, but it also looks like complete ass compared to the PC one. This isn't knocking the PS4... there's just no way it could measure up, since it's not a fair comparison spec-wise. If what we saw were merely unoptimized, it would probably look as good as the PC version and just run a bit worse right now; there appear to have been major cutbacks, though. There's just no getting around a 680 versus a mid-range (probably lower-power) GPU. That's the limitation, and there's not much more to say about it.

For me, UE4 was considerably better looking than all of the PS4 demos, if we're talking about the 680-powered ones. Not that it's even going to matter, because it's pretty likely the new Xbox will be the target anyway, and it will almost certainly be a little below the PS4. You'll still get great-looking games, but consoles are going to hold things back again over the years.
 
...console gamers would expect, and doesn't quite tally when other developers are talking about PS4 out-powering most PCs for years to come. But it is important to put all of this into context. The DirectX 11 API is very mature while the PS4 tools and APIs are still in their initial stages of development - it's going to take time for devs to fully get to grips with the new hardware...

You heard it here first, guys: PC optimisation brings the PC ahead of consoles.

As usual from Digital Foundry, this entire article reads like a whole lot of pandering to the console-only crowd.
Can't wait to see them test all their games on a 7850 on PC four years into next gen to have a 'fair' comparison, while the rest of us are playing at 120Hz 1600p on GPUs 6-10x better.
 

Swifty

Member
I'm fairly certain that the UE4 engine will be used by even more devs next gen than UE3 was/is used for current gen.

It is the de facto third-party game engine in the industry. If it doesn't have global illumination enabled for performance reasons, more than 50% of games won't have it.

No one says it is impossible, but maybe it is just impractical.
There are other ways to get global illumination in games. And there's nothing stopping Epic from releasing new updates to the engine like they always have done.
 

KageMaru

Member
Yeah, no. And as far as I know, only Sony's first party has done any meaningful work "to the metal" this generation. And they released a lot of their work to share.

This is wrong, but I have a feeling it's pointless to have this discussion, so thanks for saving me the time looking those threads up for you. =D
 

Durante

Member
Of course it's how the hardware is designed. Again, it doesn't come out of thin air. If you design an ALU to perform, say, a multiply-add, do you think they just take a leap of faith and say "welp, it's designed that way, it should work, next"? Or do you think they go through an extensive testing regime to make sure this ALU can do these three flops within the clock at an acceptable rate?
If you think they have to go through a "testing regime" for the ALUs by "feeding MADDs" to it once they have an entire GPU on their hands to see if it can "do these three flops within the clock at an acceptable rate" then I really have nothing more to say to you.

Send him my way :). We could discuss thread-level parallelism and some of Windows' pitfalls in that area: scheduling deficiencies, execution delays, or overall thread management. There's a whole world to explore.
So can you point out any particular deficiencies regarding threads in Windows that lead to the massive overheads you describe and which are not easily circumvented at the application level in a game setting? Or does your knowledge only extend to throwing around buzzwords?
 

Coolwhip

Banned
Would developers be able to turn all these fancy effects on for the PC ports of UE4 games?

Quoting myself because I'm really curious: if someone makes a PS720 game with UE4, could the developer easily turn on all the fancy effects for PC, or doesn't that make sense?
 

RoboPlato

I'd be in the dick
Quoting myself because I'm really curious: if someone makes a PS720 game with UE4, could the developer easily turn on all the fancy effects for PC, or doesn't that make sense?

It seems like SVOGI is out of the engine completely as of right now. Other effects will still be available on PC.
 
Quoting myself because I'm really curious: if someone makes a PS720 game with UE4, could the developer easily turn on all the fancy effects for PC, or doesn't that make sense?

I don't think it's that easy with global illumination. You'd have to "rebalance" the lighting in scenes.
 

Durante

Member
Quoting myself because I'm really curious: if someone makes a PS720 game with UE4, could the developer easily turn on all the fancy effects for PC, or doesn't that make sense?
Not really, no. There's a significant difference in asset creation between using a fully realtime lighting model and one with prebaked lightmaps.
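
A toy sketch of why (hypothetical names, heavily simplified): with baked lighting the expensive bounce computation happens once, offline, into a lightmap that artists author around; a fully dynamic model pays that cost every frame but lets lights and geometry change freely, so scenes are built differently from the start:

#include <cstdio>
#include <vector>

struct Surface { float albedo; float incoming; };

// Stand-in for an expensive GI solve (hours per level when baking).
float gi_solve(const Surface& s) { return 0.8f * s.albedo * s.incoming; }

int main() {
    std::vector<Surface> level(4, {0.5f, 2.0f});

    // Baked pipeline: solve once at build time, ship the lightmap.
    std::vector<float> lightmap;
    for (const auto& s : level) lightmap.push_back(gi_solve(s));
    std::printf("baked lookup:  %.2f\n", lightmap[0]);  // cheap at runtime, but static

    // Dynamic pipeline (SVOGI-style): solve every frame, nothing prebaked,
    // so the scene's lighting can change at will.
    std::printf("dynamic solve: %.2f\n", gi_solve(level[0]));
}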
 

Momentary

Banned
It seems like SVOGI is out of the engine completely as of right now. Other effects will still be available on PC.

So does this mean we'll have to wait another console cycle to take advantage of technology that was developed years ago? Speaking of which... will this DirectX Blue be bringing ray tracing into the picture? If so, we'll still not be able to get any games that take advantage of it, since everything other than PC exclusives (if there will be such a thing) will be ports from consoles. Whatever. At least these console releases might light a fire under these hardware manufacturers' asses and have them stop pushing back release schedules for true next-gen tech.
 
Since Durango has been upgraded to 2.4 TFLOPS with dual APUs, I am guessing it will be able to run this Elemental demo at a significantly higher frame rate & fidelity compared to the PS4 version?
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
So does this mean we'll have to wait another console cycle to take advantage of technology that was developed years ago?

No. They might get it working on the PS4, Durango, and PCs in a few years.
 
Sounds good to me. I can't really tell the difference between the two unless I watch them back to back anyway. As usual the Digital Foundry guys hit it out of the park. I'm a little disappointed about SVOGI being cut since that sounds like something truly next gen. Unfortunately it looks like it'll be next next gen.
 
You have no idea what you're going on about, do you?

Where do you think they get the initial flop ratings for their ALUs? Out of thin air?

And Windows overhead is at the thread/process level. You are bordering on cluelessness.

Bwahaha.

You couldn't have picked a worse person to say that to.
 

Zaptruder

Banned
Not really, no. There's a significant difference in asset creation between using a fully realtime lighting model and one with prebaked lightmaps.

(To further expound on what Durante is saying.) It would also be difficult to design around the difference in night/day/weather cycles that dynamic GI and prebaked GI give. That is, you design the game around one or the other... Even if you could push a button to switch on dynamic GI, you'd have to consider things like mission structure (missions and levels are affected by visibility, which is affected by lighting), AI behaviour (how enemies respond in dark and light, and all the variables in between), etc.

It's... a significant shame that we can't rely on a standardization of lighting next gen. On the upside, PC-exclusive games will show off their superiority that much faster. Combined with next-gen VR, PC gaming is basically going to toss a grenade into the open mouth of console gaming... at least in terms of the level of immersion achieved.
 
Bwahaha.

You couldn't have picked a worse person to say that to.

Was thinking the same thing.

What's hilarious is that Triple U says Durante has done nothing to suggest he is knowledgeable, while Triple U wants the rest of us to believe him without presenting anything to suggest he knows more than a few buzzwords he read in the abstract of a CompSci paper.
 

Swifty

Member
You heard it here first, guys: PC optimisation brings the PC ahead of consoles.

As usual from Digital Foundry, this entire article reads like a whole lot of pandering to the console-only crowd.
Can't wait to see them test all their games on a 7850 on PC four years into next gen to have a 'fair' comparison, while the rest of us are playing at 120Hz 1600p on GPUs 6-10x better.
Really? As a developer, this didn't sound like pandering at all. The DirectX 11 spec has been solidified for quite a while now, while the PS4 graphics library is still in flux. SDKs get updates all the time: functions are added, removed, deprecated, and improved. SDK tools are always in development and gain new features to better help developers create more efficient engines. You may mock, but the reality is that this is actually quite true.
 
Really? As a developer, this didn't sound like pandering at all. The DirectX 11 spec has been solidified for quite a while now, while the PS4 graphics library is still in flux. SDKs get updates all the time: functions are added, removed, deprecated, and improved. SDK tools are always in development and gain new features to better help developers create more efficient engines. You may mock, but the reality is that this is actually quite true.

Yeah, I thought DF were very balanced in their evaluation.
 

Durante

Member
Was thinking the same thing.
What I really dislike about it is that discussing the relative "knowledge" of posters is completely meaningless as to the actual argument. But when someone doesn't really present any argument and only attacks a post by doubting your knowledge of the subject matter, it's hard to make a factual counterpoint.

Really? As a developer, this didn't sound like pandering at all. The DirectX 11 spec has been solidified for quite a while now, while the PS4 graphics library is still in flux. SDKs get updates all the time: functions are added, removed, deprecated, and improved. SDK tools are always in development and gain new features to better help developers create more efficient engines. You may mock, but the reality is that this is actually quite true.
Yeah, I also think that DF's evaluations are generally very fair. I can't agree with their choice of PC hardware for cross-platform evaluations, but that doesn't factor into the equation here.
 
What I really dislike about it is that discussing the relative "knowledge" of posters is completely meaningless as to the actual argument. But when someone doesn't really present any argument and only attacks a post by doubting your knowledge of the subject matter, it's hard to make a factual counterpoint.

Fully agree. One's arguments should stand on their own two feet; it's the basis of all of academia. I also agree that when his entire argument is an attempt to discredit your knowledge while providing no proof of his own, then we can bring in the relative knowledge of the posters.
 

Demon Ice

Banned
You have no idea what you're going on about, do you?

Where do you think they get the initial flop ratings for their ALUs? Out of thin air?

And Windows overhead is at the thread/process level. You are bordering on cluelessness.

Did you seriously just try to tell Durante he doesn't know what he's talking about....

Fucking lol.

I wonder how many of the console gamers in this thread actually know what SVOGI stands for.
 
Did you seriously just try to tell Durante he doesn't know what he's talking about....

Fucking lol.

I wonder how many of the console gamers in this thread actually know what SVOGI stands for.
To be honest, who cares? At the end of the day 8GB of GDDR5 RAM will ensure that PC doesn't catch up to consoles for years.
 

jaosobno

Member
Without consoles you still wouldn't have your true dynamic lighting in UE4 either.

Exactly. Most PCs today are not a Core i7 with a GTX 680, and SVOGI obviously isn't something you can toggle on/off in the game options (otherwise they would have left the feature enabled for high-end PCs and disabled it for low/medium-end configurations and consoles).

Blaming missing SVOGI on consoles makes little sense. Anyone, feel free to correct me if I'm wrong.
 
Still loving how certain PC extremists are blaming consoles... someone mentioned SVOGI was taking more than half the resources of a GTX 680; it isn't practical for the market this year. Medium and lower-end cards can't run it... why continue to blame the PS4? I mean, it isn't realistic for everyone to have a GTX 680 in their PC today... I am sure that two years down the line Epic will enable the feature for PC gamerz.

They HAVE to keep the minimum required specs in mind for PC games.

edit: look at how UE3 was upgraded over the years, especially in the lighting department.
 

Saty

Member
With that approach you're basically saying that no technological/gfx advancements would ever have been made/used.

If PC were the only thing around, then at some point someone would have made the jump. Just like how, for instance, Crysis 3 requires DX11 on PC.
 
Still loving how certain PC extremists are blaming consoles... someone mentioned SVOGI was taking more than half the resources of a GTX 680; it isn't practical for the market this year. Medium and lower-end cards can't run it... why continue to blame the PS4? I mean, it isn't realistic for everyone to have a GTX 680 in their PC today... I am sure that two years down the line Epic will enable the feature for PC gamerz.

They HAVE to keep the minimum required specs in mind for PC games.

edit: look at how UE3 was upgraded over the years, especially in the lighting department.

The real question is why remove a software feature that will eventually be possible with future hardware? The only reason is to simplify the toolchain to make it easier for code on one platform to work on all supported platforms.
 
To be honest, who cares? At the end of the day 8GB of GDDR5 RAM will ensure that PC doesn't catch up to consoles for years.


horse armor: living up to the name

Still loving how certain PC extremists are blaming consoles... someone mentioned SVOGI was taking more than half the resources of a GTX 680; it isn't practical for the market this year. Medium and lower-end cards can't run it... why continue to blame the PS4? I mean, it isn't realistic for everyone to have a GTX 680 in their PC today... I am sure that two years down the line Epic will enable the feature for PC gamerz.

They HAVE to keep the minimum required specs in mind for PC games.

edit: look at how UE3 was upgraded over the years, especially in the lighting department.

It was not taking up 50% of the resources of a GTX 680. Rather, compute functions (GPGPU, for all you 8GB GDDR5ers out there) were taking up 50% of resources. That includes those crazy particle effects showcased, among other things. General compute functions are actually used for a lot of the graphical "stuff" showcased in the demo.
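
For a flavour of what "general compute" means here, a toy particle-update step of the kind the demo dispatches as a GPU compute shader across huge particle counts; written as a plain CPU loop purely for illustration (all names hypothetical):

#include <cstdio>
#include <vector>

struct Particle { float x, y, z, vx, vy, vz; };

// One integration step; on the GPU this would run as one thread per particle.
void update(std::vector<Particle>& ps, float dt) {
    const float g = -9.8f;
    for (Particle& p : ps) {
        p.vy += g * dt;                                 // gravity
        p.x += p.vx * dt;                               // integrate position
        p.y += p.vy * dt;
        p.z += p.vz * dt;
        if (p.y < 0.0f) { p.y = 0.0f; p.vy *= -0.5f; }  // crude floor bounce
    }
}

int main() {
    std::vector<Particle> ps(1000, Particle{0.0f, 10.0f, 0.0f, 1.0f, 0.0f, 0.0f});
    for (int step = 0; step < 60; ++step) update(ps, 1.0f / 60.0f);  // simulate 1 second
    std::printf("p[0] after 1s: x=%.2f y=%.2f\n", ps[0].x, ps[0].y);
}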
 