
Digital Foundry: Unreal Engine 4 PS4 vs. PC

Triple U

Banned
I think it's been mentioned several times, but coding to the metal doesn't improve performance all that much. Most of the console efficiency comes with the fact that developers only have one set of specs to target and can optimise the rendering pipeline for that setup. Windows and API overheads are not nearly as significant, meaning coding to the metal doesn't make as big a difference.

That's patently and fundamentally false. If it's been mentioned so much, I'd love to see some of those posts to challenge. I'm gonna just go with: very few people here and elsewhere know exactly what "coding to the metal" is, and even fewer understand the implications on performance.
 
They should do Unreal Engine 4 PS4 vs Wii U. That'd be interesting.


Except it can run on the Wii U, and someone probably will make it run.

That's patently and fundamentally false. If it's been mentioned so much, I'd love to see some of those posts to challenge. I'm gonna just go with: very few people here and elsewhere know exactly what "coding to the metal" is, and even fewer understand the implications on performance.

Carmack himself clarified his 2x comment saying the gains come mainly from one spec iirc. It's also been shown by many (check Resource Monitor yourself for example) that the Windows and DX overheads are small and the coding to the metal that bypasses them does not result in amazing gains.
 

Ce-Lin

Member
I didn't call you ridiculous, I called the supposition that UE4 would crash when run on something other than a 680 and an i7 ridiculous.

hold on, I didn't say that and you know it, I said *that specific demo* not UNREAL ENGINE 4 as a whole

my words:

also that PC UE4 demo was carefully crafted to take advantage of the hardware it was going to run on as a show off (i7 + 680) I'm pretty sure it would crash the very instant you swapped that card with a 78XX GPU or a different CPU.

Why? It's not the first time that PC software crashes or shows buggy performance on some hardware while running flawlessly on something else; it's actually rather common. Here we go ignoring the basics of PC gaming. That demo was carefully made to start creating hype around UE4, so it's only logical that they took full advantage of that 680 GPU and i7 CPU.

Look at Tomb Raider, released not long ago: one hardware manufacturer didn't have proper access to the game code and the game was crashing on their GPUs until new drivers were provided... why are you ignoring the facts? It could very well be the case that the UE4 PC demo is tailored to that GTX 680 to the point that using some other hardware will crash it.

I made very clear three times that I was referring to that *specific demo*, not the Unreal Engine 4 as a whole, it's pretty obvious so please stop putting words I didn't say in my mouth, thanks.
 
Coding to the metal can do wonders; I think the best example of that is GTA III vs Jak 2 on PS2. I think the main programmer at Naughty Dog at the time wrote an entirely new programming language for the Jak series :O and his team took maximum advantage of the PS2, even the BC chip...
 

erick

Banned
Epic isn't exactly an authority on what will show up this gen.

I'm fairly certain that UE4 engine will be used by even more devs for next gen than UE3 was/is used for current gen.

It is the de facto 3rd party game engine in the industry. If it doesn't have Global Illumination enabled for performance reasons, more than 50% of the games won't have it.

No-one says it is impossible, but maybe it is just impractical.
 

Triple U

Banned
I'm fairly certain that UE4 engine will be used by even more devs for next gen than UE3 was/is used for current gen.

It is the de facto 3rd party game engine in the industry. If it doesn't have Global Illumination enabled for performance reasons, more than 50% of the games won't have it.

No-one says it is impossible, but maybe it is just impractical.

I'd argue that it will be less, unless there is a real boom of indie and mid-tier developers, seeing as most AAA studios that have used UE have announced custom solutions for next-gen.
 
Source? Genuinely curious there's so many varying statements on the matter.

Windows has a small footprint and it's easy to see. Open up Resource Monitor and CPU usage will be under 5%, RAM usage a mere 3GB or less (out of 16GB usually), and GPU usage at 0-1%.

For API overheads I'll have to search for the article I read on it.
 
I think it's been mentioned several times, but coding to the metal doesn't improve performance all that much. Most of the console efficiency comes with the fact that developers only have one set of specs to target and can optimise the rendering pipeline for that setup. Windows and API overheads are not nearly as significant, meaning coding to the metal doesn't make as big a difference.

Whilst developers have certainly bullshitted about the power of hardware ("Wii U is the best version", PS3 claims ad infinitum) more tech guys than John Carmack have stated that there's a pretty big boost from "coding to the metal" - such as Nvidia's Tim Lottes, who has no reason to lie about the capabilities of AMD-powered hardware. The issue is that "coding to the metal" is not some kind of magic fairy juice that breaks the laws of physics and overrides power/heat limitations and such. Not to mention that the raw grunt of consoles is not that amazing compared to a gaming PC. There's nothing wrong with extolling the benefits of console development, it's just taken to wild extremes.
 

LiquidMetal14

hide your water-based mammals
I'm not worried a bit since the HW is only in its infancy.

I am also not worried since I will be able to run these games with less compromise on PC.

But this seems a bit premature to be downplaying the PS4 iteration, given that final HW isn't even available yet and the platform hasn't matured.
 

Triple U

Banned
Windows has a small footprint and it's easy to see. Open up Resource Monitor and CPU usage will be under 5%, RAM usage a mere 3GB or less (out of 16GB usually), and GPU usage at 0-1%.

For API overheads I'll have to search for the article I read on it.

If that's your source then I'm lmao.
 

cheezcake

Member
Windows has a small footprint and it's easy to see. Open up Resource Monitor and CPU usage will be under 5%, RAM usage a mere 3GB or less (out of 16GB usually), and GPU usage at 0-1%.

For API overheads I'll have to search for the article I read on it.

Yeh, Windows is obvious; it's mainly the DirectX overhead I'm interested in. I remember reading an article a while ago saying that consoles are generally capable of far more draw calls than PCs due to DirectX overhead (or something along those lines).
 

KaiserBecks

Member
What kind of logic is this? You don't need insane resolutions and 8xAA and 120fps to be impressed by a game's visuals.

I don't need that in order to be impressed, but I need 1080p (which really isn't insane and hopefully will become a standard at last), 4xAA and 60fps in order to be satisfied. Praising a game's visuals on one hand while neglecting image quality on the other doesn't make any sense.


Do you know how many here on GAF were impressed by games like God of War 3 and Ascension, Uncharted 3, Killzone 3, Gears 3, Halo 4, Forza, etc?

How are these games doing anything that hasn't been done before?
 
Wasn't the removal of prebaked lighting supposedly one of the big cost savers for next gen development? Doesn't the fact that we are going straight back to pre-baked lighting mean an increase to what developers were expecting to pay for development this gen?
 

Durante

Member
Windows overhead for most purposes is effectively 0. This is easy to confirm by running a low-level benchmark and checking how much of the theoretical system performance you can use.

hold on, I didn't say that and you know it, I said *that specific demo* not UNREAL ENGINE 4 as a whole
The demo runs on UE4. I fail to see the importance of the distinction, unless your argument is that they somehow developed an independent, incompatible build just for the demo. I think that's unlikely.
 
These "pc extremists" understand what you don't: The high-end card of today is the mid-range card of tomorrow.

Burned

Really? Some of the biggest and best UE3 games were made by EA last gen and now they transitioned over to Frostbite 3.0...

Square Enix = Luminous
Capcom = Panta Rhei
Sony = ICE
Crytek = CryEngine 3 (lol)
EA = Frostbite 3.0
Activision = Developing their own engine ATM
Bethesda = Gamebryo lol
Microsoft = well, they will get an exclusive IP from Epic, one from Crytek, and probably work on in-house engines for Halo, Kinect games, Forza
Konami = Fox Engine
Valve = Source 2.0

who is left?

Indie developers.
 
Man, I did not expect it to be so much of a difference... :[

This kinda diminishes the thought I had that we would be playing amazing looking PS4 games running at 60 FPS.
 

Saberus

Member
From the article:

((we know for a fact that most studios only received final dev kits in the weeks beforehand))

This is incorrect... nobody had final dev kits yet; if they did, everyone would have known about the 8 gigs of RAM.
 

Triple U

Banned
Windows overhead for most purposes is effectively 0. This is easy to confirm by running a low-level benchmark and checking how much of the theoretical system performance you can use.
If you mean something like the Pi tests then no they're not that accurate.

Nvidia feeds the card nothing but straight MADDs to determine a theo max.
 

KaiserBecks

Member
Kaiserbecks: Well, that is a funny thing to say. But for a start, GoW3 used MLAA extremely efficiently.

It did a lot of cool things and I'd consider it to be the greatest looking PS3 title (haven't seen the new one yet though). But isn't that what you'd expect from something that is essentially a PS2 game with a new engine?
 
It did a lot of cool things and I'd consider it to be the greatest looking PS3 title (haven't seen the new one yet though). But isn't that what you'd expect from something that is essentially a PS2 game with a new engine?

A PS2 game???
The scale and animation in GOW3 could not be done on PS2.
 

Durante

Member
Nvidia feeds the card nothing but straight MADDs to determine a theo max.
Nvidia doesn't "feed" the card anything to determine the theoretical maximum, that's why it's called theoretical. It's determined by simple arithmetic.

The fact that you can write a CUDA or OpenCL dense matrix multiply program that reaches close to these theoretical maxima illustrates that Windows and API overheads are not nearly as significant as some people assume.
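To make that concrete, here's a minimal sketch of both halves of the argument (my example, not Durante's actual test; it assumes a GTX 680 at reference clocks and needs the CUDA toolkit with cuBLAS): the "simple arithmetic" for the theoretical peak, and a timed dense SGEMM launched from ordinary Windows user space to see what fraction of that peak you actually reach.

```cpp
#include <cublas_v2.h>
#include <cuda_runtime.h>
#include <cstdio>
#include <vector>

int main() {
    // "Simple arithmetic" for a GTX 680 (reference clocks assumed):
    // 1536 CUDA cores x 2 FLOPs per clock (multiply-add) x ~1.006 GHz ~= 3090 GFLOPS.
    const double theoreticalGflops = 1536 * 2 * 1.006;

    const int N = 4096;                    // one large FP32 matrix multiply
    const double flops = 2.0 * N * N * N;  // multiplies + adds in C = A * B

    std::vector<float> host(N * N, 1.0f);
    float *A, *B, *C;
    cudaMalloc(&A, N * N * sizeof(float));
    cudaMalloc(&B, N * N * sizeof(float));
    cudaMalloc(&C, N * N * sizeof(float));
    cudaMemcpy(A, host.data(), N * N * sizeof(float), cudaMemcpyHostToDevice);
    cudaMemcpy(B, host.data(), N * N * sizeof(float), cudaMemcpyHostToDevice);

    cublasHandle_t handle;
    cublasCreate(&handle);
    const float alpha = 1.0f, beta = 0.0f;

    // Warm-up call so the timed run isn't paying one-time initialisation costs.
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, N, N, N, &alpha, A, N, B, N, &beta, C, N);

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);
    cudaEventRecord(start);
    cublasSgemm(handle, CUBLAS_OP_N, CUBLAS_OP_N, N, N, N, &alpha, A, N, B, N, &beta, C, N);
    cudaEventRecord(stop);
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);
    const double achievedGflops = flops / (ms * 1e6);
    printf("theoretical: %.0f GFLOPS, achieved: %.0f GFLOPS (%.0f%%)\n",
           theoreticalGflops, achievedGflops, 100.0 * achievedGflops / theoreticalGflops);

    cublasDestroy(handle);
    cudaFree(A); cudaFree(B); cudaFree(C);
    return 0;
}
```

Whatever percentage that prints is governed by the GPU and the BLAS kernel, not by Windows scheduling or the API layer sitting in front of it, which is the point being made above.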
 

Stallion Free

Cock Encumbered
Wasn't the removal of prebaked lighting supposedly one of the big cost savers for next gen development? Doesn't the fact that we are going straight back to pre-baked lighting mean an increase to what developers were expecting to pay for development this gen?

Being able to see lighting changes immediately in the editor is/was going to be a massive time saver for devs, yes. Baking light in UE3 can take quite a while depending on the level and lighting complexity. Epic has not said if the loss of real-time GI affects the dev tools (requiring re-baking every time you want to see a change) or if levels will just bake the lighting at the end, with the editor allowing for real-time GI during the level building process.
 

jaosobno

Member
Losing real time GI is such a blow.

I was looking forward to it.

It's not like UE4 is the only thing happening on PS4. Doesn't Enlighten Engine support dynamic radiosity on PS4?

PlayStation®4 promises a dramatic increase in memory and compute resource, which will free developers to unleash the full creative power of dynamic lighting and to finally extend cinematographic film practices to dynamic immersive worlds.

Sure it also offers the option to go for prebaked lightmaps (or a combo), but I definitely believe we will see some kind of realtime GI on PS4 (most likely in a form of prebaked/dynamic-in-certain-areas combo).
 

jWILL253

Banned
A few things:

1. I've already seen a couple posts in here compare and correlate the PS4 to the WiiU. Stop it. Right now.

2. It seems that Epic isn't all that efficient when working on consoles, especially Sony consoles. At this point, I'm not even checking for Epic outside of the Xbox/PC space.

3. As for the demo itself, Capcom's new engine looks a whole lot better than UE4, as does the Luminous Engine & Frostbite 3. As one poster brought up, there are many devs that are creating their own engines from scratch, and they all seem to be making better strides in terms of graphics than Epic is.

4. I always see the same pro-PC posters in every PS4 thread trying to piss in people's Wheaties. You all can try to act like you guys have no bias or that you're just being logical, but nah... we get it. PC will always have an edge over closed boxes. However, that's completely irrelevant due to the varying range of hardware setups. And the fact of the matter is, most games are going to be console-centric, then scaled up for PC. So, very few games are gonna take advantage of a high-end PC setup. I wish more people would see that, but everyone is too busy waving their dicks in a circle to acknowledge it.

Just so you guys know where my bias is: I'm a pro-PlayStation guy. I grew up on Sony PlayStation. That said, I want ALL platforms to succeed. All this picking sides over what is ultimately irrelevant in the grand scheme of things makes it very hard to appreciate this hobby of mine. Kinda difficult to do that when everybody is telling you that what you like SUCKS...
 
Being able to see lighting changes immediately in the editor is/was going to be a massive time saver for devs, yes. Baking light in UE3 can take quite a while depending on the level and lighting complexity. Epic has not said if the loss of real-time GI affects the dev tools (requiring re-baking every time you want to see a change) or if levels will just bake the lighting at the end, with the editor allowing for real-time GI during the level building process.

Hmm hmm I see. Thanks for clearing that up. So at this point it's a big "maybe" til we know how the tools actually work.
 

Apenheul

Member
That's patently and fundamentally false. If it's been mentioned so much, I'd love to see some of those posts to challenge. I'm gonna just go with: very few people here and elsewhere know exactly what "coding to the metal" is, and even fewer understand the implications on performance.

So what do you think coding to the metal is? Hardly anyone really codes to the metal anymore, especially in game development since it's actively discouraged when using 1st party libraries.
 
Not hate, just cynicism. No one likes being told that 1.8 Tflop hardware is as good as/better than hardware that produces 3.5+ Tflops unless they are so emotionally invested in seeing the PS4 succeed that they'll just nod blithely.

The marketing men are saying nice things to make you buy it because their business model relies on you buying it.

Lol, fantastic way to put that in perspective! What you are saying is that any talk to the contrary is blind delusion. I am not going to post the Carmack post that has been posted to death already, but people need to understand some very simple things.

There will be a point in time where, at a certain resolution (let's say 1080p) and after optimization, there is going to be a real good chance that the PS4 will match something that the 680 can produce. The difference will come in terms of scaling, which the PS4 most likely will not be able to do because of the pure power of the 680. But when devs are working on a fixed system with a targeted goal in mind, it is not far-fetched to believe that it can produce a visual result similar to that of a more powerful PC card, because there are simply a lot of different factors that come into play before the final output is generated.

It has never been black and white, and any glance at demos and games at the start of a generation versus what they achieve mid to end of a generation on consoles should prove again and again that judging the power of a console at the start of its life, much less before it is released, is just silly.
 

KaiserBecks

Member
A PS2 game???
The scale and animation in GOW3 could not be done on PS2.

Hence "PS2 game with a new engine". Unless I've been doing something severely wrong, God of War 3 didn't play noticeably different than God of War 2. I expect new experiences from a new generation. More violence and bigger giants or "scale" if you really want to call it that, isn't going to cut it.
 

Lord Error

Insane For Sony
As far as I know, their realtime GI was using up more than half the computational resources on a GTX 680, which means that it would be unreasonable to use it even if your hardware target was a GTX 680. You'd never be able to build a proper impressive game with it if the lighting alone takes up so much.

No surprise they removed this from their next GTX 680 demo, and made it a hell of a lot more impressive looking at the same time, because they could use the power for everything else now that lighting alone wasn't taking up such a huge chunk of resources.

I'm really curious if PS4 could pull that demo off 1:1 in 720p, or dynamic 1080p. With the level of antialiasing they had in it, I'd gladly take that over severely compromised fixed 1080p.
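For rough scale on that "more than half" figure, here's a back-of-the-envelope sketch (my numbers, using the commonly quoted single-precision peaks, which are of course only one axis of GPU performance):

```cpp
#include <cstdio>

int main() {
    // Commonly quoted FP32 peaks (assumptions, not measurements):
    const double gtx680Tflops = 3.09;  // 1536 cores x 2 FLOPs x ~1.006 GHz
    const double ps4Tflops    = 1.84;  // 1152 shaders x 2 FLOPs x 0.8 GHz
    // If SVOGI really eats "more than half" of a 680, that lower bound alone
    // is most of the PS4 GPU's entire budget before anything else is drawn.
    const double giCostTflops = gtx680Tflops / 2.0;
    printf("GI lower bound: %.2f TFLOPS = %.0f%% of the PS4 GPU\n",
           giCostTflops, 100.0 * giCostTflops / ps4Tflops);  // roughly 84%
    return 0;
}
```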
 

nib95

Banned
My guess is that on final hardware (not the alpha kits) and a bit of time, Epic would be able to replicate the PC version of the demo. But that's just a hunch. Launch stuff on consoles always gets obliterated later down the line when devs get to grips with the hardware tools.
 

Triple U

Banned
Nvidia doesn't "feed" the card anything to determine the theoretical maximum, that's why it's called theoretical. It's determined by simple arithmetic.

The fact that you can write a CUDA or OpenCL dense matrix multiply program that reaches close to these theoretical maxima illustrates that Windows and API overheads are not nearly as significant as some people assume.
You have no idea what you're going on about, do you?

Where do you think they get the initial FLOP ratings for their ALUs? Out of thin air?

And Windows overhead is at the thread/process level. You are bordering on cluelessness.
 

Stallion Free

Cock Encumbered
As far as I know, their realtime GI was using up more than half the computational resources on a GTX 680, which means that it would be unreasonable to use it even if your hardware target was a GTX 680. You'd never be able to build a proper impressive game with it if the lighting alone takes up so much.

No surprise they removed this from their next GTX 680 demo, and made it a hell of a lot more impressive looking at the same time, because they could use the power for everything else now that lighting alone wasn't taking up such a huge chunk of resources.

The 680 is going to be old by the time UE4 is in a single released game. If the rumors of Nvidia's next chip line end up being true, is the GI going to make anywhere near the same size dent?
 
My guess is that on final hardware (not the alpha kits) and a bit of time, Epic would be able to replicate the PC version of the demo. But that's just a hunch. Launch stuff on consoles always gets obliterated later down the line when devs get to grips with the hardware tools.

I'm pretty sure the final kits will be able to reach the same visual fidelity. But I heard somewhere that the real 'final' kits are coming only in Q3...
 

ElfArmy177

Member
People that care about tech and what the new consoles can do?

except that the "demo" looks like shit regardless of PC or consoles... In relation to cryengine, frostbite, and more recent demos from unreal...

wtf is so special about the elemental demo?? Bland bullshit is all it is
 

iavi

Member
The 680 is going to be old by the time UE4 is in a single released game. If the rumors of Nvidia's next chip line end up being true, is the GI going to make anywhere near the same size dent?

That's the thing; by time these consoles actually release the 680/7970 will be the lower high-end/mid-range in comparison to the next wave from AMD/Nvidia. The performance gap's already there, it's going to be even larger by release.
 

Stallion Free

Cock Encumbered
except that the "demo" looks like shit regardless of PC or consoles... In relation to cryengine, frostbite, and more recent demos from unreal...

wtf is so special about the elemental demo?? Bland bullshit is all it is

You seem to be talking about art, not engine features.
 
As far as I know, their realtime GI was using up more than half the computational resources on a GTX 680, which means that it would be unreasonable to use it even if your hardware target was a GTX 680. You'd never be able to build a proper impressive game with it if the lighting alone takes up so much.

No surprise they removed this from their next GTX 680 demo, and made it a hell of a lot more impressive looking at the same time, because they could use the power for everything else now that lighting alone wasn't taking up such a huge chunk of resources.

If that was the case, Epic would have to be crazy to use that method unless they thought the next gen systems would have a better GPU.
If that is what they were thinking, I want what they are smoking.
 
Are they going to release a PC demonstration without real time global illumination?

It's been removed from UE4 entirely, so that includes PC. This makes it a very, very, very unfair comparison. The PC demo is a sham at the moment because even the PC will not be getting the demo that was shown for PC.

This is why I hate the UE4 comparison. It's apples to oranges because nobody is getting the original demo version shown on PC... not even the PC.
 
Wasn't the removal of prebaked lighting supposedly one of the big cost savers for next gen development? Doesn't the fact that we are going straight back to pre-baked lighting mean an increase to what developers were expecting to pay for development this gen?
UE4 removed global illumination for all levels of the engine, including PC. This doesn't mean squat for anything outside of UE4. There are other lighting engines out there and other methods to light up a game. See the Fox Engine with its realistic lighting.

UE4 != features the console is capable of

Other engines exist and there are other ways to get realistic lighting without baking
 

USC-fan

Banned
I think it's been mentioned several times, but coding to the metal doesn't improve performance all that much. Most of the console efficiency comes with the fact that developers only have one set of specs to target and can optimise the rendering pipeline for that setup. Windows and API overheads are not nearly as significant, meaning coding to the metal doesn't make as big a difference.

Well, really you don't "code to the metal"; it's just a much lower-level API on consoles. Coding on a low-level API and for one spec together really makes all the difference.

'It's funny,' says AMD's worldwide developer relations manager of its GPU division, Richard Huddy. 'We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way.' Huddy says that one of the most common requests he gets from game developers is: 'Make the API go away.'

So what sort of performance-overhead are we talking about here? Is DirectX really that big a barrier to high-speed PC gaming? This, of course, depends on the nature of the game you're developing.

'It can vary from almost nothing at all to a huge overhead,' says Huddy. 'If you're just rendering a screen full of pixels which are not terribly complicated, then typically a PC will do just as good a job as a console. These days we have so much horsepower on PCs that on high-resolutions you see some pretty extraordinary-looking PC games, but one of the things that you don't see in PC gaming inside the software architecture is the kind of stuff that we see on consoles all the time.

On consoles, you can draw maybe 10,000 or 20,000 chunks of geometry in a frame, and you can do that at 30-60fps. On a PC, you can't typically draw more than 2-3,000 without getting into trouble with performance, and that's quite surprising - the PC can actually show you only a tenth of the performance if you need a separate batch for each draw call.

But it's still very hard to throw tremendous variety into a PC game. If you want each of your draw calls to be a bit different, then you can't get over about 2-3,000 draw calls typically - and certainly a maximum amount of 5,000. Games developers definitely have a need for that. Console games often use 10-20,000 draw calls per frame, and that's an easier way to let the artist's vision shine through.'

'Wrapping it up in a software layer gives you safety and security,' says Huddy, 'but it unfortunately tends to rob you of quite a lot of the performance, and most importantly it robs you of the opportunity to innovate.'
http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/ <-really the whole article is a good read
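The batching pattern Huddy is describing can be sketched roughly like this (my illustration, not code from the article; it assumes an already initialised D3D11 device context, index buffer and per-object constant buffer):

```cpp
#include <d3d11.h>
#include <DirectXMath.h>

struct PerObjectData { DirectX::XMFLOAT4X4 world; };

// One API call per object: every call pays the validation/driver overhead Huddy
// describes, which is why a PC frame bogs down in the low thousands of unique draws.
void DrawOneByOne(ID3D11DeviceContext* ctx, ID3D11Buffer* perObjectCB,
                  const PerObjectData* objects, UINT objectCount, UINT indexCount) {
    for (UINT i = 0; i < objectCount; ++i) {
        ctx->UpdateSubresource(perObjectCB, 0, nullptr, &objects[i], 0, 0);
        ctx->DrawIndexed(indexCount, 0, 0);
    }
}

// The usual PC workaround: put per-object data in an instance buffer and submit once.
// It works, but only when the objects are near-identical, which is exactly the loss
// of per-draw variety Huddy is talking about.
void DrawAllAtOnce(ID3D11DeviceContext* ctx, UINT indexCount, UINT objectCount) {
    ctx->DrawIndexedInstanced(indexCount, objectCount, 0, 0, 0);
}
```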

But John Carmack is really the one who has been talking for years about how bad the APIs on PC are...

Where on the consoles we just say "we are going to update this one pixel here," we just store it there as a pointer. On the PC it has to go through the massive texture update routine, and it takes tens of thousands of times [longer] if you just want to update one little piece. You start to amortize that overhead when you start to update larger blocks of textures, and AMD actually went and implemented a multi-texture update specifically for id Tech 5 so you can batch up and eliminate some of the overhead by saying "I need to update these 50 small things here," but still it's very inefficient.

I don't worry about the GPU hardware at all. I worry about the drivers a lot because there is a huge difference between what the hardware can do and what we can actually get out of it if we have to control it at a fine grain level. That's really been driven home by this past project by working at a very low level of the hardware on consoles and comparing that to these PCs that are true orders of magnitude more powerful than the PS3 or something, but struggle in many cases to keep up the same minimum latency. They have tons of bandwidth, they can render at many more multi-samples, multiple megapixels per screen, but to be able to go through the cycle and get feedback... "fence here, update this here, and draw them there..." it struggles to get that done in 16ms, and that is frustrating.

Back in the DOOM era of game development, high-speed graphics were coded at the register level (basically in machine language). Then, as development moved from DOS to Windows and code started being done through APIs, game development became more abstract and higher-level. Carmack remembers a frustrating time when there were literally 20 different graphics chips that game makers had to code for, but nowadays there's only two to worry about. While coding through these layers is sometimes easier, it can also be frustrating. Carmack often finds himself wondering if a mistake was "my fault, the drivers fault, or the hardware's fault?"

But the Xbox 360 was designed to have a very thin API layer. In Carmack's words, he can "basically talk directly to the hardware ... doing exactly what I want."

http://www.pcper.com/reviews/Editor...-Graphics-Ray-Tracing-Voxels-and-more/Intervi
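Carmack's "update this one pixel" example corresponds to something like this on the PC side (a minimal sketch of mine, not his code; it assumes an existing OpenGL context and an already created texture):

```cpp
#include <windows.h>   // needed before GL/gl.h on Windows
#include <GL/gl.h>
#include <cstdint>

// Update a single 1x1 region of an existing GL_TEXTURE_2D.
void UpdateOnePixel(GLuint texture, int x, int y, uint32_t rgba) {
    glBindTexture(GL_TEXTURE_2D, texture);
    // The driver must validate state, possibly wait for the GPU to finish using the
    // texture, and convert/copy the data -- versus a console, where you could write
    // through a pointer straight into the texture's memory.
    glTexSubImage2D(GL_TEXTURE_2D, 0, x, y, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, &rgba);
}
```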

Just think about how in PC gaming just a driver update can give you a 30% performance improvement in a game.

The 680 is more than double the performance of the GPU in the PS4. Epic's engine is also DX-based while the PS4 API is not. I wouldn't be shocked to see this demo run better on X720 hardware just based on that, until Epic has more time to port the engine over. Like Epic said, their engines aren't ready until they release a game on that platform.
 
Why is the PS4 getting all the blame for the removal of SVOGI and the overall degradation? I'm sure the rumored 1.2TF GPU in the Durango is going to match 1-to-1 with the PC demo.
 
It's absurd that PC gamers are held up as the bastion of rationality. Those that self-identify as such are just as susceptible to human cognitive biases as any other random group.

Sweeping, extreme generalizations are always the wrong way to go. That said, it stands to reason to assume that the average PC gamer knows more about technology than the average console gamer. The nature of the PC platform is such that it requires a certain amount of know-how.

So, saying that PC gamers are the bastion of rationality is both wrong in a sense and right in a sense. Wrong because such a diverse group cannot be 'summarized' by a single sentence, right because a random PC gamer is statistically more likely to have a better grasp of gaming technology.
 