
Digital Foundry: Unreal Engine 4 PS4 vs. PC

I think the post you quoted is quite silly, with an obscure point, but what is salty about it?

He's salty that post-Durante the PC has by far and away the best version of the game. He's jealous that the 20fps and 720p native resolution of his version cannot be fixed and is a far inferior version of the game. There are no Durantes on the consoles.
 
Tell me a big budget pc exclusive that isn't a mmo or ftp....

Studios like From Software don't even bother to put any effort into their PC ports and expect the people who buy their games to fix them. Such a great market.

Hope you are lucky, so you get to fix Dark Souls 2 at least.....
just recompile your game

Funny, because even in its truly unacceptable vanilla form it's still a better version than the one on consoles.
 
"Must have been a rushed port...", "most likely an early Devkit", "Unreal engine is not that important, Killzone looks better"...
People who had a utopian impression of their beloved future hardware seem to prefer complete denial over recognizing this for what it actually is.

I don't know why you people would expect the PS4 to automatically destroy a pricey high-end (yes, I consider my 680 to be high end; Titan is for enthusiasts who stopped caring about quality/price ratios) GPU, but the fact that it performs close to, or even on par with, a 680 is very, very impressive. This shouldn't be about who outperforms whom; the pros and cons on both sides won't change. We should be glad that this kind of quality will be the future standard.
Consoles will build up from there and PCs will benefit from a much higher lowest common denominator. When I saw the first videos of Battlefield 3, I was praying that the future console generation would give developers a GPU comparable to a GTX 580.
Now they're getting something that comes close to a GTX 680. Cheer the fuck up.

Close to a 660 (not a 660 Ti), you mean.
 

Swifty

Member
This was discussed previously in the thread. It seems very unlikely that any single game would be developed to support both; they are too different in principle.
I agree, unlikely, but I wouldn't say it's impossible. A modern game these days will be using artist-placed light probes if it wants global illumination. It's not entirely difficult for an engine to choose not to use the light probe data but to use a dynamic global illumination method instead, if it's capable of one. It really does depend on how the environment lighting has been set up by the artists.
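To make that concrete, here is a minimal C++ sketch of the kind of fallback decision being described; the types, names and thresholds are invented for illustration and not taken from any real engine:

```cpp
// Minimal sketch: pick dynamic GI when the platform can afford it, otherwise
// fall back to whatever light probes the artists baked into the level data.
#include <vector>

struct LightProbe { float position[3]; float sh_coeffs[27]; }; // 3rd-order SH, RGB

struct LevelLighting {
    std::vector<LightProbe> baked_probes;   // authored/baked offline
    bool has_baked_probes() const { return !baked_probes.empty(); }
};

enum class GIMode { DynamicVoxelGI, BakedProbes, None };

GIMode choose_gi_mode(const LevelLighting& level, bool hw_supports_dynamic_gi,
                      float gpu_budget_ms, float dynamic_gi_cost_ms) {
    // Prefer dynamic GI only if the hardware supports it and it fits the frame budget.
    if (hw_supports_dynamic_gi && dynamic_gi_cost_ms <= gpu_budget_ms)
        return GIMode::DynamicVoxelGI;
    // Otherwise use whatever the artists baked; if nothing was baked, no GI at all.
    return level.has_baked_probes() ? GIMode::BakedProbes : GIMode::None;
}

int main() {
    LevelLighting level;
    level.baked_probes.push_back({});       // pretend the artists placed one probe
    GIMode mode = choose_gi_mode(level, /*hw_supports_dynamic_gi=*/false,
                                 /*gpu_budget_ms=*/4.0f, /*dynamic_gi_cost_ms=*/6.0f);
    return mode == GIMode::BakedProbes ? 0 : 1;
}
```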
 

Durante

Member
I agree, unlikely, but I wouldn't say it's impossible. A modern game these days will be using artist-placed light probes if it wants global illumination. It's not entirely difficult for an engine to choose not to use the light probe data but to use a dynamic global illumination method instead, if it's capable of one. It really does depend on how the environment lighting has been set up by the artists.
Yes, it's absolutely possible, but sadly I don't think many multiplatform developers would bother.

These are just a few of my gripes, but maybe you can glimpse the point. I hope.
What I can glimpse is that you seem to have studied operating systems 101. Good. What I don't see is how the type of scheduling employed at the OS level makes a significant difference in terms of overhead for a game. It only truly enters the equation when there is an external load in addition to the game present on the system (multitasking, in other words), and multitasking is obviously never free, regardless of which OS you are using. In a console capable of multitasking, you'll be running an OS-level scheduler as well, and if you want all your OS functions to be responsive and safe it will have a user-level and a kernel-level, and it will have multiple different per-thread and per-process priority settings. I fail to see the significant difference vis-a-vis a modern PC OS.

Him going on about how "he could code some, probably singled threaded, Cuda program and get close to theo max performance" didn't inspire too much confidence.
That's not what I said. I said that the fact that it is possible to achieve performance near the theoretical limit on graphics hardware using CUDA and OpenCL illustrates that there is no significant OS overhead in that setting. I never mentioned "single-threaded" with so much as a word (and it would actually need quite a bit of text simply to specify what one even means when saying "single-threaded" on a GPU).
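For what it's worth, one common way to quantify "near the theoretical limit" is simply to time a bandwidth-bound kernel and divide by the hardware's theoretical peak. A minimal C++ sketch of that arithmetic, done on the CPU purely for brevity (the peak figure is a placeholder you would replace with your own hardware's number; the same calculation applies unchanged to a CUDA or OpenCL kernel):

```cpp
// Time a bandwidth-bound loop and report achieved throughput as a fraction of
// a (placeholder) theoretical peak. Illustration of the methodology only.
#include <chrono>
#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    const std::size_t n = 1 << 24;                 // ~16.8M floats per array
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

    auto t0 = std::chrono::steady_clock::now();
    for (std::size_t i = 0; i < n; ++i)
        c[i] = a[i] + 3.0f * b[i];                 // STREAM-style triad
    auto t1 = std::chrono::steady_clock::now();

    double secs = std::chrono::duration<double>(t1 - t0).count();
    double bytes_moved = 3.0 * n * sizeof(float);  // 2 reads + 1 write per element
    double achieved_gbs = bytes_moved / secs / 1e9;

    const double theoretical_peak_gbs = 25.6;      // placeholder: your hardware's peak
    std::printf("checksum %.1f, achieved %.2f GB/s (%.0f%% of %.1f GB/s peak)\n",
                c[n / 2], achieved_gbs,
                100.0 * achieved_gbs / theoretical_peak_gbs, theoretical_peak_gbs);
}
```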
 

KageMaru

Member
LOL wtf is this. If you have a retort, post it. You're not doing me any favors pal. Especially not by dangling supposed info in my face.

This is me realizing you already have your mind made up. I never considered it doing you a favor; I was just willing to look up some info for a fellow poster and didn't think anything of it. I'm also not dangling anything in your face; it was a legit thanks for saving me the time, because I really did plan to dig up any info I could find. I already told you where you can search for the info, and it's not like I'm pretending to hold some top-secret info, so I really don't see it as dangling anything.
 

benny_a

extra source of jiggaflops
He's salty that post-Durante the PC has by far and away the best version of the game. He's jealous that the 20fps and 720p native resolution of his version cannot be fixed and is a far inferior version of the game. There are no Durantes on the consoles.
I didn't know there was a history with Dark Souls.

Funny, because even in its truly unacceptable vanilla form it's still a better version than the one on consoles.
I don't know about Vanilla. I've read about hacking being an issue on PC.

In a console capable of multitasking, you'll be running an OS-level scheduler as well, and if you want all your OS functions to be responsive and safe it will have a user-level and a kernel-level, and it will have multiple different per-thread and per-process priority settings. I fail to see the significant difference vis-a-vis a modern PC OS.
I've had big performance variance on my computers depending on which schedulers I've used on my OS. Why wouldn't that make an impact if you select one that is optimized for certain use cases, as is the case in a console environment?
 

Arucardo

Member
He's salty that post-Durante the PC has by far and away the best version of the game. He's jealous that the 20fps and 720p native resolution of his version cannot be fixed and is a far inferior version of the game. There are no Durantes on the consoles.

Didn't Durante himself add postprocessing AA to Nier? :p (albeit through using a capture card) or am I mistaken?
 

Durante

Member
Didn't Durante himself add postprocessing AA to Nier? :p (albeit through using a capture card) or am I mistaken?
That happened. Though I'm not actually adding post-AA to any console game; Nier was just a common example I used for demonstration.

Honestly, I'd really appreciate if the thread was less about my person.
 
That happened. Though I'm not actually adding post-AA to any console game; Nier was just a common example I used for demonstration.

Honestly, I'd really appreciate if the thread was less about my person.

Which company are you working with atm? Do you have a master's in computer science? I am also going for computer science this fall (hopefully).
 

Durante

Member
I've had big performance variance on my computers depending on which schedulers I've used on my OS. Why wouldn't that make an impact if you select one that is optimized for certain use cases, as is the case in a console environment?
What schedulers, OS and use case are we talking about here? Obviously you can construct a use case where it makes a huge difference (e.g. running a web server or something like that which creates lots of OS-level threads; there you'd get more throughput with a throughput-oriented scheduler than with the default schedulers desktop OSes use, which optimize for responsiveness). I just don't believe that games are one of those use cases where the difference is large.

Game engines generally create one software thread per hardware thread and do their own user-level work scheduling and load balancing on those threads. If you do that on e.g. Windows, and lock the affinity of each of those threads, you'll only get significantly interrupted by the OS scheduler when there is a large external load present -- and on a console, if there were such an external load, it would also need to be processed.
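A minimal C++ sketch of that pattern, with the job and queue types invented for illustration (a real engine's job system is far more elaborate); the affinity call shown is the Windows one mentioned above:

```cpp
// One worker per hardware thread, each pinned to a core, pulling work items
// from a shared queue instead of letting the OS juggle many short-lived threads.
#include <atomic>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>
#ifdef _WIN32
#include <windows.h>
#endif

struct WorkQueue {
    std::mutex m;
    std::queue<std::function<void()>> jobs;
    bool pop(std::function<void()>& job) {
        std::lock_guard<std::mutex> lock(m);
        if (jobs.empty()) return false;
        job = std::move(jobs.front());
        jobs.pop();
        return true;
    }
};

void worker(WorkQueue& q, unsigned core, std::atomic<bool>& running) {
#ifdef _WIN32
    // Lock this software thread to one hardware thread, as described above.
    SetThreadAffinityMask(GetCurrentThread(), DWORD_PTR(1) << core);
#else
    (void)core; // pinning is platform-specific (e.g. pthread_setaffinity_np on Linux)
#endif
    std::function<void()> job;
    while (running) {
        if (q.pop(job)) job();
        else std::this_thread::yield();
    }
}

int main() {
    unsigned n = std::thread::hardware_concurrency();
    WorkQueue queue;
    std::atomic<bool> running{true};
    std::vector<std::thread> pool;
    for (unsigned core = 0; core < n; ++core)
        pool.emplace_back(worker, std::ref(queue), core, std::ref(running));
    // ... the engine would submit per-frame jobs into `queue` here ...
    running = false;
    for (auto& t : pool) t.join();
}
```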
 

v1oz

Member
I think it's been mentioned several times, but coding to the metal doesn't improve performance all that much. Most of the console efficiency comes from the fact that developers only have one set of specs to target and can optimise the rendering pipeline for that setup. Windows and API overheads are not nearly as significant, meaning coding to the metal doesn't make as big a difference.
Really? Then why do some games still have some low-level optimizations for things that require more speed?

Really, the decision to go low level depends on the problem; in certain cases you may get a 10-20% speed-up. Say you look at your profiler and find you have a bottleneck, and you've already minimised all the overheads of the memory subsystem. Then you try a SIMD array-data implementation of your algorithm, but SIMD just isn't a good match for it. At that point intrinsics and assembler-level coding may be needed to get better performance.
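As a concrete illustration of that last step, here is the same toy inner loop written as plain scalar code and with SSE intrinsics. The example is invented and deliberately trivial, but it is the kind of rewrite being described once the profiler has already pointed at this loop:

```cpp
// Scalar baseline and a hand-vectorised SSE version of the same loop:
// y[i] += a * x[i]
#include <immintrin.h>
#include <cstddef>

void axpy_scalar(float a, const float* x, float* y, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        y[i] += a * x[i];
}

// Processes 4 floats per iteration. For brevity this assumes n % 4 == 0 and
// 16-byte-aligned pointers; a real version also handles the tail and alignment.
void axpy_sse(float a, const float* x, float* y, std::size_t n) {
    const __m128 va = _mm_set1_ps(a);
    for (std::size_t i = 0; i < n; i += 4) {
        __m128 vx = _mm_load_ps(x + i);
        __m128 vy = _mm_load_ps(y + i);
        vy = _mm_add_ps(vy, _mm_mul_ps(va, vx));
        _mm_store_ps(y + i, vy);
    }
}
```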
 

Stallion Free

Cock Encumbered
I did.
They told me 25%

You seem to suggest you know otherwise, so I asked you to share that.

Do you know otherwise? Or was that just...a guess?
You don't seem to understand it, apparently. Their indie licensing setup is fantastically low risk. Indies get charged zero royalties and essentially have a free engine if they gross less than 50K. Once they gross more than that, they pay a 25% royalty. If an indie is fortunate enough to gross 2 million, then they are paying Epic 500K, which is still significantly less than a publisher license. UE3 is well documented, has a ton of advanced features beyond what the average indie start-up could possibly code, and costs ZERO dollars up front. It's not a bad choice, and a lot of indies are making it.
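A quick sanity check of that arithmetic, under the terms as described in the post; whether the first 50K itself stays royalty-free once you cross the threshold depends on the actual license text, so both readings are shown:

```cpp
#include <cstdio>

int main() {
    const double gross = 2000000.0;   // the "fortunate indie" example from the post
    const double threshold = 50000.0;
    const double rate = 0.25;

    double on_everything = gross > threshold ? rate * gross : 0.0;
    double above_threshold = gross > threshold ? rate * (gross - threshold) : 0.0;

    std::printf("25%% of the full gross:    $%.0f\n", on_everything);    // 500000
    std::printf("25%% above the 50K mark:   $%.0f\n", above_threshold);  // 487500
}
```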

Publishers have completely separate deals with Epic which are usually private and decided on a case by case basis. Warner Bros bought a bulk license for their studios while EA most likely licensed it on a game by game basis. I seriously doubt there are royalty fees. It is far more likely that it is an upfront lump sum. These devs get access to a lot more both engine-wise and Epic support staff-wise.
 

Durante

Member
Really? Then why do some games still have some low-level optimizations for things that require more speed?

Really, the decision to go low level depends on the problem; in certain cases you may get a 10-20% speed-up. Say you look at your profiler and find you have a bottleneck, and you've already minimised all the overheads of the memory subsystem. Then you try a SIMD array-data implementation of your algorithm, but SIMD just isn't a good match for it. At that point intrinsics and assembler-level coding may be needed to get better performance.
This is a rather realistic workflow and a realistic estimation of the payoff, but in my experience "10-20%" isn't usually what people here expect when they are talking about "coding to the metal".
 

benny_a

extra source of jiggaflops
What schedulers, OS and use case are we talking about here? Obviously you can construct a use case where it makes a huge difference (e.g. running a web server or something like that which creates lots of OS-level threads; there you'd get more throughput with a throughput-oriented scheduler than with the default schedulers desktop OSes use, which optimize for responsiveness). I just don't believe that games are one of those use cases where the difference is large.
(I accidentally closed my tab so my old post is gone.)

Back when Con Kolivas notably played around with schedulers on Linux, I was trying out all kinds of variants, forks and patches, and I saw very big differences in performance and responsiveness.

That's why I couldn't reconcile the claim that a modern OS wouldn't see a significant difference, when on my personal system I saw huge differences with the same OS while "just" changing the schedulers.

If games aren't significantly impacted by this, then that may be the case, but to dismiss it out of hand seemed strange given my personal experience.
 
The issue I see is that a lot of people seem to take the demo at face value and imply that it is the be-all-end-all and hardware is simply maxed out now. This demo doesn't have SVOGI, but that doesn't mean we won't see it or something damn similar in the future. I can't wait to see what technology the ICE team, for example, are bringing into the fold. Once the APIs evolve and developers start seeing some of the benefits, I'm sure in the future we'll be playing games on the PS4 (first-party stuff, at least) that have a lot of impressive technology in them that we might not even realize without looking for it. I, for one, look forward to the PS4 and whatever it brings, weaker than my high-end PC and all. I'll also continue to use my PC for all the crazy shit that I can do with it too. People are just far too invested in making a point on both sides.
 

Triple U

Banned
Yes, it's absolutely possible, but sadly I don't think many multiplatform developers would bother.

What I can glimpse is that you seem to have studied operating systems 101. Good. What I don't see is how the type of scheduling employed at the OS level makes a significant difference in terms of overhead for a game. It only truly enters the equation when there is an external load in addition to the game present on the system (multitasking, in other words), and multitasking is obviously never free, regardless of which OS you are using. In a console capable of multitasking, you'll be running an OS-level scheduler as well, and if you want all your OS functions to be responsive and safe it will have a user-level and a kernel-level, and it will have multiple different per-thread and per-process priority settings. I fail to see the significant difference vis-a-vis a modern PC

Well, in Windows there's always something going on taking priority.

On a console, though, it gets fun. One thing being you don't have to assume; you can allocate resources. Like, say, giving the custom OS its own CPU and/or a Jaguar core. You can do your round robin, your multilevel feedback, whatever, for the overall system OS/hypervisor. As for the rest of your resources, explicitly in an RTOS, you could implement an FCFS scheduler that has an instance on each core. This second OS would only handle game traffic.

As an example of the performance implications of FCFS vs. something more standard: a CS program at some school published results showing an increase in throughput ranging from 13.5% on YDL on the PS3 to 30% on a Cell blade, just from tweaking scheduling among other things. I can link you when I get to my laptop.
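For illustration only, a rough C++ sketch of the kind of per-core first-come-first-served queue being described; this is a toy, not how any real console OS or hypervisor is built:

```cpp
// A plain FIFO run queue per core that only ever sees game jobs; system work
// would live elsewhere (on the core(s) reserved for the OS/hypervisor).
#include <cstdio>
#include <deque>
#include <functional>
#include <vector>

struct PerCoreFCFS {
    std::deque<std::function<void()>> queue;          // FIFO: no priorities, no preemption
    void submit(std::function<void()> job) { queue.push_back(std::move(job)); }
    void run_to_completion() {
        while (!queue.empty()) {                      // first come, first served
            queue.front()();
            queue.pop_front();
        }
    }
};

int main() {
    std::vector<PerCoreFCFS> game_cores(6);           // e.g. the cores handed to the game
    game_cores[0].submit([] { std::puts("animation job"); });
    game_cores[1].submit([] { std::puts("physics job"); });
    for (auto& core : game_cores) core.run_to_completion();
}
```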
 

benny_a

extra source of jiggaflops
The issue I see is that a lot of people seem to take the demo at face value and imply that it is the be-all-end-all and hardware is simply maxed out now.
I see it as bad for the same reason the early WiiU ports were considered bad.
A next-gen machine should run these things purely by brute force without optimization.

Of course one has to keep in mind this was specifically created as a target that Epic hoped that next-gen consoles would hit, rather than current-gen games as in the WiiU example above.

I mean, given that Infiltrator allegedly runs on UE4 without SVOGI, it now only sucks for the game creators because it uses pre-baked lighting again; but as a customer I think that Infiltrator looks massively better than the Elemental demo. (Infiltrator > Samaritan > Elemental from my perspective.)
 
It still doesn't make sense to me why they pulled SVOGI. There are many graphical features only available on the highest-end graphics cards out there.

Perhaps you can't have two lighting engines in a single game? One for more powerful PCs (SVOGI) and another (static GI?) for mid- and lower-end machines? I can definitely see that being a huge burden on developers.
 
It's disappointing to hear that the PS4 couldn't handle this demo properly.

It's already a bit sad knowing that UE4 games on either system will not look as good as the demo to begin with - history showing that demos are cheating a bit because they don't really have to process other computational elements like AI and physics.

What's worse is that the demo wasn't really visually impressive anyway.

Here's hoping that Capcom's Deep Down engine is more impressive than this, but I'm not convinced by the way it was shadily presented at the PS4 meeting.

Yep, not even the first time I saw it.
 

USC-fan

Banned
It still doesn't make sense to me why they pulled SVOGI. There are many graphical features only available on the highest-end graphics cards out there.

Perhaps you can't have two lighting engines in a single game? One for more powerful PCs (SVOGI) and another (static GI?) for mid- and lower-end machines? I can definitely see that being a huge burden on developers.

Stuff was cut from UE3. This is no different.

Most likely the SVOGI performance cost outweighed the visual gains. It doesn't help that Nvidia and AMD are more focused on mobile than on GPUs. It's why they just refresh cards instead of putting out new ones.

It will be 2014 until we get real new GPUs.

You would need Titan power to really run this. A 680 ran it at 1080p at 30 FPS, which no PC gamer could live with.
http://www.unrealengine.com/files/misc/The_Technology_Behind_the_Elemental_Demo_16x9_(2).pdf

That is also just a tech demo. A real game would be more taxing...
 

Jonm1010

Banned
And before the demo he clearly said that it was running on PS4 hardware. I interpret his second statement as meaning 'we want FF to look like this this gen'.
After the Final Fantasy 7 and then 13 tech demos Square showed us last gen, I'm gonna wait till I see actual in-hand gameplay footage before I even think about trusting that what Square showed is actually going to materialize at that quality on PS4.
 

BibiMaghoo

Member
You don't seem to understand it, apparently. Their indie licensing setup is fantastically low risk. Indies get charged zero royalties and essentially have a free engine if they gross less than 50K. Once they gross more than that, they pay a 25% royalty. If an indie is fortunate enough to gross 2 million, then they are paying Epic 500K, which is still significantly less than a publisher license. UE3 is well documented, has a ton of advanced features beyond what the average indie start-up could possibly code, and costs ZERO dollars up front. It's not a bad choice, and a lot of indies are making it.

Publishers have completely separate deals with Epic which are usually private and decided on a case by case basis. Warner Bros bought a bulk license for their studios while EA most likely licensed it on a game by game basis. I seriously doubt there are royalty fees. It is far more likely that it is an upfront lump sum. These devs get access to a lot more both engine-wise and Epic support staff-wise.

That was all you needed to say two posts ago, without being condescending.
 

Apenheul

Member
In my line of work 'coding to the metal' is not synonymous with low-level optimizations. I work for a hardware manufacturer as a driver/SDK software engineer, and what's low level to most game engine developers is quite high level in my domain. Now, it's been more than five years since I've worked on a console game engine, but the aforementioned 10-20% performance benefit from low-level optimization can only be expected in very specific areas. It's not going to make the game look better or run significantly faster unless you're really CPU-limited.
 

StevieP

Banned
In my line of work 'coding to the metal' is not synonymous with low-level optimizations. I work for a hardware manufacturer as a driver/SDK software engineer, and what's low level to most game engine developers is quite high level in my domain. Now, it's been more than five years since I've worked on a console game engine, but the aforementioned 10-20% performance benefit from low-level optimization can only be expected in very specific areas. It's not going to make the game look better or run significantly faster unless you're really CPU-limited.

Just wanted to highlight this post and thank its author for some clarity.
 

Triple U

Banned
In my line of work 'coding to the metal' is not synonymous with low-level optimizations. I work for a hardware manufacturer as a driver/SDK software engineer, and what's low level to most game engine developers is quite high level in my domain. Now, it's been more than five years since I've worked on a console game engine, but the aforementioned 10-20% performance benefit from low-level optimization can only be expected in very specific areas. It's not going to make the game look better or run significantly faster unless you're really CPU-limited.
You care to expand?

And most developers don't really get access to low-level libraries.
 
Have you watched the FF7 HD lately?

That looks bad compared to games today.

http://www.youtube.com/watch?v=IVCYy8C5Av4

I dunno. The clothes are a lot better than in pretty much any released game for PS360, the hair is also among the best, and the scenery is fairly detailed and bigger than the usual corridor you'd find in similar-looking games...

This demo may not outclass everything released this generation, but there are definitely tidbits in it that were never achieved in games...
 
I dunno. The clothes are a lot better than in pretty much any released game for PS360, the hair is also among the best, and the scenery is fairly detailed and bigger than the usual corridor you'd find in similar-looking games...

This demo may not outclass everything released this generation, but there are definitely tidbits in it that were never achieved in games...

You should blame Sony or nVidia for that; at that time PS3 dev kits were equipped with SLI GPUs, while the final card was significantly weaker.
 

USC-fan

Banned
I dunno. The clothes are a lot better than in pretty much any released game for PS360, the hair is also among the best, and the scenery is fairly detailed and bigger than the usual corridor you'd find in similar-looking games...

This demo may not outclass everything released this generation, but there are definitely tidbits in it that were never achieved in games...
Really doesn't look good.

It's very dated... nothing is impressive about it.
 

Demon Ice

Banned
To be honest, who cares? At the end of the day 8GB of GDDR5 RAM will ensure that PC doesn't catch up to consoles for years.

Low!! Man, you need to stop listening to PC evangelists. I can guarantee you that PCs won't even smell 8GB of GDDR5 RAM for at least 4-6 years, PS4 is simply on another level to what is expected of PC. We've had many devs espouse the benefits of 8GB GDDR5 RAM and how it will revolutionise gaming, how many have you heard say the same about the nvidia Titan? Exactly.

This is master level trolling. Well done.


Yeah I did. Again, the name Durante doesn't mean a damned thing to me. It's not particularly insulting, but please drop the shtick like I'm arguing with some Carmack-type tech god.

Him going on about how "he could code some, probably singled threaded, Cuda program and get close to theo max performance" didn't inspire too much confidence.

Triple U
Unfunny.
Unintelligible.
Unreadable
 

kingkaiser

Member
He's salty that post-Durante the PC has by far and away the best version of the game. He's jealous that the 20fps and 720p native resolution of his version cannot be fixed and is a far inferior version of the game. There are no Durantes on the consoles.

Well, in my opinion that's actually one of the best things about console gaming. You have the absolute certainty that every one of your console friends has the same experience as you.

It is like living in a perfectly working communist system. Sure, you have to deal with the lowest common denominator, but on the other hand you can just enjoy the game without going mental over the knowledge that you could get a "better" experience by throwing more money at your system.

I joined PC gaming after a ten-year abstinence, at the really slow-going end of this console generation. I built myself a very cost-efficient gaming PC, and sure, playing all these console games in 1080p and at 60 FPS is kinda cool. But there are still plenty of more demanding games where I have to limit the settings to achieve a nice framerate.

And that's the moment I get this quite uncomfortable feeling, what if...?

Playing on a console I never give a shit about framerates, image quality and stuff like this because here you just get what you see and no .ini file tweaking or upgrading will change a bit.

In other words...Ignorance is bliss.
 

AzaK

Member
Just saw this posted in another thread by Dragon1893. Interesting watching the video and the article is pretty good too.

Interesting.

The things I noticed on the PS4 version were softness, less smoothness of "round" surfaces and some aliasing in some places. However it's pretty damned impressive and I would not likely notice/care if I was playing it.
 
Did that Infiltrator demo use SVOGI? That's the real question, because it looked like Final Fantasy: The Spirits Within... who cares about SVOGI if UE4 games look that good on PC?!

No; that's why they were able to show much more with that demo on a single GTX 680 than with the Elemental demo, which almost sapped it dry while showing absolutely fucking nothing.
 

Sid

Member
Is the omission of GI mainly due to the consoles lacking the required power, or is it because it is hard to implement? Both?
 
Is the omission of GI mainly due to the consoles lacking the required power, or is it because it is hard to implement? Both?

Look at the Elemental PC demo they showed with SVOGI.

Then look at the Infiltrator Demo...

Both use GTX 680s...

But the Elemental Demo was being run dry and achieving almost nothing in comparison to the very impressive infiltrator demo.

Don't let PC elitists trick you; the 680 is still a 500-dollar GPU, and Epic doesn't want the GPU to kill itself doing nothing when they could do so much more (overall) with the same power.
 

Sid

Member
Look at the Elemental PC demo they showed with SVOGI.

Then look at the Infiltrator Demo...

Both use GTX 680s...

But the Elemental Demo was being run dry and achieving almost nothing in comparison to the very impressive infiltrator demo.

Don't let PC elitists trick you; the 680 is still a 500-dollar GPU, and Epic doesn't want the GPU to kill itself doing nothing when they could do so much more (overall) with the same power.
Right, thx for clearing that up.
 