
Digital Foundry: Unreal Engine 4 PS4 vs. PC

Ce-Lin

Member
What is going to happen when the 7-series of cards comes out?? It's going to be a slaughter, surely.

As it's always been, especially for your wallet if you're always trying to keep up with high-end PC hardware. 2014 GPUs are not the point of the comparison here; Eurogamer is trying to compare current high-end PCs with the tech inside the PS4/Durango (most probably cheaper consoles than whatever PC the UE4 demo was running on). Read the OP carefully before replying; I'm a junior but I know the rules here.

Then again, why not run that UE4 demo on a 7850 or 7870 card? Aren't they more in line with PS4/Durango tech than a high-end GTX 680?
 
Don't blame the PS4; not every PC gamer has a GTX 680... it's not going to run on low- or mid-range GPUs either. It's funny how these PC extremists keep forgetting about "minimum requirements".

These "pc extremists" understand what you don't: The high-end card of today is the mid-range card of tomorrow.
 

Clear

CliffyB's Cock Holster
SniperHunter said:
Don't blame the PS4; not every PC gamer has a GTX 680... it's not going to run on low- or mid-range GPUs either. It's funny how these PC extremists keep forgetting about "minimum requirements".

There's also the fact that UE is just a technology base; it's rarely used without significant modification. What Epic are showing at this stage is a vanilla implementation; actual third-party usage may be substantially more advanced.
 

ghst

thanks for the laugh
What is going to happen when the 7-series of cards comes out?? It's going to be a slaughter, surely.

in a couple of years the bleeding edge will be raining hot piss down all over tech comparisons with GPUs packing 20nm tech and stacked RAM.

if you tie your hopes and dreams to mass market products of fixed spec compromise, i suggest you learn a little humility or bring an umbrella.
 
Don't blame the PS4; not every PC gamer has a GTX 680... it's not going to run on low- or mid-range GPUs either. It's funny how these PC extremists keep forgetting about "minimum requirements".

You know there are ways to turn off certain engine features, right?
Just because a game has to cater to a minimum spec does not mean that spec has to be able to render every feature as baseline. (Rough idea in the sketch below.)
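A made-up sketch of what I mean (hypothetical names, nothing from Epic's actual code): the engine picks a quality tier up front, and the expensive paths simply never run on the minimum spec.

    // Hypothetical sketch: the min spec only renders the baseline tier,
    // while higher tiers switch the expensive features on.
    #include <cstdio>

    struct QualitySettings {
        bool realtimeGI;     // e.g. SVOGI: high-end only
        bool tessellation;
        int  shadowMapSize;
    };

    QualitySettings forTier(int tier) {      // 0 = min spec ... 2 = high-end PC
        switch (tier) {
            case 0:  return { false, false, 1024 };
            case 1:  return { false, true,  2048 };
            default: return { true,  true,  4096 };
        }
    }

    int main() {
        // A min-spec GPU still renders the game, just with the costly paths off.
        QualitySettings q = forTier(0);
        std::printf("GI=%d tess=%d shadows=%d\n",
                    q.realtimeGI, q.tessellation, q.shadowMapSize);
    }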

He is referring to the fact that graphics technology will be held back by the PS4 being the minimum and dominant spec (games are primarily made for consoles in multi-platform development), thereby cutting off an incredibly interesting chance for graphics technology to advance. The chance for PC graphics to advance will be lessened, and things will stagnate even more in one to two years' time...
Let alone in three to five years, when PCs will be almost unrecognizably powerful in comparison to their little console brothers but will still be rendering these console up-ports at hundreds of fps.
 
Those 8GB didn't seem to make a difference in this case (I know Epic may not have access to an 8GB devkit). So many PS fanboys (for lack of a better term) seem to think that the RAM is the component doing all the graphics rendering.
 
Wow! Looks great. Video footage of the PS4 version looks way better than the still image that has been passed around to inflame fanboy-ism.

Can't wait for the next Elder Scrolls on UE4.

On PC, of course :)
 
Wow! Looks great. Video footage of the PS4 version looks way better than the still image that has been passed around to inflame fanboy-ism.

Can't wait for the next Elder Scrolls on UE4.

On PC, of course :)

Bethesda use their own (and incredibly buggy) engine, not UE. With the PS4 built from off-the-shelf PC components, Bethesda shouldn't have as much of a problem porting the game as they did with the PS3.
 
Wow! Looks great. Video footage of the PS4 version looks way better than the still image that has been passed around to inflame fanboy-ism.

Can't wait for the next Elder Scrolls on UE4.

On PC, of course :)

I would not be surprised if the next Elder Scrolls continued on Gamebryo. If they were handed id Tech (via ZeniMax's internal engine sharing), their incompetent tech team would probably not be able to produce a game.

I wish I were kidding... but based upon Skyrim, I do not expect much in the way of advancement from that team.
 
Not really a big surprise, but yeah, the PS4 tools are still very early, and the 8GB will be significant in sectors other than just graphics.

I will probably rock my PC for multiplats and the PS4 for exclusives. But you just know there will be badly optimized multiplats that run like crap on PC (even with my 680), so the PS4 will be the better choice in some cases.
 
Whatever helps you guys sleep at night... I am just glad UE4 won't be used as much this gen. Good thing almost every big studio now has their own in-house engine.
 
Not really a big surprise, but yeah, the PS4 tools are still very early, and the 8GB will be significant in sectors other than just graphics.

I will probably rock my PC for multiplats and the PS4 for exclusives. But you just know there will be badly optimized multiplats that run like crap on PC (even with my 680), so the PS4 will be the better choice in some cases.

We'll hopefully see far, far fewer poorly optimised games with the standard PC components both consoles have.
 

KaiserBecks

Member
"Must have been a rushed port...", "most likely an early Devkit", "Unreal engine is not that important, Killzone looks better"...
People who had an utopian impression of their beloved future hardware seem to prefer complete denial over recognizing this for what it actually is.

I don't know why you people would expect the PS4 to automatically destroy a pricey high end (yes, I consider my 680 to be high end, Titan is for enthusiasts who stopped caring about quality/price ratios) GPU, but the fact that it performs close to, or even par with, a 680 is very, very impressive. This shouldn't be about who outperforms whom, the pros and cons on both sides won't change. We should be glad that this kind of quality will be the future standard.
Consoles will build up from there and PCs will benefit from having a much higher denominator. When I saw the first videos of Battlefield 3, I was praying that the future console generation will give developers a GPU that is comparable to a gtx 580.
Now they're getting something that comes close to a gtx 680. Cheer the fuck up.
 

Vaporak

Member
We all knew (even if some refused to accept it) that next-gen console hardware is middling in terms of performance. I just hope Epic keeps SVOGI as an option for the PC version of the engine, although I have no idea if it's feasible to include it as a toggle that can be flipped on and off at will.

Just FYI, it isn't; as far as I am aware, you would have to have two sets of texture assets in order to make both options work in the same game. A decently large amount of that can be automated, but not all.
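To illustrate the point (a toy sketch with made-up types, not Epic's actual code): each lighting path depends on data the other never produces, so a game offering both has to build and ship both sets.

    // Hypothetical sketch: baked lighting needs offline-baked lightmap textures,
    // SVOGI needs a voxelized scene; supporting both means shipping both.
    #include <cstdio>
    #include <vector>

    enum class GIMode { BakedLightmaps, RealtimeSVOGI };

    struct Texture   { int widthPx = 2048, heightPx = 2048; }; // baked lightmap page
    struct VoxelTree { int depth = 9; };                        // octree for cone tracing

    struct LevelLighting {
        std::vector<Texture> lightmaps;  // produced by an offline bake; useless to SVOGI
        VoxelTree* voxels = nullptr;     // built for SVOGI; useless to the baked path
    };

    void bindLighting(const LevelLighting& assets, GIMode mode) {
        if (mode == GIMode::BakedLightmaps)
            std::printf("Sampling %zu precomputed lightmap pages\n", assets.lightmaps.size());
        else
            std::printf("Cone tracing a voxel octree of depth %d per frame\n", assets.voxels->depth);
    }

    int main() {
        VoxelTree tree;
        LevelLighting assets{ std::vector<Texture>(64), &tree };  // both sets on disc
        bindLighting(assets, GIMode::BakedLightmaps);  // console / low-end path
        bindLighting(assets, GIMode::RealtimeSVOGI);   // high-end PC path
    }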
 

KageMaru

Member
I didn't understand why people were so down on the demo in the other thread; I thought it still looked great, even though the cut-backs were obvious. No matter what, these games and demos still look a ton better than what we're getting with the PS360, and that's all anyone should be expecting. There's no way we were ever going to get GTX 680 results in a $400-$500 console.

With the PS4 being a known arch, will that mean we get some quality games from the start but won't see as much difference between games from the start of the generation to the end, like happened this gen? I mean, they can start taking advantage of the arch from day one, so...

The leap may be slightly smaller than what we've seen in the last few generations, but just because the architecture is known, that doesn't mean it'll be tapped out much faster.

Though there is a difference between the upcoming generation and all of the previous ones: past consoles such as the PSone, Xbox, and 360 all demonstrated comparable leaps in progression even though they were easier to work with than their competitors.

A known architecture, but running under Windows (usually), with driver overhead, DirectX overhead, and software optimized not for one unique GPU+CPU combo but for a range of them. I'm pretty sure most devs have little experience coding to the metal even if the architecture is known, as you said. And even if they don't code to the metal, learning to optimize for one specific GPU/CPU/RAM combo instead of coding to support a wide range of them is a huge difference, so we will see the usual jump between first-wave games and their sequels.

Coding to the metal is a myth from everything I've read, but you're right that the PS4 won't have the same overhead as a PC using Windows.
 
These "pc extremists" understand what you don't: The high-end card of today is the mid-range card of tomorrow.

Yeah, so Epic might add it when that transition takes place... they kept modifying UE3 over the years. And don't tell me about graphics cards, brah; I went from a 4850 to a 5850 to a 7970 last gen... I know how quickly these cards get outdated.

PC fans shouldn't blame the PS4 for the removal of a feature... Epic probably knows the majority of PC gamers don't have the cards for said feature at the moment, so they will add it later. In other words, the minimum required specs haven't evolved yet, but they will in two years or whatever.
 
"pc extremists"...

This is priceless, seriously hilarious. No offence meant, but stop and listen to what exactly you are typing.

People need to take things less personally and stop perceiving everything as an attack upon their own person, or whatever it is they are doing.

It's not healthy to identify so strongly with your hobby. It's better to walk away if you find it upsets you too much.

The fact is that most PC gamers (even the PC-only people) want the consoles to be as powerful as possible, since that only means that much more to build on in the PC ports as the years go by.
 

Stallion Free

Cock Encumbered
Whatever helps you guys sleep at night... I am just glad UE4 won't be used as much this gen. Good thing almost every big studio now has their own in-house engine.

It's no different from this generation. There are still tons of devs who haven't shown off any engine of their own and will most likely be using UE4. And they will still account for a large percentage of the games released.
 

dsk1210

Member
I see people keep mentioning optimisation with respect to what the hardware will be able to pull off in a couple of years.

You do realise that optimisation and finding ways to make things look impressive is what developers have been doing for years now. I think developers are now way ahead of where the actual technology is: we have the methods to produce fantastic visuals in real time, but the technology still has to evolve a little further before the imagination can be realised.
 

MarkusRJR

Member
The only noticeable issues with the PS4 demo are the aliasing and the pretty inconsistent frame rate. Fix those and I'd be pretty content.
 
"pc extremists"...

This is priceless, seriously hilarious. No offence meant, but stop and listen to what exactly you are typing.

People need to take things less personally and stop perceiving everything as an attack upon their own person, or whatever it is they are doing.

It's not healthy to identify so strongly with your hobby. It's better to walk away if you find it upsets you too much.

The fact is that most PC gamers (even the PC-only people) want the consoles to be as powerful as possible, since that only means that much more to build on in the PC ports as the years go by.

I was laughing when I typed it; I am a PC gamer too! I am just not as invested as some of the guys who regularly post in these types of threads. Been a supporter of Steam ever since the HL2 launch ;)
 

The Jason

Member
We shouldn't worry at this point; graphics will just get better and better.

Also, most regular people who don't know about the PC demo would think that demo looks absolutely amazing.
 

Ce-Lin

Member
We'll hopefully see far, far fewer poorly optimised games with the standard PC components both consoles have.

Don't forget PC games need to run on such a wide range of hardware... also take into account drivers, and porting OpenGL or "to the metal" code to DirectX-compatible code... it's not as easy as you'd imagine. Also, that PC UE4 demo was carefully crafted to take advantage of the hardware it was going to run on as a showcase (i7 + 680); I'm pretty sure it would crash the very instant you swapped that card for a 78XX GPU or a different CPU.
 
Didn't someone from Epic say they got this running in just a few days? Pretty sure I read that somewhere.

Some of the differences between the demos, like the texture difference, don't even make sense. There's no reason for it to look that bad with 8GB.
 
There's your "perfect gaming PC" for ya.

Alpha PS4 devkits.

" ...while the PS4 tools and APIs are still in their initial stages of development - it's going to take time for devs to fully get to grips with the new hardware. Over and above that, assuming this is the same demo that was shown at the PlayStation 4 reveal, we know for a fact that most studios only received final dev kits in the weeks beforehand, the suggestion being that most of the UE4 work will have been produced on unfinished hardware."

...
 
Not a big deal to me, since Sony's and Microsoft's first-party studios will blow away every other game out there in a year or so, including on PC. Can you imagine Sony Santa Monica's next-gen engine? Naughty Dog's? MS's 343 Industries? UE4 being gimped isn't a big deal, especially when so many devs are using their own in-house engines now. It's the first-party exclusives that are going to make these consoles shine the most.
 
It's no different from this generation. There are still tons of devs who haven't shown off any engine of their own and will most likely be using UE4. And they will still account for a large percentage of the games released.

Really? Some of the biggest and best UE3 games were made by EA last gen, and now they've transitioned over to Frostbite 3.0...

Square Enix = Luminous
Capcom = Panta Rhei
Sony = Ice
Crytek = CryEngine 3 (lol)
EA = Frostbite 3.0
Activision = developing their own engine ATM
Bethesda = Gamebryo lol
Microsoft = well, they will get an exclusive IP from Epic, one from Crytek, and probably work on in-house engines for Halo, Kinect games, and Forza
Konami = Fox Engine
Valve = Source 2.0

Who is left?
 

Oblivion

Fetishing muscular manly men in skintight hosery
The biggest casualty is the omission of real-time global illumination

Curious: does this mean that the PS4 isn't capable of this feature, or just that UE4 dropped it?
 

Durante

Member
Also, that PC UE4 demo was carefully crafted to take advantage of the hardware it was going to run on as a showcase (i7 + 680); I'm pretty sure it would crash the very instant you swapped that card for a 78XX GPU or a different CPU.
That's a ridiculous assumption.
 

benny_a

extra source of jiggaflops
Those 8GB didn't seem to make a difference in this case (I know Epic may not have access to an 8GB devkit). So many PS fanboys (for lack of a better term) seem to think that the RAM is the component doing all the graphics rendering.
I keep reading about all these PS fanboys who supposedly think that. But then people ravage the one person who actually posts something dumb as if they haven't eaten in weeks.

How about we don't generalize about groups of people and instead address individual misconceptions, so that everyone, participating or just reading, can learn from these technically minded threads.

Sony = Ice
Ice is not an engine. It's the graphics team.
Sony has a multiplatform engine though: PhyreEngine.

That's a ridiculous assumption.
At first I agreed, but then I thought: what about AMD driver support? ;)
 
The console hasn't even released and people are already making up their minds about certain things (actually, some made up their minds before even the first rumour of specs). I find the whole competitive element to this tedious. I also find the whole console vs. PC gaming discussion interesting; I started gaming on PCs during the floppy disk and Amstrad days, probably the same as a few here. The childishness of the console vs. PC stuff has reached a new level during the latter half of this gen... Ergo, I seriously doubt this feature will be locked off for high-end PC GPUs; or else it will be implemented later. For consoles, I'll bet they introduce the tech later down the line with further optimization. UE3 changed a lot during this gen, and the same is likely to happen next gen...
 

Durante

Member
How about we don't generalize about groups of people and instead address individual misconceptions, so that everyone, participating or just reading, can learn from these technically minded threads.
Haha "technically minded threads" before a console release. I wish that were the case. I tried my best to explain everything for a few months, but then I realized that 80% of the posters "participating" in these threads don't give a shit about how stuff works, only that their favourite companies' X is larger than the other Xs.

The childishness of the console vs. PC stuff has reached a new level during the latter half of this gen...
Not at all. It reached a new level with the new upcoming consoles. People generally weren't interested in PC comparisons before that (because the result was blindingly obvious).
 

dejan

Member
Oh, so you were joking. :)

You got me good; I thought you were serious.
Make no mistake: every single one of "them" is serious. And most of "them" will show their ugly hypocritical faces once the inevitable Unreal Engine 4: PS4 vs. Durango comparison thread hits GAF.
 

benny_a

extra source of jiggaflops
Haha "technically minded threads" before a console release. I wish that were the case. I tried my best to explain everything for a few months, but then I realized that 80% of the posters "participating" in these threads don't give a shit about how stuff works, only that their favourite companies' X is larger than the other Xs.
I know it sucks. But the Wikipedia mantra of "Assume good faith" is the best you can do.
I genuinely learned from threads such as these, as I was never much into hardware or graphics technology.
 

KageMaru

Member
Didn't someone from Epic say they got this running in just a few days? Pretty sure I read that somewhere.

Some of the differences between the demos, like the texture difference, don't even make sense. There's no reason for it to look that bad with 8GB.

I would be highly surprised if the demo was ported in just a few days, even with the similarities to PC hardware.

Also, just because there's 8GB of memory in these systems, that doesn't mean devs have access to all 8GB for graphics.
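Back-of-the-envelope (the split below is pure guesswork on my part, not an official figure), but a unified pool has to cover the OS, game code, and CPU-side data before graphics sees a byte:

    // Hypothetical memory budget for a unified 8GB console pool (assumed numbers).
    #include <cstdio>

    int main() {
        double totalGB   = 8.0;
        double osReserve = 1.0;  // assumption: OS and background services
        double gameData  = 2.5;  // assumption: code, audio, AI, world simulation
        double graphics  = totalGB - osReserve - gameData;
        std::printf("Left for textures and render targets: %.1f GB\n", graphics);
    }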

Haha "technically minded threads" before a console release. I wish that were the case. I tried my best to explain everything for a few months, but then I realized that 80% of the posters "participating" in these threads don't give a shit about how stuff works, only that their favourite companies' X is larger than the other Xs.

Yeah, I couldn't agree more with this. Tech discussions are almost impossible here when everyone just wants to think, believe, or hope what they want.
 
Sony developers obviously know their stuff better than anyone else, but when I think about it, that 8GB GDDR5 setup does look very weird. I really can't imagine what a game that utilizes 7-7.5GB of VRAM at 1080p is going to look like; even Rage with 16K textures apparently uses just a little over 2GB of VRAM. Is it even possible for a 1.8 TFLOPS GPU to push over 7GB of VRAM without starting to fall apart? It will be very interesting to see how games can take advantage of the PS4's 8GB of GDDR5 RAM.
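Some rough arithmetic on where that much memory could even go (illustrative numbers, not measurements): per-frame render targets at 1080p are tiny next to 8GB, so a huge budget would mostly buy resident texture and geometry data rather than anything the GPU has to redraw every frame.

    // Back-of-the-envelope memory sizes (illustrative assumptions only).
    #include <cstdio>

    int main() {
        const double MB = 1024.0 * 1024.0;
        // A four-target 32bpp G-buffer plus a 32-bit depth buffer at 1080p:
        double gbuffer = 1920.0 * 1080 * 4 /*bytes*/ * 5 /*targets*/ / MB;
        // One 16K x 16K BC1-compressed texture (0.5 bytes/texel) with mips (~4/3x):
        double bigTex = 16384.0 * 16384 * 0.5 * (4.0 / 3.0) / MB;
        std::printf("1080p G-buffer: ~%.0f MB\n", gbuffer);        // ~40 MB
        std::printf("16K BC1 texture + mips: ~%.0f MB\n", bigTex); // ~171 MB
        // So filling 7GB would mean dozens of huge resident textures, not bigger
        // framebuffers; sampling resident data is cheap compared to shading more
        // pixels, which is where a 1.8 TFLOPS limit would actually bite.
    }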
 

Kimawolf

Member
Early tech demos are unoptimized and running on devkits; hell, even the new Killzone looks better than the UE4 PS4 demo imo.

And look at the UE3 demo from last gen:
https://www.youtube.com/watch?v=Plnh28ykavQ

Early-gen games maybe did not look this good, but GoW3, Uncharted 3, KZ3, Skyrim, etc. look better imo.

Sorry, not buying this. I remember reading the Wii U threads, with people going on and on about alpha kits etc. and others coming in slamming them. So I doubt it will run significantly better once the final kits are out.
 