Epic: UE4 full feature set requires 1 TFLOP GPU, a scaled-down version exists for less

Yeah, I vaguely remember something around 500-600 GFLOPS being a semi-consensus in the speculation threads. Then there's been speculation about mystery enhancements (that wouldn't show up in the GFLOPS number) that no one really has a clue about, like whether they even exist, and if they do, how they work.



I guess those enhancements are more functionality-based.
 

Alexios

Cores, shaders and BIOS oh my!
The main point that Epic has been making with UE4 is its ease of development, and a lot of that hinges on quick iteration, which is based on completely realtime lighting. Once you go too far down in specs you'll have to turn that off, which would remove a big selling point of the engine, something Epic obviously wants to avoid. So it isn't just about wanting to support low-end machines (e.g. Wii U), but about whether supporting them hurts the major selling point of the engine.
Can't you iterate with it on during development and then turn it off and compile with baked lighting for the low end platform once you've got the final result you want? It can't cost that much to put an intern in charge of compiling all the already created levels.

Unless I"m getting this wrong (beyond simplifying it for discussion's sake).
 

Minsc

Gold Member
So I wonder what the cheapest DX11 card over 1 TFLOP is? A 6790 is 1.3 TFLOPS for $120.

Edit: the 7770 is the same price for the same TFLOPS, and a 7750 doesn't break 1 TFLOP, so that may be the entry point for getting over 1 TFLOP.
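
For anyone wanting to sanity-check these numbers: the headline TFLOPS figures are just theoretical peaks, roughly shader ALUs × clock × 2 (each ALU can do a fused multiply-add per clock). A quick sketch, with card specs from memory, so double-check them against the vendor sheets:

```cpp
#include <cstdio>

// Theoretical single-precision peak: ALUs * clock (GHz) * 2 ops (FMA = mul + add).
double peak_gflops(int shader_alus, double clock_ghz) {
    return shader_alus * clock_ghz * 2.0;
}

int main() {
    // Specs from memory -- verify before trusting.
    std::printf("HD 6790: %.0f GFLOPS\n", peak_gflops(800, 0.84));   // ~1344
    std::printf("HD 7770: %.0f GFLOPS\n", peak_gflops(640, 1.0));    // ~1280
    std::printf("HD 7750: %.0f GFLOPS\n", peak_gflops(512, 0.8));    // ~819, under 1 TFLOP
    std::printf("GTX 680: %.0f GFLOPS\n", peak_gflops(1536, 1.006)); // ~3090
    return 0;
}
```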
 

japtor

Member
I think that's the bigger question for Wii-U, whether games built on a 'full-fat' UE4 targeting other next-gen consoles will be easily portable to the 'lite' version running on a whole other class of hardware. Assuming devs use the former as their base target that might be tricky in the general case...I wouldn't be holding my breath.
Well there's another part of it too, since it's all UE4 to begin with: if they do want to make a Wii U port (like if it takes off), it should be easier than, say, the current situation. You'd go from UE4 to UE4 lite or whatever, rather than from whatever HD engine to working with (or creating) a completely different/custom engine just for the Wii. Even if they have to cut stuff down, it'd be cutting down from the same base rather than recreating everything from scratch.
I guess those enhancements are more functionality-based.
If those enhancements are about lighting and other effects then they're a big deal...if Unreal is able to tie into them.
 
Indeed pathetic, how pathetic those who are pathetic enough to have pathetic income and pathetic hardware can enjoy games made by pathetic UE4

lol, I have a 7770 mainly because I don't play enough games to justify $300+ on a single piece of computer hardware at this point in my life, and it's pretty good, all things considered. At anything under 1080p it runs everything, and I don't have a problem getting 60fps at 1080p in most games anyway.
 

Nirolak

Mrgrgr
Can't you iterate with it on during development and then turn it off and compile with baked lighting for the low end platform once you've got the final result you want? It can't cost that much to put an intern in charge of compiling all the already created levels.

It depends on how it works.

In Frostbite 2, you can set Geomerics' lighting engine to pre-compute the lighting at load time to whatever setting you want, and it will do a fairly decent job of matching the realtime result. This is what current gen console games do with the engine. However, at that point the lighting becomes static, which, depending on your game, may or may not be a problem.

I'm not sure if Epic's SVOGI technique would work like this though, since it isn't really something people have done before to my knowledge. I'm under the impression it isn't generating a lightmap for the entire level a la Lightmass, their pre-rendered lighting solution from UE3, but rather just what is on the screen at the current time.
 
One thing to note is that AMD rates its GFLOPS a lot higher than Nvidia does, and Epic is using Nvidia hardware, so you should adjust your GFLOPS rating down if you're using an AMD card to get the Nvidia equivalent.
 

Alexios

Cores, shaders and BIOS oh my!
It depends on how it works.

In Frostbite 2, you can set Geomerics' lighting engine to pre-compute the lighting at load time to whatever setting you want, and it will do a fairly decent job of matching the realtime result. This is what current gen console games do with the engine. However, at that point the lighting becomes static, which, depending on your game, may or may not be a problem.

I'm not sure if Epic's SVOGI technique would work like this though, since it isn't really something people have done before to my knowledge. I'm under the impression it isn't generating a lightmap for the entire scene a la Lightmass, their pre-rendered lighting solution from UE3, but rather just what is on the screen at the current time.
Would be kind of ironic (vs all the "Apple is gonna kill traditional machines" talk) if iOS and similar help save a platform like the Wii U by having Epic create something like that, thus allowing ports to become possible. I mean, yeah, it could have problems in some games, but probably not as much as the latest COD Wii ports, where some levels look completely different, almost without lighting.
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
One thing to note is that AMD rates its GFLOPS a lot higher than Nvidia does, and Epic is using Nvidia hardware, so you should adjust your GFLOPS rating down if you're using an AMD card to get the Nvidia equivalent.

That's ironic considering that the term "NvFLOPS" was coined long ago to describe Nvidia's spec inflation.
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
Is Epic referring to Nvidia's or AMD's "teraflops"?

Their demos are running on a GTX 680. Does that answer your question?
 

Minsc

Gold Member
A Gimped HD5830 will do 1.7 TFLOPs.

Seems they don't really sell these any more, but the 6770 mentioned above can be readily had for $100, $20 less than the 6790, so that's the new UE4 winner. But I'm skeptical it will play UE4 games looking anything like the tech demo they showed, not by a long shot.
 

magash

Member
Seems they don't really sell these any more, but the 6770 mentioned above can be readily had for $100, $20 less than the 6790, so that's the new UE4 winner. But I'm skeptical it will play UE4 games looking anything like the tech demo they showed, not by a long shot.

Highly unlikely, mainly because the GPU used to run the demo was a high-end card that costs at least $499.99.
 
Seems they don't really sell these any more, but the 6770 mentioned above can be readily had for $100, $20 less than the 6790, so that's the new UE4 winner. But I'm skeptical it will play UE4 games looking anything like the tech demo they showed, not by a long shot.



It could. But not like this demo was running; I think it was 4K resolution + 60 FPS + development tools.



Highly unlikely, mainly because the GPU used to run the demo was a high-end card that costs at least $499.99.


But we don't know what conditions it was running under.
I'm sure a 1 TFLOP GPU can run it at 720p and 30fps at least, if not more.
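
Back-of-the-envelope on that, assuming the demo really was 4K at 60 FPS (pure speculation): 720p30 needs roughly 18x fewer shaded pixels per second, though per-frame costs like geometry and GI voxelization don't scale down with resolution, so it's not a clean linear win.

```cpp
#include <cstdio>

int main() {
    // Shaded pixels per second = width * height * fps.
    double demo   = 3840.0 * 2160.0 * 60.0; // rumored demo settings (unconfirmed)
    double target = 1280.0 *  720.0 * 30.0; // 720p at 30fps
    std::printf("ratio: %.0fx\n", demo / target); // prints 18x
    return 0;
}
```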
 
I think that's the bigger question for Wii-U, whether games built on a 'full-fat' UE4 targeting other next-gen consoles will be easily portable to the 'lite' version running on a whole other class of hardware. Assuming devs use the former as their base target that might be tricky in the general case...I wouldn't be holding my breath.

if UE4 lite supports baking lightmaps, it might not be too tricky at all, so long as the game doesn't depend on real time lighting in its gameplay... but really, we need to know more. clearly what was shown last night is beyond the Wii U. if UE4 lite is basically UE3.5+ it doesn't change anything we were already thinking about the likely situation when it comes to multiplatform games and UE4.
 

Instro

Member
Yeah, I vaguely remember something around 500-600 GFLOPS being a semi-consensus in the speculation threads. Then there's been speculation about mystery enhancements (that wouldn't show up in the GFLOPS number) that no one really has a clue about, like whether they even exist, and if they do, how they work.

It seems like the extra features exist, but they could be anything as you mentioned. The issue is how long it will take developers to figure out those features and implement them. I recall there being mentions of the system breaking some middleware/engines a few months ago.

It would be rather weird if the system ends up being some combo of low raw performance when it comes to polygons (vs PS4/720) but keeping up to some degree in the realm of lighting, particles, or various other effects.
 

japtor

Member
According to Wikipedia my old-ass HD5850 has 2088 GFLOPS.
That can't be right, can it?
No clue about specific video cards, but you can have the power but not necessarily the feature set. Like, you could have a billion TFLOPS but not be able to run UE4 if the card isn't capable of DX11 effects or whatever.
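
To illustrate the feature-set point on PC: a DX11-class check is just standard Direct3D 11 boilerplate, nothing UE4-specific (sketch assumes Windows and the D3D11 SDK headers):

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Returns true if the default adapter exposes feature level 11_0, i.e. the
// DX11 feature set (compute shaders, tessellation, etc.), regardless of
// how many theoretical FLOPS the card has.
bool SupportsDX11()
{
    const D3D_FEATURE_LEVEL wanted = D3D_FEATURE_LEVEL_11_0;
    D3D_FEATURE_LEVEL got = {};
    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        &wanted, 1, D3D11_SDK_VERSION,
        nullptr, &got, nullptr); // no device kept; this is just a capability probe
    return SUCCEEDED(hr) && got == D3D_FEATURE_LEVEL_11_0;
}
```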
It seems like the extra features exist, but they could be anything as you mentioned. The issue is how long it will take developers to figure out those features and implement them. I recall there being mentions of the system breaking some middleware/engines a few months ago.

It would be rather weird if the system ends up being some combo of low raw performance when it comes to polygons (vs PS4/720) but keeping up to some degree in the realm of lighting, particles, or various other effects.
Wouldn't be a horrible compromise, I think; you can see how big a difference effects make to the same geometry in a lot of games these days. Like, Wii or 3DS games look like ass without them... but toss a few on and they can look decent.
 

lefantome

Member
If even the iPhone can have a version of UE4, why the hell not the Wii U?

Because the iPhone will use a different renderer and technology. You will probably use the same UE4 tools, and some of the code will be the same, but you will get something closer to UE3.

From a console point of view, the Wii U won't get the same features as the next gen, so it can use UE3. Epic probably doesn't consider the investment in the Wii U profitable, because they can just use the latest version of UE3.
 

FrankT

Member
I tell ya what, it feels good getting back to PC gaming right now after 6-7 years. You can get a whole lot for the money at the moment. Bring on UE4.


Edit: I should say, though, all this power under the hood really makes the rumored specs of these next-gen consoles feel dated already. Doubt I'm day one on anything next year.
 
So... just did a quick Newegg wishlist. Core i7 Ivy Bridge with a high-end motherboard, GTX 680, 16GB of DDR3, and a 120GB Intel SSD comes out just under $1,500. Not bad!
 

Barryman

Member
I'm not sure if Epic's SVOGI technique would work like this though, since it isn't really something people have done before to my knowledge. I'm under the impression it isn't generating a lightmap for the entire level a la Lightmass, their pre-rendered lighting solution from UE3, but rather just what is on the screen at the current time.

I think you're right, BUT conceptually I still think it should be possible. You could design the lit scene in real-time using the full-featured UE4 editor while maintaining the benefits that gives you: instant feedback from tweaking parameters and so on. Then, once you were happy with the result, you could use Lightmass (or an equivalent) to bake a light map using the exact same parameters for all of the emitters and materials.

So even though you'd still be creating light maps for the scaled-down version of UE4, you would only have to do it once, rather than once per tiny tweak, and would still end up with virtually the same time savings as the full scale UE4 would give you, just a less dynamic result.
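
In build-pipeline terms that workflow would boil down to something like this (purely hypothetical sketch; BakeLightmap stands in for a Lightmass-style offline bake, and nothing here is actual UE4 tooling):

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical offline step: after iterating with realtime GI in the editor,
// bake static lightmaps once per level for the scaled-down target, reusing
// the emitter/material parameters tuned against the realtime preview.
void BakeLightmap(const std::string& level) {
    std::printf("Baking lightmap for %s...\n", level.c_str());
}

int main() {
    const std::vector<std::string> levels = {"Level01", "Level02", "Level03"};
    for (const auto& level : levels)
        BakeLightmap(level); // the "intern compiling all the levels" step
    return 0;
}
```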
 
I think you're right, BUT conceptually I still think it should be possible. You could design the lit scene in real-time using the full-featured UE4 editor while maintaining the benefits that gives you: instant feedback from tweaking parameters and so on. Then, once you were happy with the result, you could use Lightmass (or an equivalent) to bake a light map using the exact same parameters for all of the emitters and materials.

So even though you'd still be creating light maps for the scaled-down version of UE4, you would only have to do it once, rather than once per tiny tweak, and would still end up with virtually the same time savings as the full scale UE4 would give you, just a less dynamic result.

right. but today i heard Epic say in that developer walk through that you *can't* bake lightmaps in UE4. maybe that's just the full fat version, maybe it's not. until today i'd presumed they would allow such a thing.
 
right. but today i heard Epic say in that developer walk through that you *can't* bake lightmaps in UE4. maybe that's just the full fat version, maybe it's not. until today i'd presumed they would allow such a thing.

If that were literally true, would the low-end version produce worse results than UE3?
 

Kayhan

Member
Does that mean that the next-gen consoles will have more than 1 TFLOPS of performance or are they going to have to use the 'tard pack version of UE4?
 

AlStrong

Member
That said, the real question is at what point global illumination starts running on the engine, since once that turns off, you lose the benefit of fully realtime development.

I don't quite understand what you mean by losing realtime development in conjunction with their GI. If you turn off GI, the renderer is still running deferred shading. You'd basically end up in the same situation as Crysis 2 on PS360.
 

Barryman

Member
Wouldn't you lose indirect lighting if you turn off GI AKA a big point of UE4?

Yeah, but as I described in my post above, I can't see a reason why it wouldn't be possible to work with real-time GI in the editor and then ship with pre-rendered GI for UE4 lite.
 

AlStrong

Member
Wouldn't you lose indirect lighting if you turn off GI AKA a big point of UE4?

It's just a rendering feature. The main point about UE4 should be the toolsets and their capabilities. DX11 is the main graphics point, but GI isn't some mandate there.

CryEngine 3's LPVs were a big discussion point for consoles, yet they shipped Crysis 2 without it there.

Enlighten GI was a big discussion point for DICE, yet they shipped BF3 with it disabled (all-platforms).
 

Nirolak

Mrgrgr
I don't quite understand what you mean by losing realtime development in conjunction with their GI. If you turn off GI, the renderer is still running deferred shading. You'd basically end up in the same situation as Crysis 2 on PS360.

Does it still look acceptable with their implementation?

I mean, the main concept behind pre-rendered lighting is quality as opposed to not being able to do some form of realtime lighting on current gen.
 
It's just a rendering feature. The main point about UE4 should be the toolsets and their capabilities. DX11 is the main graphics point, but GI isn't some mandate there.

CryEngine 3's LPVs were a big discussion point for consoles, yet they shipped Crysis 2 without it there.

Enlighten GI was a big discussion point for DICE, yet they shipped BF3 with it disabled (all-platforms).

I think it's a bit more important than "just some rendering feature". In the dev demo the guy turned off indirect lighting in a room and it lost most of its lighting/flashiness. I just can't imagine most devs, including Epic, being happy with turning GI off when it makes such a huge difference to the quality of the visuals. Based on that video example, would you not agree that GI probably makes more of a difference to the quality of the graphics than any other introduced feature? Even if turning it off doesn't impact the ease of working with the engine, I just can't see anyone being happy with its removal.
 

AlStrong

Member
I mean, the main concept behind pre-rendered lighting is quality as opposed to not being able to do some form of realtime lighting on current gen.

I think the main thing for Epic is that up until they switched to deferred shading (Samaritan), they just couldn't do it realtime (with decent performance). Now they have an engine that was designed for rendering large numbers of dynamic lights rather than just having it hacked into an existing one that has a number of features designed around baked lighting.

The GI is just icing on the cake.
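
For anyone wondering why deferred shading is what makes "large numbers of dynamic lights" feasible: geometry is rasterized once into a G-buffer, and each light then shades only the pixels it can reach, so lighting cost scales with lights × pixels rather than lights × scene geometry. A bare-bones sketch of the frame structure (my own illustration, not engine code):

```cpp
#include <vector>

struct Mesh {};
struct Light { /* position, radius, color */ };
struct GBuffer { /* albedo, normal, depth, roughness render targets */ };

// Pass 1: rasterize the scene once, writing surface attributes per pixel.
void GeometryPass(const std::vector<Mesh>& scene, GBuffer& gbuf) { /* ... */ }

// Pass 2: shade only the pixels a given light can reach, reading the G-buffer.
void LightPass(const Light& light, const GBuffer& gbuf) { /* ... */ }

void RenderFrame(const std::vector<Mesh>& scene, const std::vector<Light>& lights) {
    GBuffer gbuf;
    GeometryPass(scene, gbuf);        // geometry cost paid once per frame
    for (const Light& light : lights) // hundreds of lights stay tractable:
        LightPass(light, gbuf);       // each touches pixels, not triangles
}
```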
 

i-Lo

Member
And now someone says that there are different ways of measuring flops. *sigh* Could this be simplified? To that end, what's the exchange rate these days?

As in: nVidia's 1 Teraflop = AMD's ? Teraflop
 

GameSeeker

Member
Remember TeraFlops are made up marketing numbers and are completely useless.

Epic claims you need a 1 TFLOP GPU for UE4.

If you go back and read the original PS3 press release from May 16, 2005, the Sony spec sheet claims the RSX could perform 1.8 TFLOPS (I'm not making this up, go look at the press release).

So UE4 should run great on a PS3!

WRONG!

Both numbers are completely bogus, and Epic, Sony (and Nvidia, AMD) are trying to insult our collective GAF intelligence. Don't believe them.
 

Thraktor

Member
Interesting that they mention 1TF ;)

Hinting at the Wii U, are we?

Edit: To be a bit more constructive, while people are commenting on whether less-powerful hardware will have to use pre-baked lightmaps, I don't think that's the case. From the limited understanding of the SVOGI technique I can gather from their brief description, the strategy will probably be (among other things) to reduce the resolution of the voxelization, and to reduce the number of cone traces per pixel (which would yield a less accurate approximation of the second-bounce illumination). Reducing the number of light sources should also improve performance.
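
A toy cost model to show why those are the natural knobs (the structure and every number here are my own guesswork from the brief description, not anything Epic has published):

```cpp
#include <cstdio>

// Toy model: per-frame SVOGI work is roughly voxelization (scales with the
// voxel grid volume) plus cone tracing (scales with screen pixels * cones
// per pixel * steps per cone). Fewer lights would cut the injection cost too.
double RelativeCost(int voxel_res, int width, int height, int cones, int steps) {
    double voxelize = double(voxel_res) * voxel_res * voxel_res;
    double trace    = double(width) * height * cones * steps;
    return voxelize + trace;
}

int main() {
    double full = RelativeCost(512, 1920, 1080, 9, 32); // hypothetical full-fat settings
    double lite = RelativeCost(256, 1280,  720, 4, 16); // hypothetical scaled-down settings
    std::printf("lite is ~%.1fx cheaper\n", full / lite); // ~9.7x under these guesses
    return 0;
}
```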
 