
Epic: UE4 full feature set requires 1 TFLOP GPU, a scaled down version exists for less

Nirolak

Mrgrgr
Epic has finally shared the GPU requirements to run the full feature set of Unreal Engine 4, but revealed they have a version with features turned off for platforms that can't reach that.

NVidia said:
NVidia: What are the key design goals for Unreal Engine 4?

Tim Sweeney: We have three big goals:

-First, to define the next generation of graphics capabilities that are achievable with DirectX 11 PC’s and future consoles.
-Second, to deliver a toolset with an unprecedented mixture of power, ease-of-use, and polish.
-And finally, to scale the technology and its feature set up and down the spectrum of future computing devices, including iOS and Android, and mainstream PC.

NVidia: What is the target platform for UE4? What kind of hardware are gamers going to need to run UE4 based games?

Tim Sweeney: Unreal Engine 4’s next-generation renderer targets DirectX 11 GPU’s and really starts to become interesting on hardware with 1+ TFLOPS of graphics performance, where it delivers some truly unprecedented capabilities. However, UE4 also includes a mainstream renderer targeting mass-market devices with a feature set that is appropriate there.
Source: http://www.geforce.com/whats-new/ar...-next-gen-gtx-680-powered-real-time-graphics/
 

Alexios

Cores, shaders and BIOS oh my!
Neat. I hope PC gamers playing games using it can choose to disable features and get something more like the lighter weight renderer but with the same content, for people who don't buy top of the line stuff, without necessarily making it look like ass, just giving up on some of the extra pretty eye candy like fully dynamic lighting and tessellation. Without requiring the developer to basically offer a different version of the game with that renderer or something.
 
Some people might not like to read these kinds of posts, but I admit my ignorance. Things were much easier when it was 8-bit, 16-bit, 32-bit.

When we got off that measure, it threw things off for me. It became harder and harder to differentiate performance if you are not a tech wizard.

I have a notebook with a GTX 560 and I have no idea how far away I am from UE4.
 
Wait, so 1 TFLOP is required for the full feature set, but there is a version that requires less power?

So why can't Wii-U have it? What's the point of the lower "mainstream renderer targeting mass market devices"? Do they mean iPhones and handhelds?
 

thuway

Member
Ugh, next generation consoles should be at the very least 2 TFLOPs. As a businessman, he's probably quoting the most baseline specs for machines. I want the full feature set, 1080p, and all the bells and whistles.
 

japtor

Member
Neat. I hope PC gamers playing games using it can choose to disable features and get something more like the lighter weight renderer but with the same content, for people who don't buy top of the line stuff, without necessarily making it look like ass, just giving up on some eye candy like fully dynamic lighting and tessellation. Without requiring the developer to basically offer a different version of the game with that renderer or something.
I'm guessing that's what he means by "mainstream PCs" since it'd be bad (as far as userbase) to target just higher end stuff.
 
Wait, so 1 TFLOP is required for the full feature set, but there is a version that requires less power?

So why can't Wii-U have it? What's the point of the lower "mainstream renderer targeting mass market devices"? Do they mean iPhones and handhelds?




They meant less-powerful-than-Wii-U iPhones, iPads, and Android devices.
 

Alexios

Cores, shaders and BIOS oh my!
So why can't Wii-U have it? What's the point of the lower "mainstream renderer targeting mass market devices"? Do they mean iPhones and handhelds?
Who says it can't? Maybe Epic can't be bothered to port it because no devs ask for it. Maybe Nintendo isn't willing to pay for it. Maybe they're focusing on other versions first. UE3 didn't launch with iOS support either. Maybe it'll get it at some point. Maybe not.
 

lefantome

Member
Regarding the 16 GB of RAM used in the development PC:

-The demo ran inside the editor with advanced features (debugging, etc.), so it's normal that it needs more RAM. Devkits usually have more RAM anyway (double, for example).
-They said they are not using baked lighting, but their current algorithm is RAM-expensive.
-We probably won't see a UE4 game before late 2014; they have enough time to improve performance.
-Running a game on a console is different from running it on a PC. Console games currently work with a total of 512 MB of RAM, and developers only have to scale performance for one hardware target.
 
Pathetic. I thought this was meant to be some next-gen exclusive high-end stuff. 1 TFLOP is basically nothing; even a pathetic 7770 has 1.3 TFLOPS.
 

schuelma

Wastes hours checking old Famitsu software data, but that's why we love him.
Can a reasonable unbiased poster familiar with tech comment on implications for Wii U?
 
Who says it can't? Maybe Epic can't be bothered to port it. Maybe Nintendo isn't willing to pay for it. Maybe they're focusing on other versions first. UE3 didn't launch with iOS support either. Maybe it'll get it at some point. Maybe not.

I thought Epic said so. But I don't remember the exact language. I'm trying to understand if they don't want to do UE4 on Wii-U or if it's impossible. But I agree, if they can do it on mobile phones, I don't know why they couldn't do it on Wii-U. Someone in another thread had just told me UE4 has real-time lighting that simply requires a strong GPU, and that the new lighting can't be disabled.
 

tkscz

Member
Pathetic. I thought this was meant to be some next-gen exclusive high-end stuff. 1 TFLOP is basically nothing; even a pathetic 7770 has 1.3 TFLOPS.

This is the type of gamer you don't want to be.

I thought Epic said so. But I don't remember the exact language. I'm trying to understand if they don't want to do UE4 on Wii-U or if it's impossible. But I agree, if they can do it on mobile phones, I don't know why they couldn't do it on Wii-U.

Epic never said that. They just kept saying high-end UE3 would suit it better and kept talking about how UE3 could look better than the Zelda demo.
 
I thought Epic said so. But I don't remember the exact language. I'm trying to understand if they don't want to do UE4 on Wii-U or if it's impossible. But I agree, if they can do it on mobile phones, I don't know why they couldn't do it on Wii-U.

They never said it wasn't possible. If enough licensees want it on Wii-U, they'll port it.
 

Nirolak

Mrgrgr
Wait, so 1 TFLOP is required for the full feature set, but there is a version that requires less power?

So why can't Wii-U have it? What's the point of the lower "mainstream renderer targeting mass market devices"? Do they mean iPhones and handhelds?

Well, in the question above, they state this:
"-And finally, to scale the technology and its feature set up and down the spectrum of future computing devices, including iOS and Android, and mainstream PC."

Can a reasonable unbiased poster familiar with tech comment on implications for Wii U?

If iOS devices can run a scaled down version, the Wii U should be able to.

That said, the real question is at what point global illumination starts running on the engine, since once that turns off, you lose the benefit of fully realtime development.

But yeah, I see no reason Epic couldn't make the scaled down version run on Wii U if they want to.
 
Neat. I hope PC gamers playing games using it can choose to disable features and get something more like the lighter weight renderer but with the same content, for people who don't buy top of the line stuff, without necessarily making it look like ass, just giving up on some of the extra pretty eye candy like fully dynamic lighting and tessellation. Without requiring the developer to basically offer a different version of the game with that renderer or something.

You've been able to get a 1 teraflop GPU for under $200 for 4 years now (4850), and AMD integrated graphics should be passing that threshold within the year. There's just no need to target a lower specification on PC; I would surmise it's there more for mobile and the Wii U.
 
Pathetic. I thought this was meant to be some next-gen exclusive high-end stuff. 1 TFLOP is basically nothing; even a pathetic 7770 has 1.3 TFLOPS.
Indeed pathetic, how pathetic that those who are pathetic enough to have a pathetic income and pathetic hardware can enjoy games made by the pathetic UE4.
 

Sciz

Member
I'm sure Wii U won't get this, even with this "scaled down version".

The 4850 was pushing a teraflop four years ago, and as far as we know, that's more or less what the Wii U has (unless I'm behind the times on my rumors). There'd be no reason not to support it, unless there ends up being some massive OpenGL incompatibility, which strikes me as unlikely.
 

donny2112

Member
Someone in another thread had just told me UE4 has real-time lighting that simply requires a strong GPU, and that the new lighting can't be disabled.

The fact that the mainstream version of UE4 can disable features for weaker hardware makes that statement you're referring to seem a little short-sighted.
 

schuelma

Wastes hours checking old Famitsu software data, but that's why we love him.
If iOS devices can run a scaled down version, the Wii U should be able to.

That said, the real question is at what point global illumination starts running on the engine, since once that turns off, you lose the benefit of fully realtime development.

But yeah, I see no reason Epic couldn't make the scaled down version run on Wii U if they want to.


Thanks. One more question: does it seem like the more realistic Wii U version would be the scaled down version, or trying to make the full feature set one work?
 

Nirolak

Mrgrgr
I didn't want to scare people away by putting this in the OP, but their approach to Global Illumination is pretty interesting:

NVidia said:
NVidia: Please give us an overview of how the algorithm works from generating the octree, to cone tracing, to the gathering pass.

Tim Sweeney: The technique is known as SVOGI – Sparse Voxel Octree Global Illumination, and was developed by Andrew Scheidecker at Epic. UE4 maintains a real-time octree data structure encoding a multi-resolution record of all of the visible direct light emitters in the scene, which are represented as directionally-colored voxels. That octree is maintained by voxelizing any parts of the scene that change, and using a traditional set of Direct Lighting techniques, such as shadow buffers, to capture first-bounce lighting.

Performing a cone-trace through this octree data structure (given a starting point, direction, and angle) yields an approximation of the light incident along that path.

The trick is to make cone-tracing fast enough, via GPU acceleration, that we can do it once or more per-pixel in real-time. Performing six wide cone-traces per pixel (one for each cardinal direction) yields an approximation of second-bounce indirect lighting. Performing a narrower cone-trace in the direction of specular reflection enables metallic reflections, in which the entire scene is reflected off each glossy surface.
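
To make the cone-tracing idea above a bit more concrete, here's a minimal, self-contained C++ sketch. To be clear, this is not Epic's SVOGI code: it swaps the sparse octree for a toy dense voxel grid with one coarser mip level, and every type and function name in it is made up for illustration.

```cpp
// Minimal sketch of voxel cone tracing as described above. NOT Epic's SVOGI
// implementation: a toy dense voxel grid with one coarser mip stands in for
// the sparse octree, and all names here are hypothetical.
#include <array>
#include <cstddef>
#include <cstdio>
#include <initializer_list>
#include <vector>

struct Vec3 { float x, y, z; };

// One voxel: pre-lit (first-bounce) radiance plus an opacity term.
struct Voxel { Vec3 radiance; float occlusion; };

// One mip level of the toy voxel volume covering the unit cube [0,1]^3.
struct VoxelMip {
    int res = 0;                         // voxels per axis
    std::vector<Voxel> data;
    const Voxel& at(int x, int y, int z) const {
        auto c = [&](int v) { return v < 0 ? 0 : (v >= res ? res - 1 : v); };
        return data[(c(z) * res + c(y)) * res + c(x)];
    }
};

// Pick the mip whose voxel size best matches the cone's current footprint,
// then point-sample it (a real implementation would filter, of course).
static Voxel sample(const std::vector<VoxelMip>& mips, Vec3 p, float footprint) {
    std::size_t level = 0;
    while (level + 1 < mips.size() && footprint > 1.0f / mips[level].res)
        ++level;
    const VoxelMip& m = mips[level];
    return m.at(int(p.x * m.res), int(p.y * m.res), int(p.z * m.res));
}

// March one cone from 'origin' along unit 'dir', widening with 'tanHalfAngle',
// accumulating radiance front to back until the cone is effectively blocked.
static Vec3 coneTrace(const std::vector<VoxelMip>& mips,
                      Vec3 origin, Vec3 dir, float tanHalfAngle) {
    Vec3 accum{0, 0, 0};
    float transmittance = 1.0f;
    float t = 0.01f;                                   // start just off the surface
    while (t < 1.5f && transmittance > 0.01f) {
        float footprint = 2.0f * t * tanHalfAngle;     // cone diameter at distance t
        Vec3 p{origin.x + dir.x * t, origin.y + dir.y * t, origin.z + dir.z * t};
        Voxel v = sample(mips, p, footprint);
        accum.x += transmittance * v.occlusion * v.radiance.x;
        accum.y += transmittance * v.occlusion * v.radiance.y;
        accum.z += transmittance * v.occlusion * v.radiance.z;
        transmittance *= 1.0f - v.occlusion;
        t += 0.5f * footprint;                         // step scales with cone width
    }
    return accum;
}

int main() {
    // Two toy mip levels filled with a constant dim emitter so the sample runs;
    // the real engine voxelizes the directly lit scene into the octree instead.
    std::vector<VoxelMip> mips;
    for (int res : {8, 4}) {
        VoxelMip m;
        m.res = res;
        m.data.assign(std::size_t(res) * res * res, Voxel{{0.2f, 0.2f, 0.2f}, 0.1f});
        mips.push_back(std::move(m));
    }

    // "Six wide cone-traces per pixel, one per cardinal direction" approximates
    // second-bounce diffuse light at this surface point; a narrower cone along
    // the reflection vector would give the glossy reflections mentioned above.
    const std::array<Vec3, 6> dirs{{{1,0,0},{-1,0,0},{0,1,0},{0,-1,0},{0,0,1},{0,0,-1}}};
    Vec3 indirect{0, 0, 0};
    for (const Vec3& d : dirs) {
        Vec3 c = coneTrace(mips, {0.5f, 0.5f, 0.5f}, d, 0.577f);  // ~60 degree cones
        indirect.x += c.x / 6; indirect.y += c.y / 6; indirect.z += c.z / 6;
    }
    std::printf("approx indirect diffuse: %.3f %.3f %.3f\n",
                indirect.x, indirect.y, indirect.z);
    return 0;
}
```

The point of the wider mip levels is the same trick the octree plays: as the cone gets farther from the surface, one coarse sample stands in for many fine voxels, which is what keeps per-pixel tracing affordable on a fast GPU.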
 

BY2K

Membero Americo
So the Wii U will be able to technically run a scaled down version of UE4.

Good to know.
 

japtor

Member
Can a reasonable unbiased poster familiar with tech comment on implications for Wii U?
The way I view it is that UE4 itself is scalable (I really don't understand why that was ever in question) and has a decent chance of being on Wii U, but whatever particular game may or may not be as scalable.
 

Alexios

Cores, shaders and BIOS oh my!
You've been able to get a 1 teraflop GPU for under $200 for 4 years now (4850), and AMD integrated graphics should be passing that threshold within the year. There's just no need to target a lower specification on PC; I would surmise it's there more for mobile and the Wii U.
Ah, cool, I have no idea about that stuff really; I just buy a mid-range card every few years based on real-world performance, and my last was a GTX 285, which probably isn't on par. Then again, UE4 isn't out yet, so maybe I'll be in for my new build by then.
 
The 4850 was pushing a teraflop four years ago, and as far as we know, that's more or less what the Wii U has (unless I'm behind the times on my rumors). There'd be no reason not to support it.

Based on the rumours, it's nowhere near a 4850; even something anemic like a 4670 would be a bit of a stretch based on the latest information from Ubisoft.
 
The 4850 was pushing a teraflop four years ago, and as far as we know, that's more or less what the Wii U has (unless I'm behind the times on my rumors). There'd be no reason not to support it, unless there ends up being some massive OpenGL incompatibility, which strikes me as unlikely.



From what I heard, Wii U has a nearly 0.5 TFLOPS GPU.
 

Nirolak

Mrgrgr
Thanks. One more question: does it seem like the more realistic Wii U version would be the scaled down version, or trying to make the full feature set one work?

To be honest, we would really need technical specs to know that. The full feature set engine basically requires DirectX 11 grade hardware and 1 TFLOP of performance.

Based on the leaks from that thread, the DX11 part might be true, though it's not 100% clear.

If we assume Ubisoft is being truthful though, that would be 250 GFLOPS from Xenos * 1.5 = 375 GFLOPs. It's not a slouch by any means, but that is kind of far from the 1 TFLOPS line.

If Ubisoft was understating and the real thing is closer to like 800+ GFLOPS, I feel attempting a full port would be much more reasonable.
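
For anyone trying to follow the arithmetic, here's a minimal sketch of where these numbers come from. The peak-GFLOPS formula (ALUs x 2 ops per clock for a fused multiply-add x clock speed) is the generic back-of-the-envelope calculation, not anything from Epic, and the Wii U line just reproduces the 250 x 1.5 estimate above.

```cpp
// Back-of-the-envelope FLOPS math behind the figures quoted in this thread.
// Peak GFLOPS = ALUs x 2 FLOPs per fused multiply-add x clock (GHz).
#include <cstdio>

// Peak single-precision GFLOPS for a GPU issuing one FMA per ALU per clock.
static double peakGflops(int alus, double clockGhz) {
    return alus * 2.0 * clockGhz;   // FMA = multiply + add = 2 FLOPs
}

int main() {
    // Radeon HD 4850: 800 stream processors at 625 MHz -> the "1 teraflop for
    // under $200" card mentioned earlier in the thread.
    std::printf("HD 4850: %.0f GFLOPS\n", peakGflops(800, 0.625));

    // The Wii U estimate in the post above: the Xenos figure used there times
    // the rumoured 1.5x multiplier attributed to Ubisoft.
    double xenos = 250.0;           // figure quoted in the post
    std::printf("Rumoured Wii U: %.0f GFLOPS\n", xenos * 1.5);
    return 0;
}
```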
 
Ah, cool, I have no idea about that stuff really; I just buy a mid-range card every few years, and my last was a GTX 285, which probably isn't on par. Then again, UE4 isn't out yet, so maybe I'll be in for my new build by then.

A GTX 285 was the highest end card of its time, not exactly midrange.
 

schuelma

Wastes hours checking old Famitsu software data, but that's why we love him.
The way I view it is that UE4 itself is scalable (I really don't understand why that was ever in question) and has a decent chance of being on Wii U, but whatever particular game may or may not be as scalable.

Yeah that is kind of how I read it as well.
 

monome

Member
When it comes to mobile UE4, he's talking about stuff that he thinks will be better than the Wii U in just 2-3 years from now.
No UE4 for the next 3 iPhones.

Yet if the Wii U is fully DirectX 11 capable, it would be a shame.
 

gofreak

GAF's Bob Woodward
The way I view it is that UE4 itself is scalable (I really don't understand why that was ever in question) and has a decent chance of being on Wii U, but whatever particular game may or may not be as scalable.

I think that's the bigger question for Wii-U, whether games built on a 'full-fat' UE4 targeting other next-gen consoles will be easily portable to the 'lite' version running on a whole other class of hardware. Assuming devs use the former as their base target that might be tricky in the general case...I wouldn't be holding my breath.
 

japtor

Member
From what I heard, Wii U has a nearly 0.5 TFLOPS GPU.
Yeah I vaguely remember something around 500-600 Gflops being a semi consensus in the speculation threads. Then there's been speculation about mystery enhancements (that wouldn't count towards the Gflop count) that no one really has a clue about, like if they even exist, and if they do, how they work.
 
The way I view it is that UE4 itself is scalable (I really don't understand why that was ever in question) and has a decent chance of being on Wii U, but whatever particular game may or may not be as scalable.

The main point that Epic has been making with UE4 is its ease of development, and a lot of that hinges on quick iteration, which relies on completely realtime lighting. Once you go too far down in specs you'll have to turn that off, which removes a big selling point of the engine, and that's something Epic obviously wants to avoid. So it isn't just about wanting to support low-end machines (e.g. Wii U), but about whether supporting them hurts the major selling point of the engine.
 