Quote:
Why is the PS4 getting all the blame for the removal of SVOGI and the overall degradation? I'm sure the rumored 1.2TF GPU in the Durango is going to match 1-to-1 with the PC demo.

How can a non-existent console be blamed?
Quote:
You have no idea what you're going on about, do you?

I have published papers at international conferences dealing with performance analysis on GPUs. But maybe the peer reviewers failed to recognize my total cluelessness.
Quote:
Where do you think they get the initial flop ratings for their ALUs? Out of thin air?

No, they get them from how the hardware is designed. If one ALU is designed to retire N floating point operations per cycle, and there are M of them, then you get N*M theoretical FLOPs per cycle. It's not rocket science.
Quote:
And Windows overhead is at the thread/process level. You are bordering on cluelessness.

Please enlighten me as to the impact of this thread/process-level overhead. I currently have a student investigating the overhead of thread- and process-level synchronization primitives on Windows; I'm sure he'd be thrilled to compare notes with you.
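(An aside for anyone following the N*M point above: the widely quoted TFLOPS figures fall straight out of that multiplication. A minimal sketch, using the commonly cited shader counts and clocks purely as an illustration; a fused multiply-add retires 2 FLOPs per cycle.)

```python
# Theoretical peak = ALUs x FLOPs retired per ALU per cycle x clock (GHz).
# A fused multiply-add counts as 2 floating-point ops per cycle.
def peak_tflops(alus, flops_per_cycle, clock_ghz):
    return alus * flops_per_cycle * clock_ghz / 1000.0

# Commonly cited specs, used here only to illustrate the formula:
print(peak_tflops(1152, 2, 0.800))  # PS4-class GPU: ~1.84 TFLOPS
print(peak_tflops(1536, 2, 1.006))  # GTX 680 at base clock: ~3.09 TFLOPS
```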
Quote:
Well, really, you don't "code to the metal"; it's just a much lower-level API on consoles. Coding against a low-level API and a single spec together really makes all the difference.

http://www.bit-tech.net/hardware/graphics/2011/03/16/farewell-to-directx/ <- really, the whole article is a good read.

But John Carmack is really the one who has talked for years about how bad the APIs on PC are...

http://www.pcper.com/reviews/Editor...-Graphics-Ray-Tracing-Voxels-and-more/Intervi

Just think about it: in PC gaming, a driver update alone can give you a 30% performance improvement in a game.

Idk how many times these interviews have been posted, but thank you for posting them again. I cringe every time someone writes coding to the metal off as a mere myth.
The 680 is more than double the performance of the GPU in the PS4. Epic's engine is also DX-based, while the PS4 API is not. I wouldn't be shocked to see this demo run better on X720 hardware just based on that, at least until Epic has more time to port the engine over. Like Epic said, their engines aren't ready until they release a game on that platform.
There is emotional investment on both sides
I'd really like to see what you've read.
That's patently and fundamentally false. If it's been mentioned so much, I'd love to see some posts that challenge it. I'm gonna just go with: very few people here and elsewhere know exactly what "coding to the metal" is, and even fewer understand its implications for performance.
Why is the PS4 getting all the blame for the removal of SVOGI and the overall degradation? I'm sure the rumored 1.2TF GPU in the Durango is going to match 1-to-1 with the PC demo.
It's not like UE4 is the only thing happening on PS4. Doesn't Enlighten Engine support dynamic radiosity on PS4?
PlayStation®4 promises a dramatic increase in memory and compute resource, which will free developers to unleash the full creative power of dynamic lighting and to finally extend cinematographic film practices to dynamic immersive worlds.
Sure, it also offers the option to go for prebaked lightmaps (or a combo), but I definitely believe we will see some kind of realtime GI on PS4 (most likely in the form of a prebaked/dynamic-in-certain-areas combo).
kagemaru said:
Funny thing is the bolded applies to you very well.

More often than not, actual coding to the metal is more time-consuming than the performance gain is worth. Otherwise we would see entire PS360 games written in assembly instead of DX or LibGCM on the 360 and PS3 respectively (not that that would really be feasible anyway).
Quote:
You know Durante is the guy who fixed Dark Souls for the entire planet, right?

Maybe? IDK. You've proven nothing of your knowledge to me, so my assumptions are what they are. *shrugs*
Also, Bethesda will use id Tech 5 for its future titles.
I have no idea why Epic's engines are so popular.
The price they ask for its use on an AAA game is astronomical, unless of course they have deals for publishers outside their standard pricing, which may be the case.
For those that don't know, they want 25% of everything you make over 50k.
On a game that sells 3 or 4 million copies, that is a HUGE amount of money.
On topic, I'm surprised it stacks up against it so well, not disappointed that it doesn't match an expensive PC in performance.
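(To put rough numbers on the quoted terms above: a minimal sketch, where the per-copy net revenue is purely an assumed figure for illustration, and, per the reply further down, these are the indie terms rather than the publisher-level deal.)

```python
# Royalty under the quoted indie-style terms: 25% of everything over $50k.
copies_sold = 3_000_000
net_revenue_per_copy = 30.0  # assumed figure, for illustration only
revenue = copies_sold * net_revenue_per_copy
royalty = 0.25 * max(revenue - 50_000, 0)
print(f"${royalty:,.0f}")    # -> $22,487,500 on this assumed revenue
```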
Do the PC gamers who claim superiority over consoles *due to graphics* realize that PCs and consoles don't share the same software libraries?
I haven't done anything cool, but yeah, it means nothing to me.
Once Durango specs are confirmed, it can share the blame.
The engine is well documented and many devs were trained on it already.
And you are looking at their indie licensing deal, not the actual publisher-level one.
Quote:
Yeah, no. And as far as I know, only Sony's first party has done any meaningful work "to the metal" this generation. And they released a lot of their work to share.

They had no choice. The PS3 is/was a development nightmare. Epic Games and 343 did awesome work on the 360 late in its life as well.
...console gamers would expect, and doesn't quite tally when other developers are talking about PS4 out-powering most PCs for years to come. But it is important to put all of this into context. The DirectX 11 API is very mature while the PS4 tools and APIs are still in their initial stages of development - it's going to take time for devs to fully get to grips with the new hardware.
Quote:
I'm fairly certain that the UE4 engine will be used by even more devs next gen than UE3 was/is used for current gen. It is the de facto third-party game engine in the industry. If it doesn't have global illumination enabled for performance reasons, more than 50% of games won't have it.

There are other ways to get global illumination in games. And there's nothing stopping Epic from releasing new updates to the engine, like they always have done.
No-one says it is impossible, but maybe it is just impractical.
Yeah, no. And as far as I know, only Sony's first party has done any meaningful work "to the metal" this generation. And they released a lot of their work to share.
Quote:
Of course it's how the hardware is designed. Again, it doesn't come out of thin air. If you design an ALU to perform, say, a multiply-add, do you think they just take a leap of faith and say "welp, it's designed that way, it should work, next"? Or do you think they go through an extensive testing regime to make sure this ALU can do these three flops within the clock at an acceptable rate?

If you think they have to go through a "testing regime" for the ALUs by "feeding MADDs" to it once they have an entire GPU on their hands to see if it can "do these three flops within the clock at an acceptable rate" then I really have nothing more to say to you.
Quote:
Send him my way. We could discuss thread-level parallelism and some of Windows' pitfalls in that area: scheduling deficiencies, or execution delays, or overall thread management. There's a whole world to explore.

So can you point out any particular deficiencies regarding threads in Windows that lead to the massive overheads you describe, and which are not easily circumvented at the application level in a game setting? Or does your knowledge only extend to throwing around buzzwords?
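(For what it's worth, a minimal sketch of the kind of micro-benchmark such a project would start from: timing an uncontended lock, in Python purely as an illustration. A real study would use native threads and contended cases, and account for scheduler effects.)

```python
import time
import threading

# Time an uncontended acquire/release of a lock, the simplest possible
# probe of synchronization-primitive overhead.
def bench(iterations=1_000_000):
    lock = threading.Lock()
    start = time.perf_counter()
    for _ in range(iterations):
        lock.acquire()
        lock.release()
    elapsed = time.perf_counter() - start
    print(f"{elapsed / iterations * 1e9:.0f} ns per acquire/release pair")

bench()
```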
Would developers be able to turn all these fancy effects on for the PC ports of UE4 games?
Quoting myself because I'm really curious. If someone makes a ps720 game with UE4, could the developer turn on all the fancy effects easily for PC, or doesn't that make sense?
Quote:
Quoting myself because I'm really curious. If someone makes a ps720 game with UE4, could the developer turn on all the fancy effects easily for PC, or doesn't that make sense?

Not really, no. There's a significant difference in asset creation between using a fully realtime lighting model and one with prebaked lightmaps.
It seems like SVOGI is out of the engine completely as of right now. Other effects will still be available on PC.
So does this mean we'll have to wait another console cycle to take advantage of technology that was developed years ago?
You have no idea what you're going on about, do you?
Where do you think they get the initial flop ratings for their ALUs? Out of thin air?
And Windows overhead is at the thread/process level. You are bordering on cluelessness.
Bwahaha.
You couldn't have picked a worse person to say that to.
Quote:
You heard it here first, guys: PC optimisation brings PC ahead of consoles.

Really? As a developer, this didn't sound like pandering at all. The DirectX 11 spec has been solidified for quite a while now, while the PS4 graphics library is still in flux. SDKs continually get updates, and new functions are added, removed, deprecated, and improved. SDK tools are always in development and get new features to better help developers create more efficient engines. You may mock, but the reality is that this is actually quite true.
As usual from Digital Foundry, this entire article reads like a whole lot of pandering to the console-only crowd.
Can't wait to see them test all their games on a 7850 on PC four years into next gen to have a "fair" comparison, while the rest of us are playing at 120Hz 1600p on GPUs 6-10x better.
Quote:
What I really dislike about it is that discussing the relative "knowledge" of posters is completely meaningless as to the actual argument. But when someone doesn't really present any argument and only attacks a post by doubting your knowledge of the subject matter, it's hard to make a factual counterpoint.

Was thinking the same thing.
Quote:
Really? As a developer, this didn't sound like pandering at all. The DirectX 11 spec has been solidified for quite a while now, while the PS4 graphics library is still in flux. SDKs continually get updates, and new functions are added, removed, deprecated, and improved. SDK tools are always in development and get new features to better help developers create more efficient engines. You may mock, but the reality is that this is actually quite true.

Yeah, I also think that DF evaluations are generally very fair. I can't agree with their choice of PC hardware for cross-platform evaluations, but that doesn't factor into the equation here.
Quote:
Did you seriously just try to tell Durante he doesn't know what he's talking about....

To be honest, who cares? At the end of the day, 8GB of GDDR5 RAM will ensure that PC doesn't catch up to consoles for years.
Fucking lol.
I wonder how many of the console gamers in this thread actually know what SVOGI stands for.
Without consoles you still wouldn't have your true dynamic lighting in UE4 either.
Quote:
To be honest, who cares? At the end of the day, 8GB of GDDR5 RAM will ensure that PC doesn't catch up to consoles for years.

Still loving how certain PC extremists are blaming consoles... someone mentioned SVOGI was taking more than half the resources of a GTX 680; it isn't practical for the market this year. Medium- and lower-end cards can't run it... why continue to blame the PS4? I mean, it isn't realistic for everyone to have a GTX 680 in their PC today... I am sure two years down the line Epic will enable the feature for PC gamers.

They HAVE to keep the minimum required specs in mind for PC games.

edit: look at how UE3 was upgraded over the years, especially in the lighting department.
Quote:
To be honest, who cares? At the end of the day, 8GB of GDDR5 RAM will ensure that PC doesn't catch up to consoles for years.

Hey, someone finally gets it!