It is probably very common, it's just that this time you can see a before and after that highlights it.
And frankly they didn't do a very good job.
It is probably very common, it's just that this time you can see a before and after that highlights it.
Seriously this needs to stop. 14+4 split was debunked.
PlayStation 4 is currently around 50 per cent faster than its rival Xbox One. Multiple high-level game development sources have described the difference in performance between the consoles as “significant” and “obvious.”
Our contacts have told us that memory reads on PS4 are 40-50 per cent quicker than Xbox One, and its ALU (Arithmetic Logic Unit) is around 50 per cent faster. One basic example we were given suggested that without optimisation for either console, a platform-agnostic development build can run at around 30FPS in 1920×1080 on PS4, but it’ll run at “20-something” FPS in 1600×900 on Xbox One. “Xbox One is weaker and it’s a pain to use its ESRAM,” concluded one developer.
Microsoft is aware of the problem and, having recently upped the clock speed of Xbox One, is working hard to close the gap on PS4, though one developer we spoke to downplayed the move. “The clock speed update is not significant, it does not change things that much,” he said. “Of course, something is better than nothing.”
What it does not cover is what happens once these builds are massaged to use each console's specific strengths. What happens once the dedicated hardware in the Xbox is put to use, what happens when the developer actually thinks about their render pipeline and starts to use the ESRAM, etc. Likewise when developers start to experiment with the more abundant GPU on the PS4 for extra GPGPU.
At the moment developers are having a hard time: they have had poor drivers and non-final specs, and they are targeting more consoles than ever, since they often have 360 and PS3 ports on the go as well.
Yes, it would appear the Xbox takes slightly more work than porting a PC build to it, but as the console game is all about refinement and optimisation, this is not a bad thing, since developers will be doing this anyway.
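As a quick sanity check on the figures quoted in the article above, here is the raw pixel throughput each unoptimised build would be pushing. The article only says "20-something" FPS for the Xbox One build, so 25 is assumed here purely for illustration:

```python
# Pixels per second for each quoted unoptimised build.
# Note: the 25 FPS figure is an assumption; the article only says
# "20-something" with no exact number.

ps4_px_per_sec = 1920 * 1080 * 30    # 1080p at 30 FPS
xbone_px_per_sec = 1600 * 900 * 25   # 900p at an assumed 25 FPS

ratio = ps4_px_per_sec / xbone_px_per_sec
print(ratio)  # ~1.73, in the same ballpark as the claimed gap
```

So taken at face value, the quoted builds differ by roughly 70% in raw pixel throughput, which is at least consistent with the "40-50 per cent quicker" claims elsewhere in the article.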
Not implying that there is a HW limitation or that there's a 14+4 split, but 14 is probably all you would need for graphics. The remaining 4 would have to be used for something else if you did not want them to sit idle.
The finer point is that the 50% TFLOP difference is quoted in the context of graphics, and there is not going to be a 50% visual difference if you are not using all 18 CUs for graphics.
There is a related quote from Cerny here - http://av.watch.impress.co.jp/docs/series/rt/20130325_593036.html - where he seems to imply the use of 8 for non-GPU tasks. Not sure whether the number 8 refers to CUs, though. But a summary of the same article here - http://**************/forums/topic/...th-your-real-name-your-legal-name-is-your-id/ - seems to imply so. Now the summary says a 10+8 split, but I think Cerny cites the usage of 8 only as an example (use Google Translate on the original interview; it's almost gibberish, but still).
Not implying that there is a HW limitation or that there's a 14+4 split, but 14 is probably all you would need for graphics. The remaining 4 would have to be used for something else if you did not want them to sit idle.
The finer point is that the 50% TFLOP difference is quoted in the context of graphics, and there is not going to be a 50% visual difference if you are not using all 18 CUs for graphics.
There is a related quote from Cerny here - http://av.watch.impress.co.jp/docs/series/rt/20130325_593036.html - where he seems to imply the use of 8 for non-GPU tasks. Not sure whether the number 8 refers to CUs, though. But a summary of the same article here - http://**************/forums/topic/...th-your-real-name-your-legal-name-is-your-id/ - seems to imply so. Now the summary says a 10+8 split, but I think Cerny cites the usage of 8 only as an example (use Google Translate on the original interview; it's almost gibberish, but still).
So now we're moving from 14+4 to 10+8. Goodie.
This fellow explained it nicely. You don't dedicate entire CUs to GPGPU; it's all scheduled to run by the PS4 when resources are available. EDIT: Or the guy above me can find a quote that explains it in further detail, heh.
Not implying that there is a HW limitation or that there's a 14+4 split, but 14 is probably all you would need for graphics. The remaining 4 would have to be used for something else if you did not want them to sit idle.
The finer point is that the 50% TFLOP difference is quoted in the context of graphics, and there is not going to be a 50% visual difference if you are not using all 18 CUs for graphics.
There is a related quote from Cerny here - http://av.watch.impress.co.jp/docs/series/rt/20130325_593036.html - where he seems to imply the use of 8 for non-GPU tasks. Not sure whether the number 8 refers to CUs, though. But a summary of the same article here - http://**************/forums/topic/...th-your-real-name-your-legal-name-is-your-id/ - seems to imply so. Now the summary says a 10+8 split, but I think Cerny cites the usage of 8 only as an example (use Google Translate on the original interview; it's almost gibberish, but still).
There's no split. The VGLeaks 14+4 claim was debunked. There are just 18 CUs.
I think you're confusing this with the 8 ACEs in the PS4, which are used for GPGPU. GPGPU works in tandem with typical GPU rendering; it's not one or the other. GPGPU compute tasks are fed to the GPU when it has spare cycles.
W!CKED did a good summary of how GPGPU in the PS4 works in another thread. The Xbox One will work the same way, but in a far more limited fashion: not only does the One have fewer CUs, it only has 2 x ACE and 2 x compute queues (compared to 8 x ACE and 64 x compute queues in the PS4).
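To make the queue arithmetic in the post above explicit (ACE = Asynchronous Compute Engine; each ACE fronts some number of compute queues). All counts here are the ones quoted in the thread, not verified spec-sheet values:

```python
# Compute-queue comparison using the figures quoted in the thread.
# These numbers are as-posted, not independently verified.

ps4_aces = 8
ps4_queues = 64      # quoted as 8 ACEs x 8 queues each
xbone_aces = 2
xbone_queues = 2     # quoted as 2 ACEs, 2 compute queues

print(ps4_queues // ps4_aces)      # queues per ACE on PS4: 8
print(ps4_queues // xbone_queues)  # 32x more compute queues overall
```

More queues means more independent streams of compute work the hardware can pick from when it has spare cycles, which is the whole point of the "fed to the GPU when it has spare cycles" description above.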
Sounds like the anonymous dev needs to brush up on his ESRAM skills.
Dev(s), as in several of them. And no, it won't be easy getting parity with PS4. There's a lot of redundancy in using the 32 MB as a frame buffer, which will eat away at the high bandwidth.
As for the DDR3: only 68 GB/s, minus OS needs, minus CPU needs = not much left for textures.
The OS doesn't consume bandwidth like that.
The first hand reports from the Toronto FanExpo in late August noted terrible jaggies in KI. They may not have been calling the res right there, but people were noticing it.
It was about a week after this show that it was confirmed as 720p
Confirmed 1080p native.
So is Driveclub 720p as well then?
If sales for the Xbox are low (can't see it, though), then developers won't bother developing for the system; Wii U syndrome...
Any game developed won't be held back by the Xbox being weaker then.
Ignored this thread mostly because I thought I could guess how stupid it could become. Then, out of curiosity, I poke my head in and find dudes blowing up a single frame of a pic to prove it was Photoshopped to hide clipping in an Xbone game. I don't know if it was or wasn't (if I had to guess, I'd say it was), but either way it doesn't make a lick of difference about anything at all. That being said, I know someone believes there's some scandal or winning going on here, so by all means continue.
This is a number we definitely need announced from Sony, because with the architecture of the box it's natural for its GPU to pick up a lot of the work for the OS and other miscellaneous tasks performed by the game. I honestly feel like they're holding back on a lot of this information to fuel wars like this.
Not implying that there is a HW limitation or that there's a 14+4 split, but 14 is probably all you would need for graphics. The remaining 4 would have to be used for something else if you did not want them to sit idle.
The finer point is that the 50% TFLOP difference is quoted in the context of graphics, and there is not going to be a 50% visual difference if you are not using all 18 CUs for graphics.
There is a related quote from Cerny here - http://av.watch.impress.co.jp/docs/series/rt/20130325_593036.html - where he seems to imply the use of 8 for non-GPU tasks. Not sure whether the number 8 refers to CUs, though. But a summary of the same article here - http://**************/forums/topic/...th-your-real-name-your-legal-name-is-your-id/ - seems to imply so. Now the summary says a 10+8 split, but I think Cerny cites the usage of 8 only as an example (use Google Translate on the original interview; it's almost gibberish, but still).
If sales for the Xbox are low (can't see it, though), then developers won't bother developing for the system; Wii U syndrome...
Any game developed won't be held back by the Xbox being weaker then.
Jesus Christ. As someone who has worked at a major game studio: if you don't think this 'photoshopping' issue you guys are hung up on happens at every studio, I've got a bridge in Brooklyn to sell you.
Hahahaha, wow, just imagine how great the PS2 generation would have been if the Xbox had massively outsold the PS2, then!
If sales for the Xbox are low (can't see it, though), then developers won't bother developing for the system; Wii U syndrome...
Any game developed won't be held back by the Xbox being weaker then.
So they're capable of storing an extra 30fps in the cloud?
Forza 5 90fps confirmed (online only).
No, I am not saying to that degree, but with a launch game like Forza 5 having cloud-calculated AI at launch, you will get cloud-controlled AI enemies in FPSs like Halo 5 in the future. It will go further each time.
Confirmed 1080p native.
This is a number we definitely need announced from Sony, because with the architecture of the box it's natural for its GPU to pick up a lot of the work for the OS and other miscellaneous tasks performed by the game. I honestly feel like they're holding back on a lot of this information to fuel wars like this.
Now when I say that many people say, "but we want the best possible graphics". It turns out that they're not incompatible. If you look at how the GPU and its various sub-components are utilised throughout the frame, there are many portions throughout the frame - for example during the rendering of opaque shadowmaps - that the bulk of the GPU is unused. And so if you're doing compute for collision detection, physics or ray-casting for audio during those times you're not really affecting the graphics. You're utilising portions of the GPU that at that instant are otherwise under-utilised. And if you look through the frame you can see that depending on what phase it is, what portion is really available to use for compute.
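What Cerny describes in the quote above can be sketched as a toy model: a frame is a sequence of phases, each leaving some CUs idle, and async compute soaks up that spare capacity. The phase names, durations and idle counts below are all invented for illustration; this is not real PS4 scheduling code:

```python
# Toy model of async compute filling GPU idle time within a frame.
# All phase durations and idle-CU counts are invented numbers.

def spare_cu_ms(phases):
    """Total idle CU-milliseconds across one frame."""
    return sum(ms * idle for _, ms, idle in phases)

# (phase, duration_ms, idle_CUs) for a hypothetical ~16.6 ms frame
frame = [
    ("opaque shadowmaps", 2.0, 14),  # bulk of the GPU unused, per the quote
    ("g-buffer fill",     6.0, 2),   # almost fully occupied by graphics
    ("lighting",          4.0, 4),
    ("post-processing",   4.6, 6),
]

budget = spare_cu_ms(frame)
print(budget)  # CU-ms free for physics, collision, audio ray-casts
```

The point of the model: the compute budget comes from gaps the graphics workload leaves behind in each phase, not from walling off a fixed set of CUs.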
This is a number we definitely need announced from Sony, because with the architecture of the box it's natural for its GPU to pick up a lot of the work for the OS and other miscellaneous tasks performed by the game. I honestly feel like they're holding back on a lot of this information to fuel wars like this.
Stop trying to spread FUD. Again, there is no hard split on CUs dedicated to GPGPU tasks.
From Cerny's own mouth:
Again, resources do not have to be taken away from graphical capabilities to utilize GPGPU functionality.
This is a number we definitely need announced from Sony, because with the architecture of the box it's natural for its GPU to pick up a lot of the work for the OS and other miscellaneous tasks performed by the game. I honestly feel like they're holding back on a lot of this information to fuel wars like this.
How much of the GPU is reserved for the OS?
Unless MS magically creates a solution to minimize latency or make it a non-issue, you won't see cloud implementation for real-time AI in any meaningful way.
No, I am not saying to that degree, but with a launch game like Forza 5 having cloud-calculated AI at launch, you will get cloud-controlled AI enemies in FPSs like Halo 5 in the future. It will go further each time.
The main job of the OS is to control processes; it's not the kind of workload that can be done on a GPU. Also, if a game needs 4 CUs for compute and 14 CUs for graphics, it doesn't mean 4 CUs are off-limits to graphics. GPUs don't work like that. It means that out of each second, 4/18 of the time will be used for compute while 14/18 will be used for graphics.
If a multiplat game needs 4 CUs' worth of compute, you can be sure the Xbone version will need 4 CUs for compute as well, which means the Xbone will have 8 CUs' worth left for graphics while the PS4 has 14.
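Spelling out the arithmetic in the post above: assume a multiplat game time-slices the equivalent of 4 CUs for compute on both consoles. The 18 vs 12 CU totals are the figures used throughout this thread, repeated here as-is:

```python
# The post's reasoning as arithmetic: equal compute budgets on both
# consoles, remainder available for graphics. CU totals are the
# thread's figures, not verified spec-sheet values.

PS4_TOTAL, XBONE_TOTAL = 18, 12
COMPUTE = 4  # same compute budget assumed for both versions

ps4_graphics = PS4_TOTAL - COMPUTE      # 14 CUs' worth for graphics
xbone_graphics = XBONE_TOTAL - COMPUTE  # 8 CUs' worth for graphics

print(ps4_graphics, xbone_graphics)     # 14 8
print(ps4_graphics / xbone_graphics)    # 1.75x graphics budget
```

Note how a fixed compute cost widens the relative gap: 18 vs 12 is a 1.5x difference, but after subtracting the same 4-CU budget it becomes 14 vs 8, or 1.75x.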
Is this where you get told to stop spreading FUD?
Do you know what the PS4 reserves?
Do you know what the PS4 reserves?
Why should the PS4 reserve any non-trivial GPU time for anything other than the game when the game is the only application rendering graphics during gaming? The XBO runs snapped Metro apps concurrently with the actual games, and since Metro is a hardware-accelerated UI and the WinRT-based apps have access to rendering capabilities, GPU time reservation is necessary. This does not apply to the PS4 which simply doesn't have the snap feature. As others already said, background tasks are much better put on the CPU.
At a guess, I'd say the same as PS3 for popup notifications etc. probably sub 1%?
No, I am not saying to that degree, but with a launch game like Forza 5 having cloud-calculated AI at launch, you will get cloud-controlled AI enemies in FPSs like Halo 5 in the future. It will go further each time.
Is this where you get told to stop spreading FUD?
Do you know what the PS4 reserves?
In what sense am I spreading FUD? Don't start taking things personally now.
2 CPU cores for the OS and 2.5 ~ 2 GB of RAM is all we know so far. And the PS4 doesn't support snap, which means games can be suspended while using the PS4's UI. So I don't think there'll be as much of a hit on GPU time compared to the Xbone, which needs to render the game and the OS UI at the same time.
pls stop this cloud pr regurgitation bs
It's not personal
The main job of the OS is to control processes; it's not the kind of workload that can be done on a GPU. Also, if a game needs 4 CUs for compute and 14 CUs for graphics, it doesn't mean 4 CUs are off-limits to graphics. GPUs don't work like that. It means that out of each second, 4/18 of the time will be used for compute while 14/18 will be used for graphics.
If a multiplat game needs 4 CUs' worth of compute, you can be sure the Xbone version will need 4 CUs for compute as well, which means the Xbone will have 8 CUs' worth left for graphics while the PS4 has 14.
The following is my original post. Please point out where I'm spreading FUD.
All of it is an assumption on how you think it works, none of it is accurate.
Dude. There is no such thing as "14 is all you need". No such thing. Devs will use whatever they can to bring the most to their games. It is never a case of "good enough", but a case of "how far can we go".
Not implying that there is a HW limitation or that there's a 14+4 split, but 14 is probably all you would need for graphics. The remaining 4 would have to be used for something else if you did not want them to sit idle.
The finer point is that the 50% TFLOP difference is quoted in the context of graphics, and there is not going to be a 50% visual difference if you are not using all 18 CUs for graphics.
There is a related quote from Cerny here - http://av.watch.impress.co.jp/docs/series/rt/20130325_593036.html - where he seems to imply the use of 8 for non-GPU tasks. Not sure whether the number 8 refers to CUs, though. But a summary of the same article here - http://**************/forums/topic/...th-your-real-name-your-legal-name-is-your-id/ - seems to imply so. Now the summary says a 10+8 split, but I think Cerny cites the usage of 8 only as an example (use Google Translate on the original interview; it's almost gibberish, but still).
The statement that you don't need to assign a fixed number of CUs to a category of tasks is accurate. GPU threads and thread groups are hardware-managed.
There's more to it than that.
I really can't say any more than that.
PS4 has secret sauce, lol.
Must have bugged the heck out of you, since you posted all that. You would be better off observing.
Ignored this thread mostly because I thought I could guess how stupid it could become. Then, out of curiosity, I poke my head in and find dudes blowing up a single frame of a pic to prove it was Photoshopped to hide clipping in an Xbone game. I don't know if it was or wasn't (if I had to guess, I'd say it was), but either way it doesn't make a lick of difference about anything at all. That being said, I know someone believes there's some scandal or winning going on here, so by all means continue.