Especially when there's video that could have been used to argue the same point.
It was a pretty pathetic lie, to be honest. Hell, who even lies about that kind of shit? It's ridiculous.
I demand to know your sources!
The PS4 version of SF5 will have blue shadows!
Just out of interest, how come there has been so much focus on memory bandwidth anyway? Is it because both companies have taken different approaches? Is Microsoft intentionally trying to distract us from the fact that their $100 more expensive console has a far inferior GPU?
PS4 has:
50% more Shader units - 768 v 1152
50% more Compute units - 12 v 18
50% more Texture mapping units (TMUs) - 48 v 72
100% more Render output units (ROPs) - 16 v 32
Even if the PS4 & Xbox One had the exact same memory bandwidth/subsystem, there would still be a huge disparity in GPU power for two consoles of the same generation (which are supposedly going head to head).
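Those percentages do check out against the raw counts. A quick sanity check, using the numbers exactly as quoted above:

```python
# Sanity-check the percentage deltas implied by the unit counts
# quoted above (Xbox One vs PS4 figures as given in the post).
specs = {
    "Shader units":  (768, 1152),
    "Compute units": (12, 18),
    "TMUs":          (48, 72),
    "ROPs":          (16, 32),
}
for name, (xb1, ps4) in specs.items():
    delta = (ps4 - xb1) / xb1 * 100
    print(f"{name}: PS4 has {delta:.0f}% more")
```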
Dreamcast had a real VGA adapter and a modem years before other consoles did. It was dramatically ahead of its time. It's sad what happened to the DC, though a great deal of it was Sega's own fault.
It's amazing, the universal love for the Dreamcast. Damn, I miss Sega.
Where did you get the TMU numbers from?
If you have 10 apples and previously had 5 apples, you have 100% more apples than previously. 50% more apples than an original amount of 5 apples is 7.5 apples. Perhaps the comment was ignored because it was based on a false premise.
As the general consensus is still that this is bunk, and apparently a post I made a few pages ago was ignored, I'll venture a repost for the sake of on-topic discussion: This being from the starting point, not a before-after comparison of the total performance (if I have 5 apples and I double my apples to 10, it's a 100% increase in apples but also only 50% more apples when compared to the original amount.)
Alright if last gen was the move to HD, I want this gen to be the move to high IQ. I don't want to see aliasing anywhere. It ruins high quality visuals so much. Those 6 extra CUs on the PS4 better at least give me extra high IQ on games designed to not look too gimped on Xbox One.
So the same guy went into a little more detail about how, sometimes, DDR3+eSRAM can be better than GDDR5. He may be full of shit, but he sounds knowledgeable.
So does this clear anything up or just muddy the waters further?
This is up to developers to an extent; you should really get into PC gaming if IQ is such a significant priority.
I do play PC games. It lets me know how much poor IQ ruins console games. TTT2
I remember playing one of the NFL2K games online. Me and my brother on the same console, taking on all comers.
Remember playing Virtua Tennis over that 56k connection!!!!
And that sexy-ass controller with pressure triggers was so revolutionary back then. Awesome for baseball games.
Oops, crap. Didn't see you posted this and posted his whole blurb. He does specifically say it's better for some things, but I'm wondering if that's true. Only because I've read everywhere that Sony's solution is not only more powerful, but more elegant. But I'm no techie.
If it turns out Xbox has the better GPU solution again, I think a lot of folks will be munching on crow.
Oh joy. I never said it was better, just making the best use of what it's got. It's more "efficient" at what it's got to play with.
Better? That's pie-in-the-sky stuff.
Yeah, then you know that it's a trade off situation, more graphics on the screen vs higher IQ.
I'd love for games to come with a priority option you can set in the settings menu.
Even though I'm not a techie, I love reading this stuff. Thank you!
The scenario posted omits a couple of things: caches and memory controllers.
Looking at the GCN whitepaper, you'll see a block with Z$, C$ and Color ROP/Z Stencil ROP units. The Z$ is the depth buffer cache and the C$ is the color buffer cache.
It's only in the case of a miss (and when flushing) that you'll actually need to go out to main memory (or eSRAM) to fetch whatever information you need, and I hope the scheduling is smart enough to pipe fragment data from wavefronts to ROP groups that have already been working on that portion of the screen, so as to minimize reading/flushing of the caches.
Also, in the case of depth (depending on operating mode) the rasterizer can cull entire triangles based on early-Z, and this process is sped up by having a hierarchically compressed Z buffer for fast min/max depth on a region.
Lastly, as the diagrams in the whitepaper show, there are several individual memory controllers. I don't know much about how memory controllers work, but I'd assume that they can each individually read or write on a given cycle (please correct me if I'm wrong here).
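To make the hierarchical-Z idea above concrete, here's a toy sketch in Python. It is purely illustrative (the class name, flat layout, and depth values are my inventions; real GCN hardware uses compressed on-chip structures), but it shows why keeping a per-tile maximum depth lets the rasterizer reject whole regions before shading:

```python
# Toy hierarchical-Z (Hi-Z) rejection. Each screen tile tracks the
# maximum (farthest) depth currently stored in it. If a triangle's
# minimum (nearest) depth over a tile is still farther than that max,
# no fragment of the triangle can pass the depth test in the tile,
# so it can be culled before any shading happens.
# Illustrative sketch only -- not how GCN actually lays this out.

class HiZBuffer:
    def __init__(self, tiles_x, tiles_y):
        # 1.0 = far plane, so initially nothing is occluded
        self.zmax = [[1.0] * tiles_x for _ in range(tiles_y)]

    def tile_rejects(self, tx, ty, tri_zmin):
        """True if the triangle is fully occluded within this tile."""
        return tri_zmin > self.zmax[ty][tx]

    def update(self, tx, ty, tile_zmax_after_draw):
        # Drawing can only bring the tile's farthest depth closer
        self.zmax[ty][tx] = min(self.zmax[ty][tx], tile_zmax_after_draw)

hiz = HiZBuffer(4, 4)
hiz.update(0, 0, 0.3)               # near geometry already drawn here
print(hiz.tile_rejects(0, 0, 0.5))  # True: entirely behind what's drawn
print(hiz.tile_rejects(0, 0, 0.1))  # False: may be visible, must shade
```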
What confuses me is how he says it's more efficient at what it's got to work with. Granted, my knowledge of such things is limited, but wouldn't better efficiency mean better performance?
He later replied in the same thread that he wasn't really saying the PS4 GPU was weaker. He was just talking about how to get the most out of the X-One.
PSU Post 164
lol I got quoted.
"PS4=Way more PC like, and straightforward, you don't really need to go looking for optimizations unless you somehow need them.
Xbox One=More like a supercharge 360 in terms of having access to that edram, which also makes it more complicated."
I have a feeling that this gen will be the online-everything gen, with services up the wazoo.
Not really happy about that, since I know we are going to get games with online components duct-taped on.
This thread is fun.
How he defined the platforms is in line with how I see both:
Man, an eSRAM of only 32 megabytes. I highly doubt it can do anything significant in terms of graphics.
Moreover, the industry will get rid of DDR3 as VRAM in a year or two. That will drive GDDR5 prices down, and the industry will use it as the standard VRAM.
So the Xbox One's specs will seem dated after 2 years.
Holy crap....how is this thread still raging on?
I have enjoyed this. Memories of PS1, Dreamcast and blue shadows.
You make it sound like Sony hasn't been one of the technical leaders of music hardware in the industry.
Just wanted to post that everyone that keeps bringing up launch/launch window/1st year titles to show the full power of the system should tame their expectations a bit.
Yes, the hardware is closer to a PC than in other generations, but the devs on both sides have been working on incomplete (as in alpha, beta and not final) dev kits for a good chunk of their time. Which means that they were never able to target final specs (until recently) because the specs kept changing.
Food for thought.
Agreed, it's been a fun read, but at some point (if this happened sometime yesterday then excuse my ignorance) I wish someone would have gotten DF to thoroughly spell out what they meant or how they came to their numbers, as in: were they provided to them without verification, etc...
Some users have tried by tweeting them and the like, and many comments on the article page itself. No answers were given.
To be fair, they probably tried to get clarification from Microsoft, and their tech guys are furiously working with their PR guys on how best to explain it without explaining a god damn thing.
I'm surprised Digital Foundry is going to shit up its reputation over it.
I know how MS could have got their original bandwidth figures out by 88%.
They probably worked them out in Excel.
I literally laughed out loud.
Edit: oh shit I'm not a junior now! when did that happen?
You are alright, SPE. I promise to not abuse this.
Laughing at my jokes. I'm a big cheese around these parts. You scratch my back, I'll scratch yours.
I keep telling people the GPU differences aren't going to manifest themselves in the real world. MS put a ton of extra hardware in the xbOne to prevent stalls in the graphics pipeline. They've done a ton of research to determine how much GPUs sit idle and the points at which they most likely sit stalled waiting for data.
Even in the few areas where the PS4's GPU is faster (and it's only a few areas), PS4 developers are going to run into stalls and have the GPU sit idle more than the GPU in the xbOne. It's going to make the performance much more of a wash than you are all expecting.
Something something due to you laughing at SPE cut the cheese. Hurray!
A recap of this thread:
- DF link for "X1 eSRAM performance increase by 88%"
- 1st couple pages of people saying good news, and others asking if it makes it better than PS4, and even some saying "CBOAT fail" - CBOAT Fail
- Some members finding that the math is wrong and in conclusion = Downclock!
- Same members stating that the DF article is pure PR from MS
- Finding out that DF's source is MS directly (pure speculation imo) due to a Tweet. Tweet Image
- At the same time we had people not knowing the movie Spaceballs............
- Oldergamer kept pulling stuff out of his arse and when asked for proof, he kept refusing and his answer was always: "It's just a simple google search away!". Needless to say it didn't go too well for him. And let's not forget this golden post here:
- People also found out that specialguy is Rangers from B3D, not that it means much tbh.
- Meanwhile we had some members rambling about the cloud, cloud size, cloud cost, cloud etc....
- Then some members had a persecution breakdown of some sort and believed all of GAF was against them.
- Afterwards people started arguing about full console specs with some stating that the specs were not finalized or it's all about games (in a tech analysis thread...) and also started comparing PS3 and 360 once again.
- Then we started comparing E3 showings of Infamous SS and Forza 4. With Hawk269 saying he played the game at E3 without showing proof. The problem is that according to many, the game was not playable for anyone. Guess now he has more time for his "special showings"
Yep. This has been a pretty damn good thread so far.
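For what it's worth, the "88%" headline number in the recap is plain percentage arithmetic. A sketch, assuming the eSRAM bandwidth figures that circulated in press coverage (102.4 GB/s original, 192 GB/s revised theoretical peak; both are assumptions, not official Microsoft numbers):

```python
# Reconstructing the "88%" figure from the widely reported eSRAM
# bandwidth numbers. These inputs come from press coverage and are
# assumptions here, not confirmed specifications.
original = 102.4  # GB/s, originally quoted eSRAM bandwidth
revised = 192.0   # GB/s, revised theoretical peak
increase = (revised - original) / original * 100
print(f"{increase:.1f}% increase")  # 87.5 -> reported as "88%"
```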
Oh god...this can't be real can it? I've been using Excel to plan our company budget for years D:
You should have included the 2 ultimatums by Bish. This thread has had everything.
I was not aware that DF had a reputation to shit up in the first place. Richard Leadbetter is well known for being a blatant MS and 360 fanboy, and because of this almost all of their cross-platform 360 versus PS3 game comparisons were unreadable, biased trash.
I'm sorry, I don't follow what your example was supposed to show and how it relates to the 88%. Feel free to elaborate?
For one, even assuming my specific example were wrong, I assume you can follow the reasoning to a point of being able to address it.
And secondly, a 100% increase to 5 apples gives you 10 apples. 5 apples is 50 percent of 10 apples.
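The whole apples disagreement comes down to which number is the base of the percentage; a one-liner illustration:

```python
# Both statements above are consistent once the base is fixed:
# going from 5 to 10 is a 100% increase (base = the starting amount),
# while 5 is 50% of 10 (base = the final amount).
before, after = 5, 10
increase_pct = (after - before) / before * 100  # relative to the start
share_pct = before / after * 100                # relative to the end
print(increase_pct)  # 100.0
print(share_pct)     # 50.0
```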