
EDGE: "Power struggle: the real differences between PS4 and Xbox One performance"

This is the visual difference I expect to see:

[comparison screenshots: Ryse and The Order: 1886]


Do I think The Order looks better? Yeah. By how much? Well, that's debatable.

I can't really tell what image I prefer. Anyways. "Is the game fun, engaging, original" is the question I will judge games by. All the gfx talk is getting boring. Most wanted game nextgen? Resogun. And that thing could easily be multiplat.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
All 18 CUs can be used for graphics and GPGPU work.

Most importantly, you don't have to assign a fixed number of CUs to GFX/GPGPU exclusively over the entire frame. The hardware can schedule workloads flexibly, which is one of the reasons why the PS4 has 8 ACEs. GPGPU can use more GPU time in those slots during the frame where the GPU is not saturated with GFX-related work.
 

mrklaw

MrArseFace
The only true difference you are going to see is going to be frame rate. There is a chance that resolution could come into play but I seriously doubt it.

When it comes down to it, the PS4 might be able to get 5-10 more frames per second on the screen in a hectic sequence over the Xbox One. Is that a big deal? I guess it's really going to depend on the game and whether you are sensitive to frame rates.

In the end, I expect intense games to be locked at 30fps despite being able to achieve higher frame rates if they pushed it. This way both the PS4 and XB1 will be identical and there literally will be no difference. Less intense games running at 60fps, same deal.

Battlefield 4 is probably going to be the most interesting game out of the gate since unless something changed, 60fps is targeted on both systems. If the frame rate is locked at 60 (V-Sync) and both system can maintain it without any dips, XB1 is going to be fine. If it can't handle it, get ready for a shit storm.

Disagree. Resolution is much more flexible than framerate. You can relatively easily use a sub-1080p res without it being too distracting, but sub-30fps will be much more noticeable.

Also, with the Xbox One's display planes, you could have a game running at 900p (or whatever) and composite a full 1080p HUD on top, which would further disguise any loss of resolution.

I think, if it was a 1080p vs 900p thing, most people would be perfectly happy, and it would only be in A:B comparisons that you'd be able to tell. Most people will only ever have one console at home, so it wouldn't really affect them.
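To illustrate the display-plane idea above, here is a rough conceptual sketch in Python; it is not the actual Xbox One scaler/display-plane hardware or API, just a software mock-up of "upscale the game layer, then blend a native-1080p HUD layer on top so the UI stays sharp":

import numpy as np

def composite(game_900p, hud_1080p_rgba):
    # Nearest-neighbour upscale of a 1600x900 game frame to 1920x1080,
    # then alpha-blend a HUD that was rendered natively at 1080p on top.
    h, w = 1080, 1920
    ys = np.arange(h) * game_900p.shape[0] // h
    xs = np.arange(w) * game_900p.shape[1] // w
    scaled = game_900p[ys][:, xs]                    # the upscaled (slightly soft) game plane
    alpha = hud_1080p_rgba[..., 3:4] / 255.0         # HUD coverage
    out = hud_1080p_rgba[..., :3] * alpha + scaled * (1.0 - alpha)
    return out.astype(np.uint8)                      # HUD text stays pixel-sharp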
 

Piggus

Member
40% could mean one game would run at 1080p on PS4 and 900p on Xbox One.
You'd need a trained eye to notice that difference.

You don't need a trained eye to see that difference. 900p on a 1080p monitor is a bit blurrier than native 1080p. Try it with PC games.
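For a rough sense of the numbers (assuming "900p" means a 1600x900 framebuffer shown on a 1920x1080 display):

native = 1920 * 1080    # 2,073,600 pixels
sub    = 1600 * 900     # 1,440,000 pixels
print(native / sub)     # ~1.44: native 1080p has ~44% more pixels
print(1920 / 1600)      # 1.2: each axis is stretched by 1.2x, hence the slight blur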
 
According to a lot of posts I am seeing that only talk about TFLOP performance, the PS3 is > the X1.

PS3 RSX = 1.8TFLOPs

X1 = 1.37TFLOPs

Point is, TFLOP performance is only the start of the equation when it comes to effectiveness of a GPU.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
XBO: 1.31TF
PS4: 1.84TF

It gives a representation of what to expect.

Of course the PS4 GPU is a fair bit stronger, and I fully expect to see that difference in multiplatform games going forward.
I'm more interested in trying to find out how big that gap will be, realistically. What are the differences likely to be?
If (IF) the Xbox One has a better audio chip that could relieve a CPU core from audio work, how could that spare CPU core be used to close the GPU delta? Could it be used to pre-cull geometry, say?
 

Biker19

Banned
I'm not sure if it was just a rumor or not:
I've heard many times during this generation that MS wouldn't let multiplat games come to their console if the quality/graphics weren't on par with the competitor's console version. Is that true?

If it is, then Microsoft are very dumb. They don't exactly have a very strong hold on the gaming industry like they do on PCs.

3rd party publishers/developers will most likely make their future games PS4 exclusive should it ever come down to that. They just need to suck it up & deal with it like Sony has dealt with three generations of inferior multiplats from PS1 to PS3.
 

nib95

Banned
According to a lot of posts I am seeing that only talk about TFLOP performance, the PS3 is > the X1.

PS3 RSX = 1.8TFLOPs

X1 = 1.37TFLOPs

Point is, TFLOP performance is only the start of the equation when it comes to effectiveness of a GPU.

Covered this in another thread.

Those are going by Nvidia's fluffed numbers. The real numbers look more like this.

PS3 | RSX: 176 Gflops and Cell: 230 Gflops, Total 406 Gflops

360 | Xenos: 240 Gflops and CPU: 77 Gflops, Total 317 Gflops

PS3 based on raw performance is 28% more powerful than the 360.

The reason the raw performance figures did not line up with multiplatform titles is because the Cell and the PS3's RSX were notoriously difficult to develop for: non-unified split RAM, multiple SPEs, less overall memory to work with, etc. The GPU was actually weaker, and could only overcome that by piggybacking off some heavy-handed Cell SPE usage. Sony first party had the time and development resources to do this, which is why PS3 first-party titles are the best looking and most technically impressive this generation.

Very different situation now...

PS4 | GPU: 1.84 Tflops and CPU: 100 Gflops, Total 1.94 Tflops

Xbox One | GPU: 1.31 Tflops and CPU: 109 Gflops, Total 1.41 Tflops

PS4 based on raw performance is 38% more powerful than the Xbox One, but without any of the previous issues that plagued the PS3, and with a whole host of other advantages over the XO. This time it's the PS4 with the unified RAM, the higher RAM bandwidth, the higher RAM availability, etc. It's a completely different situation.
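Quick sanity check of those percentages, using only the figures quoted above (theoretical peaks, not measured performance):

# Combined CPU + GPU single-precision peaks from the post, in Gflops
ps3, x360 = 176 + 230, 240 + 77     # 406 vs 317
ps4, xb1  = 1840 + 100, 1310 + 109  # 1940 vs 1419
print((ps3 / x360 - 1) * 100)       # ~28%  -> the "28% more powerful" figure
print((ps4 / xb1 - 1) * 100)        # ~37%  -> rounds to the ~38% figure when the 1.41 Tflops total is used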
 
Disagree. Resolution is much more flexible than framerate. You can relatively easily use a sub-1080p res without it being too distracting, but sub-30fps will be much more noticeable.

Also, with the Xbox One's display planes, you could have a game running at 900p (or whatever) and composite a full 1080p HUD on top, which would further disguise any loss of resolution.

I think, if it was a 1080p vs 900p thing, most people would be perfectly happy, and it would only be in A:B comparisons that you'd be able to tell. Most people will only ever have one console at home, so it wouldn't really affect them.

Maybe if you're blind. 1080 vs 900 is huge, worse if you sit close to your TV or it's big.
Most people won't notice a few frame drops (and motion blur helps), but they will notice bad IQ and blurriness.
 
Most importantly, you don't have to assign a fixed number of CUs to GFX/GPGPU exclusively over the entire frame. The hardware can schedule workloads flexibly, which is one of the reasons why the PS4 has 8 ACEs. GPGPU can use more GPU time in those slots during the frame where the GPU is not saturated with GFX-related work.

ah, interesting. thanks.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
If (IF) the Xbox One has a better audio chip that could relieve a CPU core from audio work, how could that spare CPU core be used to close the GPU delta? Could it be used to pre-cull geometry, say?

The issue with the audio hardware is that its equivalence to a CPU core heavily depends on the game. The game actually needs to do some heavy real-time audio stream processing. This figure has its origins in a statement from a developer on B3D that Forza 4 used quite some processing time on the CPU for audio for a specific effect, and that such effects in very high quality (even higher than in Forza 4) might use a single Jaguar core. I guess that those effects are adjustments to the engine sound depending on the revs of the engine and the echo produced by the environment (e.g. in tunnels).

However, if you don't have such effects in that quality then you don't save that equivalence of CPU time. In most games, the CPU time dedicated to audio is almost negligible. I posted some sources for this claim here: http://www.neogaf.com/forum/showpost.php?p=79922221&postcount=1792
 
Of course the PS4 GPU is a fair bit stronger, and I fully expect to see that difference in multiplatform games going forward.
I'm more interested in trying to find out how big that gap will be, realistically. What are the differences likely to be?
If (IF) the Xbox One has a better audio chip that could relieve a CPU core from audio work, how could that spare CPU core be used to close the GPU delta? Could it be used to pre-cull geometry, say?
CPUs can be programmed to do anything. The question is, does that impact performance elsewhere? They may not be suited for it, but CPUs are the extreme end of versatility in processors. GPUs have a tighter focus, with consistent performance across the spectrum on the rendering end.

So yes the CPU can be programmed to offload anything on the visual end. But whether it's suited for it is another question entirely. On this I have no clue. I barely understand the slightly simplified GPU portion of the conversation. Not even thinking about the infinitely more complex CPU part of the equation.
 

x-Lundz-x

Member
Of course the PS4 GPU is a fair bit stronger, and I fully expect to see that difference in multiplatform games going forward.
I'm more interested in trying to find out how big that gap will be, realistically. What are the differences likely to be?
If (IF) the Xbox One has a better audio chip that could relieve a CPU core from audio work, how could that spare CPU core be used to close the GPU delta? Could it be used to pre-cull geometry, say?

The fact that they are trying to use an audio chip as evidence that it will help bring their system into parity should speak volumes. That's like saying, well, we put a better exhaust pipe on our car, so that makes up for the fact that we are running a V6 compared to your V8.
 
The fact that they are trying to use an audio chip as evidence that it will help bring their system into parity should speak volumes. That's like saying, well, we put a better exhaust pipe on our car, so that makes up for the fact that we are running a V6 compared to your V8.
Pop-off instead of blow-in valve is a better comparison.
 
I don't understand what's so biased about it. Do you visually see a 40% difference? No...
Let's be clear, you weren't biased in posting the comparison. But the comparison is biased in favor of reducing the differences between the two machines. These are shots of main player characters, and as such they will be the focus of all sorts of extra effort in modeling and texturing. The disadvantages of weaker hardware will be more easily seen on NPCs, and in the world (especially distant detail).

For example, in the Order trailer, all the characters look to have roughly the same amount of detail. That's not the case in Ryse, where the barbarians especially are comparatively low-poly and repetitively modeled and textured. But even Romans besides Marius don't look nearly as good:

[image: Ryse - Cinematic 1 - 1080.png]
 

TechnicPuppet

Nothing! I said nothing!
Maybe if you're blind. 1080 vs 900 is huge, worse if you sit close to your TV or it's big.
Most people won't notice a few frame drops (and motion blur helps), but they will notice bad IQ and blurriness.

1080p vs 900p is not huge; you wouldn't need to be blind to miss it.

Nobody I know can tell 720p and 1080p apart on my 50" plasma. I can, just about, but I don't have anything at 900p to test.

No one noticed that KI was at 720p either till the developers pointed it out, did they?
 

Majanew

Banned
PS4 GPU is 1.84 Tflops. This has been known for the longest time. Where on Earth did you get the 1.79 figure from?

Here's how it stacks up.

PS4 | GPU: 1.84 Tflops and CPU: 100 Gflops, Total 1.94 Tflops

Xbox One | GPU: 1.31 Tflops and CPU: 109 Gflops, Total 1.41 Tflops

PS4 based on raw performance is 38% more powerful than the Xbox One.


GPU's directly compared.

PS4: 1.84TF GPU (18 CUs)
PS4: 1152 Shaders
PS4: 72 Texture units
PS4: 32 ROPs
PS4: 8 ACEs / 64 queues
PS4: 8GB GDDR5 @ 176GB/s

Versus

Xbone: 1.31TF GPU (12 CUs)
Xbone: 768 Shaders
Xbone: 48 Texture units
Xbone: 16 ROPs
Xbone: 2 ACEs / 16 queues
Xbone: 8GB DDR3 @ 69GB/s + 32MB ESRAM @ 109GB/s
XB1 CPU = 112 GFLOPS
PS4 CPU = 102 GFLOPS


Nice to see they got rid of the clipping in the armor there:

[image: Ryse screenshot (rysescreeny.jpg)]

Unless it's about to clip before that capture was taken lol
 
Still shots will rarely tell anyone much when comparing the Xbone and PS4. My thought is that the differences will be most notable in motion; then things like framerate, texture pop-in, tearing, etc. will be much more apparent.

It will be interesting to see if a game ever offers 30 FPS on Xbone and 60 FPS on PS4. I am not even sure if a developer would do that even if they could, but you never know.

Edit: on second thought maybe still shots will show some differences down the road. Better textures and AA is always a possibility I suppose.
 

SapientWolf

Trucker Sexologist
According to a lot of posts I am seeing that only talk about TFLOP performance, the PS3 is > the X1.

PS3 RSX = 1.8TFLOPs

X1 = 1.37TFLOPs

Point is, TFLOP performance is only the start of the equation when it comes to effectiveness of a GPU.
The only reason why the TFLOP ratings hold so much weight when comparing the X1 and the PS4 is because the GPUs are so similar. RSX and X1 are apples and oranges.
 
I can't really tell what image I prefer. Anyways. "Is the game fun, engaging, original" is the question I will judge games by. All the gfx talk is getting boring. Most wanted game nextgen? Resogun. And that thing could easily be multiplat.

God, these kinds of posts are becoming so annoying in these tech threads.
And resogun? It's published by SCE.
 
1080p vs 900p is not huge; you wouldn't need to be blind to miss it.

Nobody I know can tell 720p and 1080p apart on my 50" plasma. I can, just about, but I don't have anything at 900p to test.

No one noticed that KI was at 720p either till the developers pointed it out, did they?

Saying nobody you know can tell the difference between 720p and 1080p doesn't mean much without saying how you showed them.
Some people did notice, plus seeing something in vids is not the same as seeing it on your TV. Still, don't get me wrong, KI looks nice.
Thing is, having a lower res would not be a problem if devs used better AA; sadly they don't.

EDIT: pics are trash for showing off games in this day and age.
Using HQ gifs or vids is much better IMO.
 

Thrakier

Member
1080p vs 900p is not huge; you wouldn't need to be blind to miss it.

Nobody I know can tell 720p and 1080p apart on my 50" plasma. I can, just about, but I don't have anything at 900p to test.

No one noticed that KI was at 720p either till the developers pointed it out, did they?

Haha, amazing. Simply amazing.

Also, KI is at 720p? :D
 
It's not a rumor any more if it's debunked.
When did Cerny say PS4 GPU is unbalanced?
I'm not into techno-waffle on the levels people discuss on here, as I do not have an understanding of what is being mentioned - however Cerny did say the GPU has more of some certain techno-waffle than would be typically found, to help with GPGPU functions I believe he said.

Note he said more, so it doesn't take away resources; it is provided in addition.

I guess the unbalanced thing comes from that: if it is not utilised (and I think it was also the same video where he said he didn't expect it to be used right away), then you've got 'too much' of whatever techno-waffle he was on about, and having too much is unbalanced.
 

TechnicPuppet

Nothing! I said nothing!
Saying nobody you know can tell the difference between 720p and 1080p doesn't mean much without saying how you showed them.
Some people did notice, plus seeing something in vids is not the same as seeing it on your TV. Still, don't get me wrong, KI looks nice.
Thing is, having a lower res would not be a problem if devs used better AA; sadly they don't.

EDIT: pics are trash for showing off games in this day and age.
Using HQ gifs or vids is much better IMO.

It's purely anecdotal.

After reading all the comments, I'd rather take a resolution hit than anything else, especially screen tearing or noticeable slowdown. But that is me hoping I won't be able to tell the difference.
 

ElTorro

I wanted to dominate the living room. Then I took an ESRAM in the knee.
I guess the unbalanced thing comes from that: if it is not utilised (and I think it was also the same video where he said he didn't expect it to be used right away), then you've got 'too much' of whatever techno-waffle he was on about, and having too much is unbalanced.

I guess what he meant is that it is just harder to avoid getting bottlenecked by some resource if you use all CUs for GFX than if you use some for GPGPU since many compute tasks should have lower demand for some resources (e.g. bandwidth) compared to graphics. At least, if you try to maximize hardware utilization.
 

RoboPlato

I'd be in the dick
1080p vs 900p is not huge; you wouldn't need to be blind to miss it.

Nobody I know can tell 720p and 1080p apart on my 50" plasma. I can, just about, but I don't have anything at 900p to test.

No one noticed that KI was at 720p either till the developers pointed it out, did they?
The only reason no one could tell with KI was because Microsoft was only releasing YouTube videos of it.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
I wonder if devs find the real-world performance of Xbox One to be around 20% slower. Then I wonder if they could use a resolution of 1600x1080 scaled to full screen. It's roughly 17% fewer pixels, and it would allow you to keep the 1080p vertical resolution.
Our eyes are more sensitive to vertical resolution.
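The saving is easy to check (just arithmetic):

full   = 1920 * 1080        # 2,073,600 pixels
narrow = 1600 * 1080        # 1,728,000 pixels
print(1 - narrow / full)    # ~0.167, i.e. about 17% fewer pixels than full 1080p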
 

AngryBird

Banned
I don't understand what's so biased about it. Do you visually see a 40% difference? No...


These type of comparisons are always biased because that person always chooses a crappy picture for the other game and a good one from his game in order to prove his point.

You got a promo pic of Ryse and a pic taken from a fucking pp slideshow of The Order.
The games are very different, not to mention they are far apart in the development cycle.
One is an FPS and the other is a TP game. Crytek may have wanted the characters to be more detailed (a shift of priority) because they may feel it's an essential part of the game, since it's a sword-fighting game and there is a lot of close-up interaction (fighting) between them.

There is absolutely no fucking point in comparing one game to another game, as devs can choose where to prioritize the resources of the console, depending what they are trying to achieve or what part of the game is more important.


You can do this comparison with some driving games because they have a common theme, but not Ryse and The Order.
 
I'm not into techno-waffle on the levels people discuss on here, as I do not have an understanding of what is being mentioned - however Cerny did say the GPU has more of some certain techno-waffle than would be typically found, to help with GPGPU functions I believe he said.

Note he said more, so it doesn't take away resources; it is provided in addition.

I guess the unbalanced thing comes from that: if it is not utilised (and I think it was also the same video where he said he didn't expect it to be used right away), then you've got 'too much' of whatever techno-waffle he was on about, and having too much is unbalanced.

I'm pretty sure you are talking about the GPGPU scheduling and queueing that the PS4 will do. You can write GPGPU code and, when it runs on the console, it gets queued up for execution; the PS4 will schedule it for a time when a particular CU isn't being utilized for graphics, which can be a common occurrence at specific points in the render pipeline. So in a sense it's free computational power, as it doesn't take anything away from the graphics processing. It's also not unbalanced if people decide not to use it, because without this design the console would still have the exact same power, just with occasions during frame rendering where the CUs sit idle.
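A toy model of that idea (purely illustrative; the real scheduling happens in hardware via the ACEs, and the utilisation numbers below are invented): graphics rarely keeps every CU busy for the whole frame, so queued compute work can soak up the idle time without delaying rendering.

# Frame split into time slots; each slot has some made-up graphics utilisation.
frame_slots   = [0.95, 0.60, 0.80, 0.40, 0.90, 0.55]
compute_queue = 1.5        # pending GPGPU work, measured in "full-GPU slots"

done = 0.0
for gfx in frame_slots:
    idle = 1.0 - gfx                         # fraction of the GPU graphics isn't using
    done += min(idle, compute_queue - done)  # async compute fills the gap

print(f"compute finished in otherwise-idle time: {done:.2f} of {compute_queue} slots")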
 
Nice to see they got rid of the clipping in the armor there:

[image: Ryse screenshot (rysescreeny.jpg)]

Unless it's about to clip before that capture was taken lol
Holy cow, you're right! If you check my version of the screenshot--which comes from the official Xbox Wire website--the clipping has obviously been photoshopped out. Wow.
 

SapientWolf

Trucker Sexologist
Quick demos to the untrained eye are never a good indicator of resolution. If we had high quality video it would have been obvious.
Video compression causes a substantial IQ hit, even in the best of circumstances (i.e. Gamersyde). The real indicator was the fact that the official screens were at 720p.
 

velociraptor

Junior Member
Of course the PS4 GPU is a fair bit stronger, and I fully expect to see that difference in multiplatform games going forward.
I'm more interested in trying to find out how big that gap will be, realistically. What are the differences likely to be?
If (IF) the Xbox One has a better audio chip that could relieve a CPU core from audio work, how could that spare CPU core be used to close the GPU delta? Could it be used to pre-cull geometry, say?
It is more important to have a beefier GPU than a beefier CPU. All rendering tasks are handled by the GPU. Fact is, no amount of free CPU cycles is going to close that GPU gap.
 

velociraptor

Junior Member
Isn't Quantum Break just target footage in that shot? It certainly isn't in-game whatever the case, so it's a bit pointless comparing any of these games graphically until we're sure we're seeing actual in-game imagery from each of them.
I'm very curious to see how Quantum Break holds up in gameplay. It very much looks 'next-gen'.
 
I have always wanted to ask this, and now this thread is full of tech people: how many flops do modern desktop processors have? A lot more than the Jaguars in PS4 and Xbox One? Like the Intel Core i7 4770K or AMD FX-8350, for example.
 

Durante

Member
Anyway, being in the tech world for the past 25 years and being a part of the gaming world for just as long
As what?

Even then though, the PS4 just has far more raw GPU power. 32 rendering processors compared to 16 is a major real world difference. Being able to feed 32 processors to render your output means well .. pretty simply that the PS4 will be able to render a scene twice as fast. But added in with the memory bandwidth from the GDDR5, you have the PS4 being fed more information faster and processing that information twice as fast to the television..
I've never seen ROPs (render output units) called "rendering processors" before. That seems like a bad name, and it informs some of the misconceptions that you seem to believe in. Having twice as many ROPs would only make an architecture twice as fast at rendering a scene if said scene was 100% ROP-limited 100% of the time. This is never a realistic scenario.
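To put a number on that point: doubling the ROPs only speeds up the fraction of the frame that is actually ROP-bound. An Amdahl-style estimate (the fractions below are illustrative, not measured from any game):

def speedup(rop_bound_fraction, rop_ratio=2.0):
    # Only the ROP-bound portion of the frame scales with the extra ROPs.
    return 1.0 / ((1.0 - rop_bound_fraction) + rop_bound_fraction / rop_ratio)

for f in (1.0, 0.5, 0.25):
    print(f, round(speedup(f), 2))   # 1.0 -> 2.0x, 0.5 -> 1.33x, 0.25 -> 1.14x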
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
It is more important to have a beefier GPU than a beefier CPU. All rendering tasks are handled by the GPU. Fact is, no amount of free CPU cycles is going to close that GPU gap.

The PS3 used the CPU to good effect to pre-cull geometry to help the relatively weak GPU.
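For anyone wondering what "pre-culling on the CPU" looks like in practice, here is a minimal sketch (bounding-sphere vs. frustum-plane test; the plane normals are assumed to point into the frustum). Anything that fails the test is never submitted, so the GPU spends no vertex work on it. This is not the actual PS3 SPU code, just the general idea.

def dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def maybe_visible(center, radius, planes):
    # planes: list of (normal, d) with dot(normal, p) + d >= 0 for points inside.
    # A sphere is culled only if it lies entirely behind some plane.
    return all(dot(n, center) + d >= -radius for n, d in planes)

def cull(spheres, planes):
    # spheres: list of (center, radius); only the survivors go to the GPU.
    return [s for s in spheres if maybe_visible(s[0], s[1], planes)]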
 
It's been at a ton of game shows. No one noticed, probably because they were using 720p displays.
Actually, plenty of people noticed. If you look back at impressions from those shows, several were saying things like "a lot jaggier than I expected". Since AA doesn't look awful on KI, I think it's probable they were seeing the effects of upscaling.
The PS3 used the CPU to good effect to pre-cull geometry to help the relatively weak GPU.
That's because the PS3 CPU was more powerful than the GPU in Gflops, and had a weird design that handled tons of parallel instructions well. Neither of those things is really true about the One's CPU/GPU combo (though the CPU is somewhat parallel with its large number of cores).
 

TechnicPuppet

Nothing! I said nothing!
Quick demos to the untrained eye are never a good indicator of resolution. If we had high quality video it would have been obvious.

Or the trained eye in some cases. There was an interesting article on EG that I think is pretty relevant here. It was about the GTA4 face-off. This part is about performance rather than resolution though.

the fact that various outlets were confidently saying that both versions had the performance advantage didn't help

http://www.eurogamer.net/articles/digitalfoundry-what-gta4-did-for-us
 
According to a lot of posts I am seeing that only talk about TFLOP performance, the PS3 is > the X1.

PS3 RSX = 1.8TFLOPs

X1 = 1.37TFLOPs

Point is, TFLOP performance is only the start of the equation when it comes to effectiveness of a GPU.
The only thing that proves is that Sony used PR math when they announced the PS3 as a 2TFLOP machine (as did MS with their 1TFLOP number for 360). Neither PS3 nor 360 are anywhere close to those numbers.

This time we have been given the actual numbers, measured by AMD using the same process used with their line of PC GPUs. They are accurate as far as GPU computing power goes, and thus meaningful.
 
According to a lot of posts I am seeing that only talk about TFLOP performance, the PS3 is > the X1.

PS3 RSX = 1.8TFLOPs

X1 = 1.37TFLOPs

Point is, TFLOP performance is only the start of the equation when it comes to effectiveness of a GPU.

MS and Sony straight up lied about the PS3 and 360 performance. In reality the RSX is around 176 GFlops.
 

slapnuts

Junior Member
PS4 GPU is 1.84 Tflops. This has been known for the longest time. Where on Earth did you get the 1.79 figure from?

Here's how it stacks up.

PS4 | GPU: 1.84 Tflops and CPU: 100 Gflops, Total 1.94 Tflops

Xbox One | GPU: 1.31 Tflops and CPU: 109 Gflops, Total 1.41 Tflops

PS4 based on raw performance is 38% more powerful than the Xbox One.


GPU's directly compared.

PS4: 1.84TF GPU (18 CUs)
PS4: 1152 Shaders
PS4: 72 Texture units
PS4: 32 ROPs
PS4: 8 ACEs / 64 queues
PS4: 8GB GDDR5 @ 176GB/s

Versus

Xbone: 1.31TF GPU (12 CUs)
Xbone: 768 Shaders
Xbone: 48 Texture units
Xbone: 16 ROPs
Xbone: 2 ACEs / 16 queues
Xbone: 8GB DDR3 @ 69GB/s + 32MB ESRAM @ 109GB/s

Yes, I totally agree with the math here.. it seems just about right imho.
 

Durante

Member
I have always wanted to ask this, and now this thread is full of tech people: how many flops do modern desktop processors have? A lot more than the Jaguars in PS4 and Xbox One? Like the Intel Core i7 4770K or AMD FX-8350, for example.
I assume you mean single precision (32 bit) FLOPs?

PS4 (8 Jaguar cores at 1.6 GHz, 128 bit MAD per clock) = 8*1.6*4*2 = 102.4 GF
i7 4770k (4 Haswell cores at 3.5 GHz, 256 bit MAD + 256 bit ADD per clock) = 4*3.5*8*3 = 336 GF
FX-8350 (4 Vishera modules at 4 GHz, 256 bit MAD per clock) = 4*4*8*2 = 256 GF

The usual disclaimers apply. These are theoretical numbers for all the chips, and how close you get in practice is entirely dependent on the workload.
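The same arithmetic spelled out, using the per-clock SIMD widths listed above:

def peak_gflops(cores, ghz, floats_per_vector, ops_per_cycle):
    # theoretical single-precision peak = cores x clock x SIMD lanes x ops per cycle
    return cores * ghz * floats_per_vector * ops_per_cycle

print(peak_gflops(8, 1.6, 4, 2))   # PS4 Jaguar: 102.4 GF (128-bit MAD)
print(peak_gflops(4, 3.5, 8, 3))   # i7 4770K:   336 GF  (256-bit MAD + 256-bit ADD)
print(peak_gflops(4, 4.0, 8, 2))   # FX-8350:    256 GF  (256-bit MAD per module)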
 
I assume you mean single precision (32 bit) FLOPs?

PS4 (8 Jaguar cores at 1.6 GHz, 128 bit MAD per clock) = 8*1.6*4*2 = 102.4 GF
i7 4770k (4 Haswell cores at 3.5 GHz, 256 bit MAD + 256 bit ADD per clock) = 4*3.5*8*3 = 336 GF
FX-8350 (4 Vishera modules at 4 GHz, 256 bit MAD per clock) = 4*4*8*2 = 256 GF

The usual disclaimers apply. These are theoretical numbers for all the chips, and how close you get in practice is entirely dependent on the workload.

Thanks! Quite a difference. And yeah, even I know that flops really don't tell everything. CELL having over double the flop performance of the PS4 CPU....
 