
EA: "Next tablets/phones will have nearly 360/PS3 capabilities in terms of graphics"

velociraptor

Junior Member
I've been able to do this for the past year and a half or so and my laptop is not a super powerful one.

What are your specs? I've got a standard Dell laptop that I purchased 2-3 years ago: i3 processor, 4GB RAM, Radeon 5470. I can only play old PC games - Unreal Engine 2.0-era stuff.
 

LukeTim

Member
What are your specs? I've got a standard Dell laptop that I purchased 2-3 years ago: i3 processor, 4GB RAM, Radeon 5470. I can only play old PC games - Unreal Engine 2.0-era stuff.

You can at least play Source games.

I was playing CS:S, DoD:S, HL2, L4D, etc. on the laptop I got in 2007, which had a Core Duo 1.86GHz, 1GB DDR2 RAM and a GeForce Go 7400.

Heck, I'm pretty sure I played Medieval 2: Total War on it, and at the time that was a fairly demanding game.
 
So? I just bought Bastion for my iPad and stopped playing after five minutes thanks to the horrible, horrible controls. I'd rather play nothing than that mess. This has been my experience with basically every tablet/smartphone game I've played.

Tablets are shit for anything but tower defense games; they can have the greatest graphics and gaming on them will still be crap. If tablets were all that was left of gaming, I'd stop playing games in a heartbeat.
 
So why do Xbox 360 games lack the complexity and depth of games from 1999 like Unreal Tournament or Quake 3, which people played on Pentium IIs and IIIs?
(I know what you're saying; I just want to take a stab at the consoles and point out that the CPU power isn't actually used for gameplay.)

Nonsense.

Name one game from 1999 with the same level of complexity in game-world physics and interaction, procedural animation systems, AI, or even half of the other stuff modern games use the PS360 CPUs for.

The PS3 CPU alone was potent enough and fast enough to be used to close the gap between its weak-ass GPU (RSX) and the 360's Xenos on a plethora of games. It's been used for vertex culling, AA, lighting calculations and all sorts.
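(For anyone wondering what "vertex culling" on a CPU even looks like in practice, here's a minimal, generic sketch of the idea in Python: reject objects whose bounding spheres fall outside the view frustum before anything is sent to the GPU. Purely illustrative; not how any actual PS3/360 engine is written.)

Code:
# Minimal CPU-side culling sketch. Objects are dicts with "center" (x, y, z)
# and "radius"; each frustum plane is (nx, ny, nz, d) with the normal
# pointing into the frustum. Illustration only, not real engine code.

def sphere_in_frustum(center, radius, planes):
    for nx, ny, nz, d in planes:
        distance = nx * center[0] + ny * center[1] + nz * center[2] + d
        if distance < -radius:   # bounding sphere lies entirely outside this plane
            return False
    return True

def cull(objects, planes):
    # Only the survivors get submitted to the GPU for drawing.
    return [obj for obj in objects
            if sphere_in_frustum(obj["center"], obj["radius"], planes)]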

Modern games are exceedingly complex, and it's not just because of the amount of vertex- and pixel-pushing power contained within their GPUs. GTA IV, with its large number of pedestrians and procedural/physics-based animation systems, and Skyrim, with its Radiant AI and massive game world, would both be impossible to do on an iPad for at least another half-decade.

Sure, you can point to a bunch of linear, simplistic PS360 games with low CPU usage like BF3 and say they can run on tablets, but at the top end the CPU performance of ARM cores just isn't there yet, and it will take much longer to get there because of the limits being reached in semiconductor process node shrinkage and because CPU power consumption doesn't scale with performance the way it does with mobile GPU chipsets.
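(A rough illustration of the power point: dynamic power in CMOS logic scales roughly with capacitance x voltage squared x frequency, and voltage usually has to rise along with frequency. The capacitance and voltage figures below are invented purely to show the shape of the curve.)

Code:
# Dynamic power ~ C * V^2 * f. All numbers are made up for illustration,
# not real chip specs.

def dynamic_power_watts(capacitance_farads, voltage_volts, frequency_hz):
    return capacitance_farads * voltage_volts ** 2 * frequency_hz

C = 1e-9  # effective switched capacitance (hypothetical)
print(dynamic_power_watts(C, 0.9, 1.5e9))  # ~1.2 W at 1.5 GHz and 0.9 V
print(dynamic_power_watts(C, 1.2, 3.0e9))  # ~4.3 W at 3.0 GHz and 1.2 V: roughly 3.6x the power for 2x the clock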

Point is, you won't see a mobile CPU running at 3+GHz ever. Nor will you see CPUs on mobile platforms that can push the same number of operations per clock as Xenon or Cell for a very, very long time, if ever.

Also, if you think that all or even most PS360 games don't make a lot of use of the system CPUs, then you really don't know much about game development at all. CPU usage in these systems is maxed out most of the time, especially on the Cell.
 

LukeTim

Member
Nonsense.

Name one game from 1999 with the same level of complexity in game-world physics and interaction, procedural animation systems, AI, or even half of the other stuff modern games use the PS360 CPUs for.

The PS3 CPU alone was potent enough and fast enough to be used to close the gap between its weak-ass GPU (RSX) and the 360's Xenos on a plethora of games. It's been used for vertex culling, AA, lighting calculations and all sorts.

Modern games are exceedingly complex, and it's not just because of the amount of vertex- and pixel-pushing power contained within their GPUs. GTA IV, with its large number of pedestrians and procedural/physics-based animation systems, and Skyrim, with its Radiant AI and massive game world, would both be impossible to do on an iPad for at least another half-decade.

Sure, you can point to a bunch of linear, simplistic PS360 games with low CPU usage like BF3 and say they can run on tablets, but at the top end the CPU performance of ARM cores just isn't there yet, and it will take much longer to get there because of the limits being reached in semiconductor process node shrinkage and because CPU power consumption doesn't scale with performance the way it does with mobile GPU chipsets.

http://en.wikipedia.org/wiki/Instructions_per_second

If you take a look at the DMIPS/clock and DMIPS/clock/core columns in the table there, you will see that Qualcomm's Krait outperforms the PowerPC cores in both the PS3 and 360.

A quad-core Krait at 2.0GHz would likely exceed the 360's Xenon and potentially match - or come very close to - the PS3's Cell.
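As a quick back-of-the-envelope check (using roughly 3.3 DMIPS/MHz per core for Krait and 2 DMIPS/MHz per core for Xenon, illustrative ballpark figures of the kind listed in that table, and ignoring the Cell's SPEs entirely):

Code:
# Naive aggregate throughput: per-core DMIPS/MHz * clock in MHz * core count.
# Ratings are illustrative ballpark figures, not official specs.

def total_dmips(dmips_per_mhz_per_core, mhz, cores):
    return dmips_per_mhz_per_core * mhz * cores

krait_quad = total_dmips(3.3, 2000, 4)  # quad-core Krait at 2.0GHz: ~26,400
xenon = total_dmips(2.0, 3200, 3)       # Xenon, three cores at 3.2GHz: ~19,200
print(krait_quad > xenon)               # True, by this crude measure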

Sure, I know, comparing processors on Instructions Per Second isn't a perfect comparison, because architectural and instruction set differences make it a somewhat grey metric to use... But what else are you going to use? At least this is a comparison between RISC architectures, so it's a fairly reasonable comparison.

Mobile processors have come a long way in the last 8 years in terms of efficiency and capability, due to the rise of smartphones encouraging massive investment.

EDIT:

Point is, you won't see a mobile CPU running at 3+GHz ever.

If this is the crux of your argument, you evidently don't understand CPUs. A CPU of one architecture does not have to exceed the clock speed of a CPU of a different architecture in order to outperform it. A 2GHz MIPS chip, for example, might be capable of outperforming a 3GHz PowerPC in terms of raw instructions per second. Similarly, a 2.4GHz Core i5 is going to outperform a 3.5GHz Pentium 4. It depends on what's going on inside: how instructions are executed by the pipeline, the depth of the pipeline, how the pipeline is structured, and so on.
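(To put hypothetical numbers on that: effective throughput is roughly instructions-per-cycle times clock, so a wider core at a lower clock can come out ahead. The IPC values below are made up just to show the arithmetic.)

Code:
# Throughput ~ IPC * clock. IPC figures are invented for illustration.

def instructions_per_second(ipc, clock_ghz):
    return ipc * clock_ghz * 1e9

wide_core = instructions_per_second(ipc=2.0, clock_ghz=2.4)    # e.g. a modern out-of-order core
narrow_core = instructions_per_second(ipc=1.0, clock_ghz=3.5)  # e.g. a long-pipeline core
print(wide_core > narrow_core)  # True: 4.8 billion vs 3.5 billion instructions per second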
 

Kainazzo

Member
a 2.4GHz Core i5 is going to outperform a 3.5GHz Pentium 4. It depends on what's going on inside: how instructions are executed by the pipeline, the depth of the pipeline, how the pipeline is structured, and so on.

I've wondered about this somewhat. As clockspeed increases, does performance eventually taper off? Years ago, I once saw a Pentium 4 overclocked to ~7.2 GHz, and the performance gained in benchmarks was negligibly higher than it was at ~4 GHz. In fact, 4 GHz was roughly the cutoff point, and I've noticed that no CPU has ever been sold stock at that speed. Is this because, regardless of architecture, performance somehow peaks there?

Or, was it simply that benchmark software in 2005 wasn't coded to handle such speeds?
 

LukeTim

Member
I've wondered about this somewhat. As clockspeed increases, does performance eventually taper off? Years ago, I once saw a Pentium 4 overclocked to ~7.2 GHz, and the performance gained in benchmarks was negligibly higher than it was at ~4 GHz. In fact, 4 GHz was roughly the cutoff point, and I've noticed that no CPU has ever been sold stock at that speed. Is this because, regardless of architecture, performance somehow peaks there?

Or, was it simply that benchmark software in 2005 wasn't coded to handle such speeds?

Well, theoretically, if the architecture stays the same and you just increase the clock speed, performance should increase linearly.

However, there are various reasons why overclocking beyond a certain point provides little benefit. One is the benchmark software itself: what metrics and scale does it use to quantify performance? The scale may be non-linear and designed to work best within an expected range. More fundamentally, timings eventually get too tight, and signals become increasingly likely to miss their destination in time for the next clock cycle. A wire carrying a signal over a given length has a particular propagation time, so it can only deliver the signal in time for clock speeds up to a certain point. If the signal doesn't quite make it, the end point may be left in limbo, neither a 1 nor a 0, when the clock ticks; if it's too late, you get whatever value was there last time.

This introduces errors, which will negatively impact performance.
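(A crude way to see where the ceiling comes from: the clock period has to cover the worst-case propagation delay plus the time the receiving latch needs the signal to be stable. The delay numbers below are invented for illustration.)

Code:
# Minimum clock period ~ worst-case propagation delay + setup time.
# Both figures are hypothetical.

prop_delay_ns = 0.20   # slowest signal path through a pipeline stage
setup_time_ns = 0.05   # time the latch needs the value stable before the clock edge

min_period_ns = prop_delay_ns + setup_time_ns
max_clock_ghz = 1.0 / min_period_ns
print(max_clock_ghz)   # 4.0; clock any faster and signals start missing the edge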

There are almost certainly other problems, but I cannot think of them at the moment. It's been a while since I studied this sort of stuff. Pretty sure somebody else will come along and give more information anyway, there's plenty of folks who know a lot more than me around here.

There's also the cooling required to hit those speeds which would prevent stock CPUs from being sold at higher clocks.
 

Vestal

Gold Member
I've wondered about this somewhat. As clockspeed increases, does performance eventually taper off? Years ago, I once saw a Pentium 4 overclocked to ~7.2 GHz, and the performance gained in benchmarks was negligibly higher than it was at ~4 GHz. In fact, 4 GHz was roughly the cutoff point, and I've noticed that no CPU has ever been sold stock at that speed. Is this because, regardless of architecture, performance somehow peaks there?

Or, was it simply that benchmark software in 2005 wasn't coded to handle such speeds?

There are multiple reasons for this.

1. The GHz race stopped back in the P4/Athlon era because those CPUs were getting ridiculous in terms of heat and power draw. AMD proved with its Athlon series that a clock-speed deficit could be overcome with solid engineering.

2. Since then we've moved into a more mobile era where you're basically trying to shrink the size of the CPU and lower its power and heat. That has forced Intel and AMD to become more efficient.

3. Performance definitely does not peak, but you will start getting diminishing returns depending on the architecture used, at least that's what I read in an article a few years back.

4. That overclock you mention, if I'm not mistaken, was mostly done through the CPU multiplier, so the other components in the PC fall behind and you don't feel the full benefit of the speed increase (rough numbers below).
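(Rough numbers for point 4: if only the CPU-bound part of a workload speeds up while memory and the rest of the system stay put, the overall gain flattens out quickly. The fractions below are invented for illustration.)

Code:
# Amdahl-style estimate: only the CPU-bound fraction of a workload benefits
# from a multiplier-only overclock. Fractions are hypothetical.

def overall_speedup(cpu_fraction, cpu_speedup):
    other_fraction = 1.0 - cpu_fraction
    return 1.0 / (other_fraction + cpu_fraction / cpu_speedup)

# A workload that is 60% CPU-bound and 40% memory/IO-bound:
print(overall_speedup(0.6, 1.8))  # ~1.36x overall from a 1.8x CPU overclock
print(overall_speedup(0.6, 3.6))  # ~1.76x overall from a 3.6x CPU overclock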


Post above me also has some solid points.
 

Stumpokapow

listen to the mad man
And we won't care nor buy them, thanks EA.

Total iOS/Android devices sold since 2007: 1.5 billion
Total gaming consoles sold ever (combining handhelds and consoles from all companies since the beginning of gaming): ~1.3 billion

 
The people talking about buttons really need to check out games like Rayman Jungle Run and Bastion. And like others have said, games that insist on using buttons will be helped by iOS 7.

I really don't get how people still underestimate mobile/tablet gaming so much. I've played games on the iPad that are already visually stunning, and technology isn't going to stop moving forward. It's not like tablet/phone gaming is going to kill consoles; people should just embrace both. I'm not saying that the next generation of phones and tablets will be as strong as the PS3 or the 360, because I have no clue, but that would be awesome for everyone as far as I'm concerned.
 

Spira

Banned
Sure, and a ten-minute max battery, with your standard 16/32/64 gigs of space... which, lemme tell ya, is a lot...
 

LukeTim

Member
The people talking about buttons really need to check out games like Rayman Jungle Run and Bastion. And like others have said, games that insist on using buttons will be helped by iOS 7.

I really don't get how people still underestimate mobile/tablet gaming so much. I've played games on the iPad that are already visually stunning, and technology isn't going to stop moving forward. It's not like tablet/phone gaming is going to kill consoles; people should just embrace both. I'm not saying that the next generation of phones and tablets will be as strong as the PS3 or the 360, because I have no clue, but that would be awesome for everyone as far as I'm concerned.

Aren't most people fairly happy to accept the PS Vita as a "handheld PS3", and yet it has the same sort of hardware as any modern smartphone...

As others have said before, GPU advancements like PowerVR Rogue/6/RGX, combined with the gaming-friendly stuff Apple are doing with iOS 7 and the Xbox Live/Steam-type stuff Google are doing with Android, are going to make mobile gaming a real competitor to traditional handhelds.

Sony and Nintendo will soon really have to step their game up (Nintendo less so; they've proven they know how to differentiate themselves from the competition and maintain support).
 

2San

Member
Total iOS/Android devices sold since 2007: 1.5 billion
Total gaming consoles sold ever (combining handhelds and consoles from all companies since the beginning of gaming): ~1.3 billion
This is a more apt comparison:
[image: chart breaking down global game revenue by platform segment]

Mobile is still doing pretty decent either way.
 

Sushen

Member
The thing is that we're still talking about $5 to $10 games, and the budget will dictate the kind of games you'll see on tablets, not just the GPU.
 

2San

Member
Yup, those YoY figures are interesting.

Where's the slice for handhelds? Bundled in with consoles?
Yeah, it's bundled with consoles. Source
There's another image there where they bundle handhelds with tablets. >_< I think they really want to sell their reports, so they aren't showing very specific data.

edit: It says this though:

Game revenues generated by tablet and smartphones will gross 18% of the global games market, surpassing $12bn, which is roughly double the amount spent on games for handheld consoles.

So mobile gaming (including tablets) is actually double that of handheld gaming? :x
 

LukeTim

Member
The thing is that we're still talking about $5 to $10 games, and the budget will dictate the kind of games you'll see on tablets, not just the GPU.

Though, surely with a much larger potential market and a cheaper digital distribution system you can more easily afford to take risks with lower pricing.
 