2blackcats
Member
I bet the OS could run a lot faster if we could hide/delete/organize everything we wanted.
Just folders for what we want right away and library for everything else.
I see you've been to OS college.
I bet the OS could run a lot faster if we could hide/delete/organize everything we wanted.
Just folders for what we want right away and library for everything else.
I'm afraid you're confusing Asynchronous Compute with running a full-blown OS.
A modern x86-64 processor has tons of features that a GPU doesn't have and it supports over 1000 different instructions. A GPU/SPU is specialized at certain tasks (mainly linear algebra/matrix multiplication). Its feature/instruction set is limited compared to a CPU and that's why they're able to process many teraflops. It's a different design philosophy, depending on what you want to do with a limited transistor budget. A CPU is a jack of all trades and master of none, while the GPU/SPU is a specialized, streaming processor.
Shaders have nothing to do with traditional x86/CPU code.
Shaders are made to process graphics, physics and parallelizable stuff in general.
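To make the "linear algebra" point concrete, here's a minimal sketch in plain Python (illustrative only, not actual shader code): every cell of a matrix product is independent of the others, which is exactly the shape of work a GPU's thousands of ALUs are built for.

```python
# Matrix multiplication: the bread and butter of GPU-style workloads.
# Every output cell (i, j) depends only on row i of a and column j of b,
# never on another output cell, so on real hardware each cell could be
# computed by its own shader lane in parallel.

def matmul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

identity = [[1 if i == j else 0 for j in range(4)] for i in range(4)]
m = [[i * 4 + j for j in range(4)] for i in range(4)]

assert matmul(m, identity) == m  # multiplying by I gives the matrix back
```

A CPU runs those multiply-adds one (or a few) at a time; a GPU runs huge batches of them at once, which is where the teraflop numbers come from.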
Well, apart from the fact that it's just not very good at it, there's also the potentially missing circuitry. I don't know enough to say for sure, but I wouldn't be at all surprised if the GPU simply lacked the transistors required to perform vital operations, because why would you waste silicon on functionality nobody will ever call? So maybe it can run the entire OS and maybe it can't, but what difference does it really make if it's a terrible idea to begin with?
That's what I was trying to get across in my hQ explanation, actually. It's dumb to force your decision-maker to waste all of its time crunching numbers, but it's equally stupid to make your number-cruncher try to make decisions. That's the advantage of GCN. Everything is designed around the idea of letting the appropriate chip handle the appropriate function, and letting the two chips work as independently from one another as possible. Running the entire OS on the GPU is just as dumb as running the entire OS on the CPU.
Having information like visibility magically update for the CPU is a huge fucking win, so we should be talking about stuff like that, or what devs are gonna run on this extra CPU core, instead of bickering about moot points like running the OS entirely on the GPU. Yes?
Fight!!
I suspect you don't understand at all what the quote is talking about. You can't do all the CPU stuff with GPGPU. It's talking about particles and fluids, basically the physics stuff.
"Norden also revealed an extended Direct X 11.1 featureset, a new asynchronous compute architecture, "greatly expanded shaders", and features "beyond" Direct X 11, OpenGL 4.0, and OpenCL. Sony's Compute allows developers to run arbitrary code on the GPU instead of the CPU, which benefits fluids, particles and post processing. Compute has access to the system's full set of "very expensive, very exotic" 8GB 256-bit GDDR5 unified memory."
Look above; see where it says that it is beyond OpenCL?
I'm not the one who brought up GPUs running an OS; I was just pointing out that it could.
Hopefully third parties get the whole core relatively soon.
I still want to know if the RAM reserve has opened up at all yet.
I'm not sure how feasible it is, but in light of this, I hope some devs patch their previously released titles to help even out framerates.
This video explains the difference between serial (CPU) and parallel (GPU) processing:
https://www.youtube.com/watch?v=-P28LKWTzrI
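For anyone who'd rather see the serial-vs-parallel idea in code than in a video, here's a rough sketch in plain Python (caveat: CPython threads won't actually run this faster because of the GIL; the point is the shape of the decomposition, not the speedup).

```python
from concurrent.futures import ThreadPoolExecutor

data = list(range(100_000))

# Serial (CPU-style): one worker walks the whole list front to back.
serial_total = sum(data)

# Parallel (GPU-style): slice the work into independent chunks,
# reduce each chunk on its own, then combine the partial results.
chunks = [data[i:i + 25_000] for i in range(0, len(data), 25_000)]
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel_total = sum(pool.map(sum, chunks))

assert serial_total == parallel_total  # same answer, different shape of work
```

The decomposition only works because each chunk's sum is independent of the others; that independence is what "parallelizable" means in practice.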
I don't understand why you persist in sustaining something so absurd. That's not what your quote is talking about. Seriously.
I know the difference, and that's only telling you what they are good at, but there is nothing stopping the GPU in the PS4 from being able to run an OS.
He probably believes he'd obtain more CPU resources with fewer sacrifices by using the "more powerful" GPU, I imagine.
Where in the hell does this line of reasoning or idea come from?
Why would anyone want to use precious GPU resources to handle OS functions? Even if one could.
Where in the hell does this line of reasoning or idea come from?
Why would anyone want to use precious GPU resources to handle OS functions? Even if one could.
All that power
Where in the hell does this line of reasoning or idea come from?
Why would anyone want to use precious GPU resources to handle OS functions? Even if one could.
No one wants to run an OS on a GPU; I was just telling him that it could.
I note that using the media player doesn't kill the game, only suspends it, so that 3GB includes enough memory to run non-game apps. I also note that the Bone reserves a similar amount, and it has a similar OS + App + Game model. Note that I am not noting this in a system-wars manner, just that it's a comparable device doing comparable things.
I imagine it will if it already has not. 3GB reserved seems kind of excessive.
https://s-media-cache-ak0.pinimg.com/originals/96/d2/f7/96d2f7f1aa8250b3a1925521d3ae70fa.gif
Modern GPUs and GPGPU programming languages are Turing complete. So, in principle, you could run a very simplistic "OS" on a GPU, ignoring hardware-specific issues like the lack of a ring protection model and (I assume) missing features in the areas of memory and interrupt management.
But it would be fucking slow and useless.
All that power
I've been complaining about these massive reserves since we heard about them. 3GB is absurd, especially with the PS4's featureset.
I imagine it will if it already has not. 3GB reserved seems kind of excessive.
I am currently using only 2.4GB on Win 10 with multiple tabs open and a full OS at my fingertips.
I'm sure it will before long. First parties are likely "beta testing" the full core update.
Not happy about Sony keeping the seventh core's full power exclusive to PS4. Come on Sony, let third parties go Super Saiyan as well.
fate/stay night [unlimited blade works]
What anime is this?
Thank you good sir. Looks good.
fate/stay night [unlimited blade works]
Wut?
Not happy about Sony keeping the seventh core's full power exclusive to PS4. Come on Sony, let third parties go Super Saiyan as well.
You need a receipt to know that a general purpose processor can do general purpose processing?
Wut?
But it's not.
You need a receipt to know that a general purpose processor can do general purpose processing?
Just to make it clear:
Every computing device that features at least
{add,sub}
combined with the boolean operators
{and,or,not}
can run arbitrary code, yes. Or perhaps let's say you can express every boolean function with these. But an OS is not only software; it also relies on a lot of things like access to memory areas reported by the BIOS and a dedicated boot loader, processor rings are also inherently needed, and OSes are not designed to boot off anything other than a dedicated main processor.
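A quick sketch of that claim in plain Python (hypothetical helper names; I'm also borrowing bit shifts for bookkeeping, which isn't strictly in the {add,sub,and,or,not} set): XOR falls out of AND/OR/NOT, a full adder falls out of XOR, and integer addition falls out of chaining full adders.

```python
# Building up from the minimal boolean set {and, or, not}.
def NOT(a): return 1 - a
def AND(a, b): return a & b
def OR(a, b): return a | b

def XOR(a, b):
    # Derived gate: (a OR b) AND NOT (a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def full_adder(a, b, carry):
    # One-bit addition: returns (sum_bit, carry_out)
    s1 = XOR(a, b)
    return XOR(s1, carry), OR(AND(a, b), AND(s1, carry))

def add(x, y, bits=8):
    # Ripple-carry addition: chain full adders across the bit positions.
    carry, out = 0, 0
    for i in range(bits):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        out |= s << i
    return out

assert add(23, 42) == 65
```

That's the "can run arbitrary code" half of the argument; the rest of the post is right that it's nowhere near the whole story for booting an OS.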
For science it could be an interesting project but we won't see it on PS4, ever.
As interesting as this is (and I hate to be that person), can you make a new thread for this random topic or something? When I see that this thread is updated I kind of hope that it's talking about the 7th core and not about running an OS on the GPU or whatever.
I never said it would or should happen on the PS4; someone said it was impossible, and I pointed out that it could be done.
Think you got the wrong thread. I was just reading that one, too, then came in this one. Confusing.
EDIT: wrong thread.
A little on the subject: it's neat they let devs have access, I'm hoping for more stable games.
Think you got the wrong thread. I was just reading that one, too, then came in this one. Confusing.
I don't understand the potential, but can someone post it in Dragonball terms?
Yes it is
"Sony's Compute allows developers to run arbitrary code on the GPU instead of the CPU"
With the PlayStation 4, it's even such things as the shader cores have a beautiful instruction set and can be programmed in assembly. If you were willing to invest the time to do that, you could do some incredibly efficient processing on the GPU, for graphics or for anything else. But the timeframe for that kind of work would not be now. I don't even think it would be three years from now.
Yes, arbitrary code... assuming the instructions in the code CAN be processed by the GPU. GPUs aren't equipped to deal with running an ENTIRE OPERATING SYSTEM on them alone. OSs have a bunch of branching involved, and GPUs aren't made for that.
When you read "cheap branching", compare it to GPUs before GCN; don't take it as a general rule or as being on the same level as CPUs.
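A rough illustration of why branches are awkward on a GPU (plain Python lists standing in for GPU lanes; illustrative only): instead of branching per element, GPU-style code tends to compute both sides of an "if" for every lane and then blend the results with a mask, i.e. predication.

```python
x = [-3, 1, -2, 5]

# CPU style: an actual branch per element.
cpu = [v * 2 if v > 0 else 0 for v in x]

# GPU style: compute BOTH sides for all lanes, then blend with a mask.
# No per-lane control flow; the "branch" becomes arithmetic.
then_side = [v * 2 for v in x]
else_side = [0 for _ in x]
mask = [int(v > 0) for v in x]  # 1 where the condition holds, else 0
gpu = [m * t + (1 - m) * e
       for m, t, e in zip(mask, then_side, else_side)]

assert cpu == gpu == [0, 2, 0, 10]
```

The cost is visible in the sketch: both sides get evaluated for every lane, which is why branch-heavy code (like an OS scheduler) is a poor fit for the hardware.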
Lol, dude, yes they were. Cell was a completely different beast; the PPE was used to send out as many tasks as possible to the SPEs, thereby increasing performance. There have been plenty of Naughty Dog and Insomniac videos and articles about how they used Cell over the years. They were very proud of it; in any case, the PS3 OS ran off the 7th SPE.
Exactly.
No need to be uppity; I think the implied question above was whether the OS also used the PPE and had a partial reservation, in addition to part of the OS running on an SPU.
Whether it /COULD/ run solely on an SPU is a different question, but it would be even slower and crappier than it already is. The SPUs were great at SIMD, at the expense of everything else, and that's not what an OS is sitting there doing all day. An OS with no L2 cache [it had local memory, but then you'd have to hand-manage every bit of OS memory that should be there for performance] or branch prediction... sounds nightmarish.
Exactly.
A dev said that it ran the OS and the game in separate soft threads (PPE 1.6 GHz + PPE 1.6 GHz*).
* "Well, as we understand it, the Xbox 360 CPU has two threads per core, and that threading is handled in a fairly primitive manner, with the processor effectively working as six 1.6GHz threads."
Source: http://www.eurogamer.net/articles/digitalfoundry-2015-vs-backwards-compatibility-on-xbox-one
Jaguar runs at 1.6 GHz, but we now have 8 cores and its microarchitecture is way more advanced (out-of-order execution/better branch prediction vs PPE in-order/poor branch prediction).
That's why I don't understand why people are so quick to dismiss Jaguar when the Cell PPE is not that great either. The SPU part was its only saving grace, but as you said, that's limited to SIMD operations.
Maybe this could explain it: https://en.wikipedia.org/wiki/Megahertz_myth
I mean, if ND was able to run TLOU's NPC AI in PPE @ 1.6 GHz, then imagine what they can possibly do with 7 Jaguar cores @ 1.6 GHz.
TL;DR: Cell is not as great as people think it is (minus its SPU/flops grunt), and yet it served the PS3 for almost a decade. There's no reason to think that the PS4 won't be able to last until 2018-2020.
Because a Pentium 4* from 2004 was able to beat it in most benchmarks (minus the SIMD/flops benchmarks, of course).
* Mind you, NetBurst wasn't exactly the most efficient x86 microarchitecture.
SIMD is not suitable for everything the way you think it is. It's like having a super-smart kid who is really good at maths (linear algebra/matrix multiplication) but pretty much dumb at everything else (history, biology, literature etc.). That's what the Cell SPU/GPGPU is. It excels in one field, but it's useless at everything else.
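The analogy above is really about data dependence. A minimal sketch in plain Python (illustrative, not real SIMD code): independent per-element work maps onto SIMD lanes, while a recurrence where each step needs the previous result does not.

```python
data = [1, 2, 3, 4, 5, 6, 7, 8]

# SIMD-friendly: each element is independent, so all eight "lanes"
# could be computed at once on parallel hardware.
doubled = [v * 2 for v in data]

# SIMD-hostile: a recurrence. Each step depends on the previous
# result, so the work forms a chain that no amount of parallel
# silicon can split up.
running = []
acc = 0.0
for v in data:
    acc = acc * 0.5 + v
    running.append(acc)

assert doubled == [2, 4, 6, 8, 10, 12, 14, 16]
```

The first loop is the "maths prodigy" workload the SPUs and GPUs devour; the second is the kind of serial, dependent work (like most OS and game logic) that they're bad at.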