Ah okay. I clearly misunderstood what I was reading then, thanks. Hope it is the first, and I can still disable it.
If it's dedicated there would be no performance hit so no need to disable it.
Could be wrong, but no. They mostly have software which enables use of the GPU instead of the CPU to encode/decode video.
Also, this sharing thing makes me think about what Sony is trying to do. The obvious logic points at YouTube sharing, screenshots, etc. But this could also mean that Sony will try to do something with it and introduce some app/virtual space, something like HOME, or HOME as a native app. Maybe integrated with the OS (as people wanted years ago).
The hardware seems absolutely awful. Not even a dedicated GPU. AMD will have to make the APU 30 times more powerful just to match my gaming PC.
Joke post? As a long-time PC gamer (5.25 represent) - you cannot directly compare console/PC architecture. I'd give you reasons but they've been repeated dozens of times in this thread and, apparently, do not penetrate ignorance.
But this time consoles will be using PC parts
Not really. A lot less so than, say, the original Xbox. Nothing about the Durango architecture (what we know of it) sounds like a PC, other than maybe the CPU. The PS4, with its UMA and 14+4 setup, doesn't sound very comparable either.
Closed box versus open hasn't changed.
I like how everyone on GAF suddenly becomes an electrical engineer when it gets to system launch time.
How am I playing smart? I'm just reacting with amusement because the same thing happens with spec speculation.

What if you are actually an electrical engineer?
I am, but I am not very well versed in the details of the hardware being used. I rely on the great minds on GAF to break everything down. Everybody is skilled in different areas. If you don't really know you shouldn't be "playing" smart because most others can call your bluff.
"PC Parts"... oh dear.
Applorange:
![]()
The great taste of an orange with no need to peel it. I need one of these.
Want.
Argyle said: Just going by the name, this is just a scaler on the video output. "Scanout" is the process of outputting the front buffer to the video output.

Well, scanout can do other processing than scaling (on the PS2 it added support for two "display planes", akin to the fairy dust talked about for Durango), but I'm with you that I don't see how video/frame analysis would fit into any of it.
EDIT: Mind you, Durango still has that 32 MB of RAM, but it still has lower bandwidth than that GDDR5. Sure, the latency (time to be accessed) is lower, but for GPU workloads that's largely irrelevant.
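The bandwidth point can be made concrete with some back-of-the-envelope arithmetic. The figures below are the rumored ones circulating in this thread (~68 GB/s DDR3, ~102 GB/s ESRAM, ~176 GB/s GDDR5); they are assumptions for illustration, not confirmed specs:

```python
# Back-of-the-envelope: time to move one 1080p 32-bit RGBA frame buffer
# at each rumored peak bandwidth. All bandwidth figures are rumors.

FRAME_BYTES = 1920 * 1080 * 4  # about 8.3 MB per frame

bandwidths_gb_s = {
    "DDR3 (Durango main RAM, rumored)": 68,
    "ESRAM (Durango 32 MB, rumored)": 102,
    "GDDR5 (PS4, rumored)": 176,
}

for name, gb_s in bandwidths_gb_s.items():
    seconds = FRAME_BYTES / (gb_s * 1e9)
    print(f"{name}: {seconds * 1e6:.0f} us per frame copy")
```

Peak numbers like these ignore latency and contention entirely, which is exactly why the latency-versus-bandwidth argument above matters for CPU work but much less for GPU streaming workloads.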
They made an article based on a forum post? Lol.
Exactly. They're taking shit that people might have made up and turning it into news stories.
Well, I got one for them. I know exactly what's going to happen on Feb 20. Sony is going all streaming. Everything is gonna be based on Gaikai tech.
I'm looking forward to having my name posted on IGN. LOL!
It's a new IP for definite. I don't want to say more than that because we're only a few months away and it'll be much more impressive to have a full reveal. It'd be unfair to GG, and this is going to be a really huge event for them. Plus I know very little, and there's no point putting it out there. It'd be more detrimental than anything else. I can tell you it's not like Killzone.
I can spill the following comfortably though since I think this developer is a bit of a joke, and I think people are beginning to suspect this anyway. Versus XIII will have another unveiling later this year regarding development for the next generation. It won't have the "Versus XIII" part in the title. It will definitely release in 2014. You can have my account deleted if this information turns out to be inaccurate. News is fresh on this and I'm really confident about it. FF15 and FF16 are underway, both next generation projects. There's a chance FF15, in particular, may be an exclusive release. Kind of. And with that, I'm off to sleep.
Closed box, everything on die, custom design, a different take on different bottlenecks.
As Timothy Lottes (the developer of FXAA) said, a PC can generate 10-100x more overhead than a console. And consoles can use low-level APIs directly, where PCs go through high-level APIs.
If a 680 were in the next-generation consoles, I doubt any PC hardware in 3-5 years would match them, because efficiency is just better on consoles.
The PC's main problem is that it just can't get at the full power of its hardware. Theoretical output can be the same, but practical output is vastly different.
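A toy model makes the overhead argument easier to see. All the per-call costs below are invented for illustration; only the ratio matters, and it lands inside the 10-100x range Lottes described:

```python
# Toy model of per-draw-call API overhead: every draw call in a frame pays
# a fixed validation/driver cost on the CPU. Costs here are made up purely
# to illustrate the ratio; they are not measured figures.

def frame_overhead_us(draw_calls: int, per_call_cost_us: float) -> float:
    """Total CPU time spent on API overhead for one frame, in microseconds."""
    return draw_calls * per_call_cost_us

DRAW_CALLS = 2000            # a plausible draw-call count for one frame
HIGH_LEVEL_COST_US = 10.0    # assumed thick PC driver path per call
LOW_LEVEL_COST_US = 0.5      # assumed near-direct console submission per call

pc_us = frame_overhead_us(DRAW_CALLS, HIGH_LEVEL_COST_US)
console_us = frame_overhead_us(DRAW_CALLS, LOW_LEVEL_COST_US)
print(f"PC-style overhead: {pc_us / 1000:.1f} ms per frame")
print(f"Console-style overhead: {console_us / 1000:.1f} ms per frame")
print(f"Ratio: {pc_us / console_us:.0f}x")  # 20x with these assumed costs
```

The model is deliberately crude: it ignores batching, multithreaded drivers, and everything else, but it shows why a fixed per-call cost eats directly into a 16.7 ms frame budget.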
He is probably talking about the x64 architecture. Now, I don't really know anything about tech, but it will be really interesting to see how console-to-PC ports play out next gen, especially if it's true that the Xbox will use DDR3 memory.
Don't take my word as fact. Until the 20th of February we can only speculate.
If rumors are true then we have:
Audio Processor (ACP) - DSP hardware to offload audio work from the CPU
Video encode/decode (VCE/UVD) units - for on-the-fly video encoding/decoding and screen capture, as mentioned in EDGE
Display ScanOut Engine (DCE) - offloading image analysis from the CPU for use with an improved PSEye, i.e. using the PSEye without any real performance hit
Zlib decompression hardware - helps with reading from the HDD, probably more efficient than the fastest HDD (but still slower than an SSD)
I really love the design choices Sony made, if the rumors are true. Though I would change the HDD to an SSD.
I wonder whether Zlib decompression hardware will be used in engine as well.
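To make the zlib point concrete: the win is that the slow HDD only has to deliver the compressed bytes, and a cheap decompress step recovers the rest. A minimal sketch with Python's `zlib` module (the asset data here is invented and deliberately repetitive; real ratios depend entirely on the asset):

```python
import zlib

# Simulate a game asset that compresses well (repetitive data, e.g. level
# geometry or text tables). The content is made up for illustration.
asset = b"vertex 1.0 2.0 3.0\n" * 10_000

compressed = zlib.compress(asset, level=6)
ratio = len(asset) / len(compressed)
print(f"raw: {len(asset)} bytes, compressed: {len(compressed)} bytes "
      f"(ratio {ratio:.1f}x)")

# With dedicated decompression hardware, the disk only needs to deliver
# len(compressed) bytes to get len(asset) bytes into memory.
assert zlib.decompress(compressed) == asset
```

Engines already do this in software on spare cores; a dedicated unit would just make it free, which is presumably why people are asking whether engines will use it directly for asset streaming.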
All I know is that people will question reality when GT6 is unveiled.
I wonder whether Zlib decompression hardware will be used in engine as well.
Then people will question PD on why Turn 10 delivers something better in a shorter amount of time.
The circle of life (..errr...video games).
In what engine?