US, yes. Europe, not really.

Wasn't the Xbox more expensive as well? I don't know.
I didn't say the iPhone's success was predicated on its multifunctionality. It's clear that the iPhone's success was due to its simply being in a league of its own compared with other devices available at the time.

App stores didn't, but installing applications on Symbian devices existed way before the Apple ROKR, never mind the iPhone.
There were plenty of smartphones out there that offered what the iPhone did and more. If you think the iPhone's success has anything to do with multifunctionality over touch, you are misguided.
Touch and only touch is what made the iPhone what it is today. Kinect cannot and will not have the same impact.
Boom.gif
It doesn't support the Wii U because Epic has no time for that piece of plastic.

But they did have time to have their UE3 support the Wii U!
I would believe that if their deductions about the console's power or the significance of the specs weren't so incorrect.
The majority of them have no idea there is a difference between GDDR and DDR.
They are the minority. The majority do appreciate people like you giving us information like this.
Ignore the haters!
Yeah I can vouch for that
Sony went unconventional and pooled it. MS also went ahead and pooled it, but they went cheap (poor design), and then they tried to circumvent that shortcoming by tacking ESRAM onto an already huge APU chip (poor design), making it even bigger.

It was Microsoft with the Xbox 360 who first employed a unified memory architecture. The PS3 had two separate 256 MB pools.
He confirmed this rumor off site. I can vouch for that. Bruce can too.

Oh shit.
Could the difficulty in making chips in any way lead to unstable hardware? This stuff is scary based on their track record.
I really hope this isn't true. I know that in reality a small downclock won't make a huge difference in terms of graphics, but it will widen the performance gap between Xbox One and PS4. Sooner or later you will have multiplatform games where the lead platform is PS4, and they will be forced to gimp them down to Xbox One.
That will be a terrible day.
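The gap being described here can be put in rough numbers. A minimal sketch, assuming the shader counts that were being reported at the time (768 vs 1152 GCN-style cores) and a hypothetical 800 MHz downclocked rate; none of these figures are confirmed specs:

```python
# Rough sketch: theoretical single-precision throughput for a GCN-style GPU.
# Shader counts and clocks are rumored/assumed figures, not confirmed specs.

def peak_tflops(shader_cores: int, clock_mhz: float) -> float:
    """Each core does 2 ops per cycle (fused multiply-add)."""
    return shader_cores * 2 * clock_mhz * 1e6 / 1e12

xbox_one_rumored = peak_tflops(768, 800)   # rumored downclock scenario
ps4_assumed = peak_tflops(1152, 800)       # widely reported PS4 figures

print(f"Rumored Xbox One: {xbox_one_rumored:.2f} TFLOPS")  # ~1.23
print(f"PS4:              {ps4_assumed:.2f} TFLOPS")       # ~1.84
```

The absolute numbers matter less than the ratio: at equal clocks, the difference tracks the shader count directly.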
1 TF at 1080p. What happens when you down-res, lower the effects (which are already lowered relative to PS4), and cut out some of the fancy, expensive tessellating features?
A dog barf.
He confirmed this rumor off site. I can vouch for that. Bruce can too.
It was Microsoft with the Xbox360 who first employed a unified memory architecture. The PS3 had two separate 256 MB pools.
If this is all really true, I'm hoping at least they go the iPad route and release a new hardware revision every two years or so, so that this won't be a major problem going forward. For example, maybe in 2015 they release an Xbox One with 12 GB of GDDR5 and a 2+ TF GPU.
Updated the OP by removing some questionable sources and adding new confirmation. Still on page 10 so I might add more before I finish the thread!
I didn't say the iPhone's success was predicated on its multifunctionality. It's clear that the iPhone's success was due to its simply being in a league of its own compared with other devices available at the time.
But that has nothing to do with the fact that iPhone was clearly conceived and sold as an all-in-one device. And the fact that it is both an all-in-one device and best-in-class had made me hope that the Xbone could be the same thing. It's clear, however, that this thing is more Nokia N91 than iPhone.
It's not modest. It's just poorly designed. It just keeps reminding me more and more of Cell.
So the real question is whether MS can afford this, not from a cost perspective but in PR terms: it risks becoming another Windows 8, and they can't have another RROD issue.
So wouldn't this be the best time to figuratively start "from scratch" with this system? It allows the tool pipelines to be in place for developers, lets them at least attempt to have a message about indies, and shows a little "hey guys, we messed up," even if you're trying to spin it back to MS's message.
After all it isn't where you start it is where you end up.
I believe the N64, Saturn and PS3 are the consoles people point at with the "curse."
No. Sony's designers showcased their vision in a recent interview on a Japanese website, where they clearly stated that they would switch to 8x 8 Gbit chips when they become available.
For everyone asking: this information is all pretty recent. Around the PlayStation Meeting, the Xbox One was way behind (OS + hardware). Engineers were scrambling to get things sorted out.
It turns out, they didn't sort it out. The OS you saw was a complete and total lie. The current plan is to get the yields up, lower the clock rate, and to have enough units out for a sell out in the Fall.
For those asking how this affects performance: to be perfectly frank, it is nothing turning down features won't solve. The mass market will never notice a difference between 1080p and 900p; neither will they care about dynamic shadows, global illumination, or tessellation. Go to your PC, turn shadows from Ultra to Medium, disable tessellation, and lower the resolution to 900p, and you'll find games run totally fine.
Microsoft is simply behind, and it's now time to make drastic decisions. I don't think anyone is happy about the lower clocks, but no one is depressed about it either. The Xbox One is an all-in-one device, and that's how it will be marketed.
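For a sense of scale on the 1080p vs 900p comparison above, the raw pixel arithmetic (assuming 1600x900 for "900p"):

```python
# Pixels per frame at each resolution: dropping from 1080p to 900p cuts
# the per-frame pixel count by roughly 31%.

def pixels(width: int, height: int) -> int:
    return width * height

p1080 = pixels(1920, 1080)  # 2,073,600 pixels
p900 = pixels(1600, 900)    # 1,440,000 pixels

saving = 1 - p900 / p1080
print(f"900p renders {saving:.0%} fewer pixels per frame than 1080p")
```

Shading cost scales roughly with pixel count, which is why a modest resolution drop can absorb a meaningful GPU deficit.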
Unfortunately it's not. There are performance issues. In some cases, it's quite significant.
He confirmed this rumor off site. I can vouch for that. Bruce can too.

Thuway, Bruce, Matt, Gopher all seem legit, so this may not be needed, but CBoaT, if you're around...
So what is this Curse?
NES ~61.91 million, SNES ~49.10 million, N64 ~32.93 million: a steady decline as the competition got tougher, and the fourth console, the GameCube, sold ~21.74 million, continuing the Nintendo decline.

Master System ~10-13 million, Genesis ~40 million, Saturn ~9.5 million. Curse of what? Not being the Genesis? Every other Sega console sold around the 10 million mark, even the Dreamcast at ~10.6 million.

PS1 ~102.49 million, PS2 ~155 million, PS3 ~80 million and counting; it's going to end up right around the same numbers as the PS1.
So what is this Curse?

That's an easy one. Selling less than your predecessor.
M°°nblade;61392541 said: That's an easy one. Selling less than your predecessor.
That's coming from someone who has put thousands of dollars into the Xbox 360 ecosystem.
That's not really true. The conventional way of doing things in the PC architecture is separate VRAM and system RAM. Sony went unconventional and pooled it. MS also went ahead and pooled it, but they went cheap (poor design), and then they tried to circumvent that shortcoming by tacking ESRAM onto an already huge APU chip (poor design), making it even bigger.

Well, MS already did the pooled UMA design thing with the X360 while adding a fast EDRAM buffer that helps more than it hurts, AFAICT, just looking at released titles over the years. Judging by what rumors and leaks have revealed, MS, having a long view with their DX roadmap and experience from their own studios, seems to have designed their hardware to address the problem of feeding the units work data all of the time, as well as trying to wipe out most of the cost of cache being flushed to feed in commonly used data. That seems to be a more direct approach than the more general-purpose one Sony took, AFAICT, but I'm no hardware or graphics tech-head.

MS, it seems to me, built their version of the core hardware both they and Sony share to be potentially much better at maximizing the benefit of partially resident textures, which UE4, id Tech 5/6, and CE3+ support. In a general way, Sony added and then maxed out their pipes to the single RAM pool and CPU/GPU caches, while MS did much the same, except they focused on having a high-speed on-chip cache and on adding more dedicated copy/load/store/compress units to ease the burden on the CPU for data movement. MS's solution seems more complicated, but when these newer APU designs' goals are accounted for, memory virtualization should make managing it a relative snap and probably mostly hidden from programmers, unless they opt for more micromanagement of prefetching and of how data is laid out to be consumed in the 32 MB cache.

I don't see how ESRAM is a mistake when it's clear that it has major benefits for actually utilizing the hardware and for limiting the loss going from potential performance to achieved performance. That's the same goal as building out a high-bandwidth connection to GDDR5 RAM and adding more ways to access cache, bypass it, and move data through it.

If Sony's approach was really just flat-out superior, why wouldn't MS have gone with it, considering that it's so similar to what they had in the X360 and that it would be less complex as well as less costly? MS sees the benefit, and I'm not convinced it's just a Rube Goldberg method of achieving the same goal simply because they're using DDR3 for their main work RAM.
The problem is that the fabrication process they are using was (AFAIK) never tested for such a big die. They took a (stupid) gamble and bet their horses on achieving yields despite the die-size inflation. My guess is they crossed their fingers and turned a blind eye to the risk.

Perhaps it's a mistake, but we're not quite near launch yet, with less than six months or so to go. MS could just take a big hit on cost upfront by taking what they get out of current manufacturing and eating the higher cost/lower yields until the process is smoothed out over the next year, rather than downclocking, in order to make target shipments and the launch window. It's not uncommon for new consoles to be in less than great shape so close to release; MS was behind with the X360 and was forced to demo in-progress games with less than ideal performance at E3 '05 on their beta kits, IIRC, which don't completely approximate the final hardware's speed and behavior. We could be looking at something very similar with the X1. In any case, rumors are rumors, and even when they are right, they can miss a lot of important details that can make them seem more major than they really are. I'm not going to underestimate MS's plans and execution when they came out fine twice before.
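The bandwidth numbers underlying this whole DDR3+ESRAM vs GDDR5 argument can be sketched from bus width and transfer rate. The figures below (256-bit DDR3-2133 plus a ~102 GB/s ESRAM on one side, 256-bit GDDR5 at 5500 MT/s on the other) are the commonly reported ones from the rumor mill, not confirmed specs:

```python
# Peak theoretical bandwidth: (bus width in bits / 8) bytes per transfer,
# times millions of transfers per second, expressed in GB/s.
# All configurations below are reported/rumored figures, not confirmed.

def peak_gbps(bus_bits: int, mega_transfers: float) -> float:
    return bus_bits / 8 * mega_transfers / 1000

ddr3 = peak_gbps(256, 2133)   # main DDR3 pool
gddr5 = peak_gbps(256, 5500)  # unified GDDR5 pool
esram = 102.0                 # reported on-die ESRAM figure (GB/s)

print(f"DDR3-2133, 256-bit: {ddr3:.1f} GB/s")   # ~68.3
print(f"GDDR5 5500 MT/s:    {gddr5:.1f} GB/s")  # 176.0
print(f"ESRAM (reported):   {esram:.1f} GB/s")
```

The design question in the posts above boils down to whether a small fast on-die pool in front of slow main RAM can match one uniformly fast pool in practice, since the ESRAM bandwidth only applies to whatever fits in its 32 MB.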
Sony was always pretty good on the hardware side (other than Cell being hard to program for).
Sucks to see this happen, since multiplatform titles might be affected as a result (of course, they could program for the PS4 and then downscale to the Nextbox).
However, Microsoft's arrogance seems to be a facade for the real things going on behind the scenes.
And how quickly do you think this can be done? Something like this is probably talking two years minimum. By then PS4 would have an insurmountable lead and this gen would be lost.
If you're MS, you just put some lipstick on and spin it the best you can. We've already seen that with "5 Billion Transistors" and "Infinite Power of the Cloud." Spin, spin, and more spin. That's the only effective course. The used game DRM also makes sense. Because if Sony doesn't match this, it's one way to suddenly get EA to heavily favor your system.
Haha yeah, you'll be adding A LOT more.