
vg247-PS4: new kits shipping now, AMD A10 used as base, final version next summer

Activision and Microsoft are really close. Sony has probably been playing it as close to the chest as possible, so it would be kind of a risk to let in anyone they don't have a great relationship with.

But that was last gen. The exclusivity is over. With the new gen you'd think they would like to start fresh.
 

Massa

Member
did they change the name so people wouldn't start figuring out they make awful games?

No, they changed their name because they realized trying to grow a studio by making console games for big publishers is a losing proposition so they're chasing greener pastures in the mobile space, like most other independent developers.
 

Kagari

Crystal Bearer
Almost assuredly.

Edit: I haven't spoken to many people working for the Japanese publishers, because they tend to keep their western staff in the dark as much as possible. More than you'd think possible. That said, I think it's a safe bet that Luminous was built with the PS4/Orbis in mind.

They did seem to hint toward that when I spoke to the Luminous team this past E3.
 

RoboPlato

I'd be in the dick
But that was last gen. The exclusivity is over. With the new gen you'd think they would like to start fresh.
MS will never let go of their CoD DLC timed exclusivity until the bubble has burst. Sony has also had the BF DLC timed exclusivity for a while, which I assume Activision wouldn't be too happy about. Activision just doesn't seem like a company that Sony would be able to trust if they're trying to control leaks as much as possible, because Microsoft could just spend enough money to outdo them. They did a good job up until now, but MS is likely done with their silicon so it doesn't matter anymore.
 

androvsky

Member
That's actually going to make things interesting. There will be a "clear winner" in multiplats as a result.

More likely a "clear winner" per multi-platform engine, and I think we'll all pray there's more than one popular one at the beginning of the generation.

And I think the last thing any third party publisher wants is a clear winner. All the platform holders (including Sega) have a history of being jerks when they're winning, or even feel like they're winning.
 
They did seem to hint toward that when I spoke to the Luminous team this past E3.

Bueno.

More likely a "clear winner" per multi-platform engine, and I think we'll all pray there's more than one popular one at the beginning of the generation.

And I think the last thing any third party publisher wants is a clear winner. All the platform holders (including Sega) have a history of being jerks when they're winning, or even feel like they're winning.

Well, if Sony is going with higher bandwidth RAM (as most rumors are pushing) then I see that as being the best one... Not to mention rumors of fewer cores at higher clock speeds.

No one writes code for 8 cores. It's just tedious.
 
No, they changed their name because they realized trying to grow a studio by making console games for big publishers is a losing proposition so they're chasing greener pastures in the mobile space, like most other independent developers.

makes sense, besides, their reputation among gamers was destroyed. good idea to start anew somewhere else.
 

aegies

Member
They did seem to hint toward that when I spoke to the Luminous team this past E3.

I mean, when I talked to them, they gave the standard "we didn't want to focus on one platform" line. But I don't see this as the generation where japanese developers suddenly start to code for the american platform or even primarily target it unless it's the only thing out there.
 
MS will never let go of their CoD DLC timed exclusivity until the bubble has burst. Sony has also had the BF DLC timed exclusivity for a while, which I assume Activision wouldn't be too happy about. Activision just doesn't seem like a company that Sony would be able to trust if they're trying to control leaks as much as possible, because Microsoft could just spend enough money to outdo them. They did a good job up until now, but MS is likely done with their silicon so it doesn't matter anymore.

It makes me wonder if Sony's going to make a hard push to secure some sort of exclusive content for Respawn's new series. That's going to be a very big deal whenever it's unveiled.
 

Massa

Member
It makes me wonder if Sony's going to make a hard push to secure some sort of exclusive content for Respawn's new series. That's going to be a very big deal whenever it's unveiled.

I think that's going to be all on EA. Sony or Microsoft would get weird looks from Activision if they got in bed with Respawn.
 

Kagari

Crystal Bearer
I mean, when I talked to them, they gave the standard "we didn't want to focus on one platform" line. But I don't see this as the generation where japanese developers suddenly start to code for the american platform or even primarily target it unless it's the only thing out there.

Probably not, no. And in the case of someone like SE, not as long as SCE continues their involvement.
 

aegies

Member
Bueno.



Well, if Sony is going with higher bandwidth RAM (as most rumors are pushing) then I see that as being the best one... Not to mention rumors of fewer cores at higher clock speeds.

No one writes code for 8 cores. It's just tedious.

At the start of this generation, no one wrote for two cores - see Carmack's statement about the 360 being better served by a single core P4 when the 360 launched. Now a quad-core processor is considered a must for PC games, which are being designed around almost 8 year old hardware. PC developers have been waiting for this. It's happening. There's no dispute that massively parallel computing is where CPUs are going.
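To put some (made up) code behind that: the shift being described is from one big serial game loop to farming independent jobs out to however many cores the machine has. A minimal Python sketch, purely illustrative — the workload and function names here are invented, not from any actual engine:

```python
# Minimal sketch of core-count-agnostic job parallelism (hypothetical workload).
import os
from concurrent.futures import ProcessPoolExecutor

def simulate_entity(entity_id: int) -> int:
    """Stand-in for an independent per-entity update (AI, physics, etc.)."""
    return sum(i * entity_id for i in range(10_000))

def update_all(entity_ids):
    # Pool size follows the hardware: 2, 4, or 8 cores, the code doesn't care.
    workers = os.cpu_count() or 1
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(simulate_entity, entity_ids))

if __name__ == "__main__":
    results = update_all(range(64))
    print(len(results), "entities updated on", os.cpu_count(), "cores")
```

The point is that once the work is expressed as independent jobs, "8 cores" stops being something you write for explicitly.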
 
Isn't it too late for secrets now? The system architecture should be set in stone, way too late to make any changes?

But Sony's strategy seems good to me.
 

NBtoaster

Member
Bueno.



Well, if Sony is going with higher bandwidth RAM (as most rumors are pushing) then I see that as being the best one... Not to mention rumors of fewer cores at higher clock speeds.

No one writes code for 8 cores. It's just tedious.

360 devs write code for 6 threads. PS3 devs write code for 6 SPUs. It's not that much more.
 

i-Lo

Member
They did seem to hint toward that when I spoke to the Luminous team this past E3.

With Luminous requiring 1.8GB of VRAM to run (at least as it was during E3), I am certain it was not being built around the PS4 when it was rumoured to have 2GB of GDDR5 overall. Given the Luminous engine had been in development for a year back in 2012, it makes me wonder whether the shift to 4GB (and now 8... but most assuredly DDR3) was talked about at length much earlier on. I refuse to believe they just had a change of heart a few months ago and decided to allegedly release two dev kits with 8 and 16GB RAM configs.

Btw, would you happen to know if Sony's chum EA was one of the third parties linked to helping Sony with the PS4's spec development?

The reason I'm not putting too much stock in Sony's first parties when it comes to aiding in development is that, despite their existence, the PS3 was spec'd the way it was, i.e. Cell processor and split memory pool (Skyrim, Skyrim). So how could they possibly aid in understanding the pains of third parties when all they've ever had to do is develop for the PS3...?
 

Gorillaz

Member
Sony and EA have always been BFFs for some reason. That's why I see them definitely talking to EA and even SE the most.

Also Respawn exclusivity has always been in the back of my head. Idk.....
 
The reason I'm not putting too much stock in Sony's first parties when it comes to aiding in development is that, despite their existence, the PS3 was spec'd the way it was, i.e. Cell processor and split memory pool (Skyrim, Skyrim). So how could they possibly aid in understanding the pains of third parties when all they've ever had to do is develop for the PS3...?

Haha, Ken Kutaragi listened to no one; both the PS2 and PS3 had terrible architectures. Krazy Ken just did whatever he wanted and devs had to suffer. The PS Vita and PS4 are much different.
 

RoboPlato

I'd be in the dick
I don't think that relationship is in very good shape. And there are other sony partnerships with publishers that have seen friction over the last six months.
Ah, hadn't heard about this. Still, I could see Sony trying to hold on to their BF stuff.
 

stryke

Member
The reason I'm not putting too much stock in Sony's first parties when it comes to aiding in development is that, despite their existence, the PS3 was spec'd the way it was, i.e. Cell processor and split memory pool (Skyrim, Skyrim). So how could they possibly aid in understanding the pains of third parties when all they've ever had to do is develop for the PS3...?

Look to how Vita came about and I think that gives an indication of Sony's attitude towards hardware development and the opinions of developers.
 

Kagari

Crystal Bearer
With Luminous requiring 1.8GB of VRAM to run (at least as it was during E3), I am certain it was not being built around the PS4 when it was rumoured to have 2GB of GDDR5 overall. Given the Luminous engine had been in development for a year back in 2012, it makes me wonder whether the shift to 4GB (and now 8... but most assuredly DDR3) was talked about at length much earlier on. I refuse to believe they just had a change of heart a few months ago and decided to allegedly release two dev kits with 8 and 16GB RAM configs.

Btw, would you happen to know if Sony's chum EA was one of the third parties linked to helping Sony with the PS4's spec development?

The reason I'm not putting too much stock in Sony's first parties when it comes to aiding in development is that, despite their existence, the PS3 was spec'd the way it was, i.e. Cell processor and split memory pool (Skyrim, Skyrim). So how could they possibly aid in understanding the pains of third parties when all they've ever had to do is develop for the PS3...?

Specific names weren't really given, but the way I imagine it was similar to how the Vita came about.
 

androvsky

Member
The reason I'm not putting too much stock in Sony's first parties when it comes to aiding in development is that, despite their existence, the PS3 was spec'd the way it was, i.e. Cell processor and split memory pool (Skyrim, Skyrim). So how could they possibly aid in understanding the pains of third parties when all they've ever had to do is develop for the PS3...?

I believe Sony execs have explained in detail the way consoles were designed at Sony before the Vita. Kutaragi got his engineers together, made some crazy fast hardware, and threw it at the developers. If the "deal with it" meme had been popular in 2005, there probably would have been one taped to the prototype kits.
 
I don't think that relationship is in very good shape. And there are other sony partnerships with publishers that have seen friction over the last six months.

Does that have to do with specs, or Sony keeping them in the dark about stuff?
Rather interesting that Sony seems to be having problems with 3rd parties that they seem close with.

It seems you're not hearing a lot of good things about Sony; first about specs, now this.
 

i-Lo

Member
I don't think that relationship is in very good shape. And there are other sony partnerships with publishers that have seen friction over the last six months.

What has led to this friction?

Look to how Vita came about and I think that gives an indication of Sony's attitude towards hardware development and the opinions of developers.

Specific names weren't really given, but the way I imagine it was similar to how the Vita came about.

I believe Sony execs have explained in detail the way consoles were designed at Sony before the Vita. Kutaragi got his engineers together, made some crazy fast hardware, and threw it at the developers. If the "deal with it" meme had been popular in 2005, there probably would have been one taped to the prototype kits.

So what you're saying is that the Vita showcases that the hardware side for consoles no longer operates in an arrogant vacuum. Be that as it may, afaik, the Vita's only true customization is stacking DDR chips (inevitable bespoke motherboard notwithstanding), with all else being off-the-shelf parts. From the rumours of Durango, they are heavily customizing their hardware to eke out the maximum attainable performance within the limited scope of the hardware without making life difficult for devs. Would the PS4, which is more macro, being a home console, abide by the Vita's philosophy of off-the-shelf parts being "taped" together (hyperbole, but you get the point) without any significant tweaking, to keep things amicable with (third-party) devs?
 

meta4

Junior Member
Does that have to do with specs, or Sony keeping them in the dark about stuff?
Rather interesting that Sony seems to be having problems with 3rd parties that they seem close with.

It seems you're not hearing a lot of good things about Sony; first about specs, now this.

Aegies is not going to have anything positive to hear or tell about Sony.
 

aegies

Member
Does that have to do with specs, or Sony keeping them in the dark about stuff?
Rather interesting that Sony seems to be having problems with 3rd parties that they seem close with.

It seems you're not hearing a lot of good things about Sony; first about specs, now this.

From the outside looking in, stuff inside sony seems kind of in upheaval right now. Granted, microsoft has been running on autopilot since the end of last year.

Again, I'm not hearing bad things about PS4 specs. I just hadn't heard much of anything, and what I did hear made it all seem very up in the air.
 

patsu

Member
So what you're saying is that the Vita showcases that the hardware side for consoles no longer operates in an arrogant vacuum. Be that as it may, afaik, the Vita's only true customization is stacking DDR chips (inevitable bespoke motherboard notwithstanding), with all else being off-the-shelf parts. From the rumours of Durango, they are heavily customizing their hardware to eke out the maximum attainable performance within the limited scope of the hardware without making life difficult for devs. Would the PS4, which is more macro, being a home console, abide by the Vita's philosophy of off-the-shelf parts being "taped" together (hyperbole, but you get the point) without any significant tweaking, to keep things amicable with (third-party) devs?

Sony also customized the Vita GPU together with Imagination Tech. That's why the GPU part number has an extra "+".

I seem to recall they optimized the setup for gaming (e.g., maximize throughput/bandwidth).
 
At the start of this generation, no one wrote for two cores - see Carmack's statement about the 360 being better served by a single core P4 when the 360 launched. Now a quad-core processor is considered a must for PC games, which are being designed around almost 8 year old hardware. PC developers have been waiting for this. It's happening. There's no dispute that massively parallel computing is where CPUs are going.

Of course. I mean, programming for the Cell was pretty much hated by Carmack AND Newell. But they love it now. The thing is, though, 6-8 cores of x86 isn't the same as 6-8 SPUs on Cell.

Let's not forget how slow it is; not only that, but Amdahl's law is in play. More cores don't increase performance unless processes can be broken down into extremely small tasks (at which point a GPGPU is more suitable). Otherwise, parallel processing is best with a relatively "low" number of cores.
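For anyone who wants to put numbers on the Amdahl's law point: speedup is 1 / ((1 - p) + p/n), where p is the parallelizable fraction of the work and n is the core count. A quick sketch (the fractions below are just illustrative, not console figures):

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
def amdahl_speedup(p: float, n: int) -> float:
    """p = parallelizable fraction of the workload, n = number of cores."""
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.9, 0.95):
    for n in (4, 8, 16):
        print(f"p={p:.2f} n={n:2d} -> {amdahl_speedup(p, n):.2f}x")

# With p = 0.5, eight cores only get you about 1.78x -- which is the point above:
# more cores don't help much unless the work really does break into parallel pieces.
```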
 

patsu

Member
It probably depends on the application area.

Mobile CPUs are not really going into the massively parallel realm; they are going to get more power efficient.

Server CPUs are going into more cores, and GPUs of course have even more cores due to the nature of their work.
 
It probably depends on the application area.

Mobile CPUs are not really going into the massively parallel realm; they are going to get more power efficient.

Server CPUs are going into more cores, and GPUs of course have even more cores due to the nature of their work.

Well of course, but I can't comprehend what type of job processes would benefit from their setup. I guess 2 cores would be for the OS and other multitasking processes and then 6 for gaming. But as the rumors have been saying, low clock speeds.
 

onQ123

Member
I hope the PS4 & Xbox Next support touch screen monitors, because I can see that being a big feature next gen, especially with all the tablets & new all-in-one PCs with touch controls, & with the Wii U having its own touch screen tablet,

Sony VAIO Tap 20


Lenovo's Horizon Tablet



but then again it might not be needed because of Smart Glass & Remote Play. Better yet, let the consoles serve high-end touch control games to your tablets & smartphones.



(starts to save up for 12" 4K tablet)
 
I hope the PS4 & Xbox Next support touch screen monitors, because I can see that being a big feature next gen, especially with all the tablets & new all-in-one PCs with touch controls, & with the Wii U having its own touch screen tablet,

Sony VAIO Tap 20


Lenovo's Horizon Tablet



but then again it might not be needed because of Smart Glass & Remote Play. Better yet, let the consoles serve high-end touch control games to your tablets & smartphones.



(starts to save up for 12" 4K tablet)
I hope Sony lets them use a touch interface via VAIOs and Vita (lots of V's there) to quickly control the UI.
 
Eric Mejdrich, Sr. Director of SOC Architecture and Principal Architect, Xbox at Microsoft

Some of the following have been discussed by Sony Game engine developers for the PS4. It's possible that some may show up in both consoles to allow software developers an easier port between platforms.

In any case I believe some of the following are the special sauce being discussed for the Xbox 3. The file dates for hardware features are all by Dec 2011 which is when SemiAccurate stated the Xbox 720 started tapeout.

http://www.faqs.org/patents/inventor/eric-o-mejdrich-2/

QOS software

20110285709 Allocating Resources Based On A Performance Statistic - A method includes rendering an object of a three dimensional image via a pixel shader based on a render context data structure associated with the object. The method includes measuring a performance statistic associated with rendering the object. The method also includes storing the performance statistic in the render context data structure associated with the object. The performance statistic is accessible to a host interface processor to determine whether to allocate a second pixel shader to render the object in a subsequent three-dimensional image. 11-24-2011

20110285710 Parallelized Ray Tracing - A method includes assigning a priority to a ray data structure of a plurality of ray data structures based on one or more priorities. The ray data structure includes properties of a ray to be traced from an illumination source in a three-dimensional image. The method includes identifying a portion of the three-dimensional image through which the ray passes. The method also includes identifying a slave processing element associated with the portion of the three-dimensional image. The method further includes sending the ray data structure to the slave processing element. 11-24-2011
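Roughly, that ray tracing one reads like a priority-queue dispatcher: each ray gets a priority, the image region it passes through identifies a worker, and the ray data is shipped to that worker. A toy sketch of that flow in Python — this is my own illustration of the idea, not the patent's implementation, and every name in it is invented:

```python
# Hypothetical sketch of priority-based ray dispatch to per-region workers.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Ray:
    priority: int                          # only this field is compared in the heap
    origin: tuple = field(compare=False)
    direction: tuple = field(compare=False)

def region_for(ray, grid_size=4):
    # Toy mapping from ray origin to a screen-space region, standing in for the
    # "portion of the three-dimensional image" the abstract talks about.
    x, y = ray.origin[0], ray.origin[1]
    return (int(x * grid_size) % grid_size, int(y * grid_size) % grid_size)

def dispatch(rays, workers):
    queue = list(rays)
    heapq.heapify(queue)                   # highest-priority rays come out first
    while queue:
        ray = heapq.heappop(queue)
        workers[region_for(ray)].append(ray)   # "send to the slave processing element"

workers = {(i, j): [] for i in range(4) for j in range(4)}
dispatch([Ray(1, (0.1, 0.2), (0, 0, 1)), Ray(0, (0.7, 0.9), (0, 0, 1))], workers)
```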

20110289485 Software Trace Collection and Analysis Utilizing Direct Interthread Communication On A Network On Chip - Collecting and analyzing trace data while in a software debug mode through direct interthread communication (‘DITC’) on a network on chip (‘NOC’), the NOC including integrated processor (‘IP’) blocks, routers, memory communications controllers, and network interface controllers, with each IP block adapted to a router through a memory communications controller and a network interface controller, where each memory communications controller controlling communications between an IP block and memory, and each network interface controller controlling inter-IP block communications through routers, including enabling the collection of software debug information in a selected set of IP blocks distributed through the NOC, each IP block within the selected set of IP blocks having a set of trace data; collecting software debugging information via the set of trace data; communicating the set of trace data to a destination repository; and analyzing the set of trace data at the destination repository. 11-24-2011

Software acceleration

20110316855 Parallelized Streaming Accelerated Data Structure Generation - A method includes receiving at a master processing element primitive data that includes properties of a primitive. The method includes partially traversing a spatial data structure that represents a three-dimensional image to identify an internal node of the spatial data structure. The internal node represents a portion of the three-dimensional image. The method also includes selecting a slave processing element from a plurality of slave processing elements. The selected processing element is associated with the internal node. The method further includes sending the primitive data to the selected slave processing element to traverse a portion of the spatial data structure to identify a leaf node of the spatial data structure. 12-29-2011

20110316864 MULTITHREADED SOFTWARE RENDERING PIPELINE WITH DYNAMIC PERFORMANCE-BASED REALLOCATION OF RASTER THREADS - A multithreaded rendering software pipeline architecture dynamically reallocates regions of an image space to raster threads based upon performance data collected by the raster threads. The reallocation of the regions typically includes resizing the regions assigned to particular raster threads and/or reassigning regions to different raster threads to better balance the relative workloads of the raster threads. 12-29-2011

20110317712 Recovering Data From A Plurality of Packets - A method includes receiving a plurality of packets at an integrated processor block of a network on a chip device. The plurality of packets includes a first packet that includes an indication of a start of data associated with a pixel shader application. The method includes recovering the data from the plurality of packets. The method also includes storing the recovered data in a dedicated packet collection memory within the network on the chip device. The method further includes retaining the data stored in the dedicated packet collection memory during an interruption event. Upon completion of the interruption event, the method includes copying packets stored in the dedicated packet collection memory prior to the interruption event to an inbox of the network on the chip device for processing. 12-29-2011

Hardware patent

20110320719 PROPAGATING SHARED STATE CHANGES TO MULTIPLE THREADS WITHIN A MULTITHREADED PROCESSING ENVIRONMENT - A circuit arrangement and method make state changes to shared state data in a highly multithreaded environment by propagating or streaming the changes to multiple parallel hardware threads of execution in the multithreaded environment using an on-chip communications network and without attempting to access any copy of the shared state data in a shared memory to which the parallel threads of execution are also coupled. Through the use of an on-chip communications network, changes to the shared state data may be communicated quickly and efficiently to multiple threads of execution, enabling those threads to locally update their local copies of the shared state. Furthermore, by avoiding attempts to access a shared memory, the interface to the shared memory is not overloaded with concurrent access attempts, thus preserving memory bandwidth for other activities and reducing memory latency. Particularly for larger shared states, propagating the changes, rather than an entire shared state, further improves performance by reducing the amount of data communicated over the on-chip communications network. 12-29-2011

20110320724 DMA-BASED ACCELERATION OF COMMAND PUSH BUFFER BETWEEN HOST AND TARGET DEVICES - Direct Memory Access (DMA) is used in connection with passing commands between a host device and a target device coupled via a push buffer. Commands passed to a push buffer by a host device may be accumulated by the host device prior to forwarding the commands to the push buffer, such that DMA may be used to collectively pass a block of commands to the push buffer. In addition, a host device may utilize DMA to pass command parameters for commands to a command buffer that is accessible by the target device but is separate from the push buffer, with the commands that are passed to the push buffer including pointers to the associated command parameters in the command buffer. 12-29-2011
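The push-buffer one is essentially command batching: the host accumulates commands locally and flushes a whole block at once instead of poking the device per command, with bulky parameters living in a separate buffer that the commands merely point into. A rough sketch of that pattern — purely illustrative, nothing here is Microsoft's actual interface, and the class and method names are made up:

```python
# Illustrative command push buffer with batched flushes and out-of-band parameters.
class PushBuffer:
    def __init__(self, flush_threshold=16):
        self.pending = []          # commands accumulated on the host side
        self.command_buffer = []   # stand-in for the separate parameter buffer
        self.device_queue = []     # stand-in for the target device's push buffer
        self.flush_threshold = flush_threshold

    def push(self, opcode, params=None):
        param_index = None
        if params is not None:
            param_index = len(self.command_buffer)   # command stores a pointer, not the data
            self.command_buffer.append(params)
        self.pending.append((opcode, param_index))
        if len(self.pending) >= self.flush_threshold:
            self.flush()

    def flush(self):
        # In the real thing this block transfer would be a single DMA; here it's a list extend.
        self.device_queue.extend(self.pending)
        self.pending.clear()

pb = PushBuffer(flush_threshold=4)
for i in range(10):
    pb.push("DRAW", params={"vertex_count": 3 * (i + 1)})
pb.flush()
print(len(pb.device_queue), "commands delivered,", len(pb.command_buffer), "parameter blocks")
```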

20110321049 Programmable Integrated Processor Blocks - An integrated processor block of the network on a chip is programmable to perform a first function. The integrated processor block includes an inbox to receive incoming packets from other integrated processor blocks of a network on a chip, an outbox to send outgoing packets to the other integrated processor blocks, an on-chip memory, and a memory management unit to enable access to the on-chip memory. 12-29-2011

The following is software and was discussed by a Sony game engine developer.

20120176364 REUSE OF STATIC IMAGE DATA FROM PRIOR IMAGE FRAMES TO REDUCE RASTERIZATION REQUIREMENTS - An apparatus, program product and method reuse static image data generated during rasterization of static geometry to reduce the processing overhead associated with rasterizing subsequent image frames. In particular, static image data generated one frame may be reused in a subsequent image frame such that the subsequent image frame is generated without having to re-rasterize the static geometry from the scene, i.e., with only the dynamic geometry rasterized. The resulting image frame includes dynamic image data generated as a result of rasterizing the dynamic geometry during that image frame, and static image data generated as a result of rasterizing the static image data during a prior image frame. 07-12-2012
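That one is basically the static/dynamic split a lot of renderers use: rasterize the unchanging geometry once, cache the result, and re-composite only the dynamic stuff each frame. A toy sketch of the idea — again my own illustration with invented names, not anything engine-specific:

```python
# Toy sketch of reusing a cached static layer across frames.
def rasterize(geometry):
    # Stand-in for an expensive rasterization pass.
    return {f"pixels_of_{g}" for g in geometry}

class FrameRenderer:
    def __init__(self, static_geometry):
        self.static_geometry = static_geometry
        self._static_layer = None          # cached result from a prior frame

    def render(self, dynamic_geometry):
        if self._static_layer is None:     # rasterize the static geometry only once
            self._static_layer = rasterize(self.static_geometry)
        dynamic_layer = rasterize(dynamic_geometry)
        return self._static_layer | dynamic_layer   # composite static + dynamic

r = FrameRenderer(static_geometry=["terrain", "buildings"])
frame1 = r.render(["player", "npc_0"])
frame2 = r.render(["player", "npc_1"])   # static layer reused, not re-rasterized
```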

Hardware feature

20110320771 INSTRUCTION UNIT WITH INSTRUCTION BUFFER PIPELINE BYPASS - A circuit arrangement and method selectively bypass an instruction buffer for selected instructions so that bypassed instructions can be dispatched without having to first pass through the instruction buffer. Thus, for example, in the case that an instruction buffer is partially or completely flushed as a result of an instruction redirect (e.g., due to a branch mispredict), instructions can be forwarded to subsequent stages in an instruction unit and/or to one or more execution units without the latency associated with passing through the instruction buffer. 12-29-2011

The following, I think, is the thread fabric control linking multiple IP blocks (CPUs and more). It's similar to what supercomputers use.

20120192202 Context Switching On A Network On Chip - A network on chip (NOC) that includes IP blocks, routers, memory communications controllers, and network interface controllers, each IP block adapted to the network by an application messaging interconnect including an inbox and an outbox, one or more of the IP blocks including computer processors supporting a plurality of threads, the NOC also including an inbox and outbox controller configured to set pointers to the inbox and outbox, respectively, that identify valid message data for a current thread; and software running in the current thread that, upon a context switch to a new thread, is configured to: save the pointer values for the current thread, and reset the pointer values to identify valid message data for the new thread, where the inbox and outbox controller are further configured to retain the valid message data for the current thread in the boxes until context switches again to the current thread. 07-26-2012

If you follow the LinkedIn profile for Eric Mejdrich, he worked for IBM till 2010 on NOC and then moved to Microsoft. NOC would be the thread-routing "fabric" of a supercomputer (it would also support distributed computing). Is the NOC being used an IBM IP?

I'm starting to come to the conclusion that Charlie was accurate in his Oban article: older, well-understood 32nm SOI and IP from multiple disciplines to accelerate a smaller GPU. The smaller GPU makes yield easier. I wonder at the DX 11.5 mentioned for the PS4. At the time that was posted, DX was further along than OpenGL, and the current DirectX standard is only 11.1. Could DX 11.5 be anticipating features coming in both consoles?
 

Mario007

Member
For obvious reasons (Monster Hunter) one of them is Capcom for sure, and the other one is likely Activision because of COD: Declassified. Platinum could be one of them too due to Bayonetta 2.
Well that's not really friction between developers and Sony though.

MH is a case of Nintendo offering hardware that is only marginally better than the PSP and Capcom has pulled a similar thing when it didn't produce MH3 for PS3 but went with Wii instead as they could just recycle assets (this is also the reason why there won't ever be MH on Vita).
Capcom still makes most money on the PS3 so I doubt they'll want to alienate Sony and their new console.

Activision have themselves to blame for COD:D. They gave the project to a shitty developer at the start and then to a less shitty one to make a deadline. I mean they had at least 2 years to make a COD for Vita, they blew it. All Sony did was market it like crazy, knowing it was a shit game. If anything Sony would want to stay away from Activision. Then again when you see all the Skylanders ads being PS3 ads, there is some sort of a partnership there.

Platinum are just weird, cocky and arrogant, though they do make very good games. They are now with Nintendo as they give them funding, but even then MGR is being developed on PS3 and ported to 360, despite Bayo being superior on the 360.
 

Ashes

Banned
Quick CPU question about Orbis. If it's a quadcore does that mean it's likely using Steamroller?

AMD have time and again declared that Jaguar is for low-power solutions. But after CES I'm pretty sure whatever is under the hood in those chips is very, very efficient. Demonstrably so.

I wonder what would happen were those to be overclocked to console standards. From a couple of watts to 50 watts or 100 watts.

I know I've not answered your question, but it's going to be pretty interesting.

I'm sticking to Steamroller though.
 

Sid

Member
Well that's not really friction between developers and Sony though.

MH is a case of Nintendo offering hardware that is only marginally better than the PSP and Capcom has pulled a similar thing when it didn't produce MH3 for PS3 but went with Wii instead as they could just recycle assets (this is also the reason why there won't ever be MH on Vita).
Capcom still makes most money on the PS3 so I doubt they'll want to alienate Sony and their new console.

Activision have themselves to blame for COD:D. They gave the project to a shitty developer at the start and then to a less shitty one to make a deadline. I mean they had at least 2 years to make a COD for Vita, they blew it. All Sony did was market it like crazy, knowing it was a shit game. If anything Sony would want to stay away from Activision. Then again when you see all the Skylanders ads being PS3 ads, there is some sort of a partnership there.

Platinum are just weird, cocky and arrogant, though they do make very good games. They are now with Nintendo as they give them funding, but even then MGR is being developed on PS3 and ported to 360, despite Bayo being superior on the 360.
-> Publishers, not developers. I included Platinum just because Bayonetta 2 seems like a major reason for Sony to get upset, considering the first one sold over a million on the PS3.

-> Sony needed MH for Vita and Capcom had been teasing it since its reveal, then suddenly Nintendo happened.

-> Shitty treatment of COD on Vita, hence the strain in the relationship.

These relations aren't finished by a long shot though.
 