Maybe the 802.11ac Wi-Fi support has more to do with local LAN multiplayer. I think for a game like Splatoon, with (apparently) 10 players locally in a single game, the bandwidth requirements might get fairly high. Also, you're pushing over 10x more pixels per Switch than you were per 3DS, though I don't know if that has much of an effect on Wi-Fi bandwidth requirements.
Resolution or graphics quality shouldn't really have any effect on bandwidth requirements in this scenario; all you need to transmit are the locations, velocities and so forth of movable in-game objects (i.e. players and physics objects). To give an example, the bandwidth requirements for multiplayer MK8D shouldn't be all that much higher than they were for MK7, which managed 8-player ad-hoc wireless multiplayer just fine on a 54Mb/s connection.
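To put rough numbers on that, here's a back-of-envelope sketch. All the figures (tick rate, packet size) are hypothetical placeholders, not actual Splatoon or Mario Kart values — the point is just the order of magnitude:

```python
# Hypothetical local-multiplayer bandwidth estimate: each player broadcasts a
# small state packet (position, velocity, orientation, input flags) at a fixed
# tick rate, and receives the same from every other player.

PLAYERS = 10       # players in one local session
TICK_RATE = 60     # state updates per second (assumed)
STATE_BYTES = 64   # per-player packet: a few floats plus flags (assumed)

# Bytes/s each player receives from the other nine players.
per_player_rx = (PLAYERS - 1) * STATE_BYTES * TICK_RATE

# Total traffic over the air if every player broadcasts to every other.
total_air_traffic = PLAYERS * per_player_rx  # bytes/s

mbps = total_air_traffic * 8 / 1_000_000
print(f"{mbps:.2f} Mb/s")  # → 2.76 Mb/s
```

Even with generous protocol overhead on top, that's a tiny fraction of a 54Mb/s link, let alone 802.11ac rates — which is why resolution and graphics quality don't factor into it.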
The Wi-Fi solution they're using is massive overkill for any local multiplayer scenario. It's possible that they want to enable really fast game downloads for people with high-speed internet connections, but MicroSD write speeds would probably be a bigger bottleneck than the wireless interface by that point.
And regarding the July devkits, if they were labeled as 2GHz max and 1GHz max for the CPU and GPU on the hardware side, and TBD on the "for applications" side, how would developers know what the max for their games should be? Would the devkits be incapable of actually reaching those max clock speeds? I guess that would be hard for us to know without hearing from a developer who used those devkits back then.
Having to deal with varying clock speeds would be pretty common for developers that far before the launch of new hardware, I would think. I would imagine that, if clock speeds are changing on a relatively regular basis, the SDK or firmware would indicate the clock speed, rather than having it in printed material (which would then go quickly out of date).
Anyway, if the 4K SCD dock is a thing (which I'm still unsure about), then I don't think 3x A57s at 1GHz would be a very suitable CPU for running 4K applications. I don't know exactly how CPU requirements scale with GPU power, but I would imagine that for 4K you'd need something a bit closer to what the PS4 Pro and Scorpio will have (which is essentially what the PS4 and XB1 already have).
In theory, CPU requirements shouldn't scale at all with resolution. The only difference on the CPU front is sending a "render to a 3840*2160 target" command rather than a "render to a 1920*1080 target" command. If you were designing a console to render at that resolution from the start, you might want a more powerful CPU to, for example, calculate more detailed physics simulations (e.g. use a denser mesh for cloth simulation, as it's more visible), but if your intent was purely to play existing games at a higher resolution, then you wouldn't need a CPU any more powerful than what your existing device already has. That's why Switch doesn't (as far as we know) clock up the CPU in docked mode: the higher resolution doesn't change the CPU workload.
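A toy illustration of that point (the command names are made up for the sketch, not any real graphics API): the CPU records the same command stream either way, with only the render-target size changing, while the GPU's per-pixel work scales with the pixel count.

```python
# Sketch: resolution changes GPU work, not CPU work. The CPU-side frame is the
# same list of commands; only the render-target dimensions differ.

def frame_commands(width, height):
    # Hypothetical per-frame command list recorded by the CPU.
    return [("set_render_target", width, height),
            ("draw_world",),
            ("draw_players",),
            ("present",)]

cmds_1080p = frame_commands(1920, 1080)
cmds_4k = frame_commands(3840, 2160)

# CPU-side work (commands recorded per frame) is identical...
assert len(cmds_1080p) == len(cmds_4k)

# ...but the GPU shades 4x the pixels at 4K.
print((3840 * 2160) / (1920 * 1080))  # → 4.0
```

That 4x factor falls entirely on the GPU, which is why docked mode raises the GPU clock but (as far as we know) not the CPU clock.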
That said, if the 200mm² chip is merely a stand-in (as I suspect it is), then the final device could use an SoC which also includes, say, an octo-core A73 CPU, or some number of the new cores Nvidia are designing for Xavier. If what they're designing is intended to be able to operate on its own then it needs a CPU, and there's nothing stopping them including a CPU as powerful as they deem necessary.
I'll just say this: something is off. No way is this the console that devs are saying Nintendo worked with third parties on developing. We still haven't heard any major negative tone from developers about the specs. Something is missing. No way is a stock X1, or something even worse, in the Switch while it's also a developer's dream, running full Unreal 4 games within a week. Something is missing... what is the most "logical" answer? My personal opinion, like I said before, is that whatever they put in the Switch is nothing to write home about, but it does its job. It can get ports of XB1 and PS4 games without much hassle.
I honestly don't think most developers are quite as concerned with all-out performance as many people tend to assume. Once you've got a relatively sensibly designed architecture without any major bottlenecks I'd say their main concern is how good the development environment, tools and support are, and by all accounts Nintendo has improved on those fronts massively with Switch.