It means everything, especially when you're discussing two SKUs aiming at two different resolutions - a concept that you and many others here don't seem to grasp. Rendering the same graphics at 4K instead of FHD requires 2-3 times the computing power, depending on how advanced the game is, so OBVIOUSLY the console that aims for 4K has to have a correspondingly more powerful GPU to render the same graphics. And vice versa - the one that aims for 1080p doesn't need to be god knows how powerful. What do you expect - that the base model would have a 12TF GPU and the 4K one ~30TF? Not going to happen; the technology isn't there and won't be anytime soon. I know there are many PC guys lurking in the console threads who would like to get $2-3k PC specs inside a $400 box, but sorry, we can't have a serious discussion if some of you have set yourselves such unrealistic expectations.
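To make the resolution math concrete, here's a quick sketch. The raw pixel counts give a 4x ratio, but the post's 2-3x compute figure is a reasonable assumption because not every part of a frame (geometry, simulation, etc.) scales with pixel count:

```python
# Pixel counts for the two target resolutions
fhd = 1920 * 1080   # "FHD" / 1080p
uhd = 3840 * 2160   # "4K" / 2160p

# Raw pixel ratio - the theoretical worst case for pixel-bound work
print(uhd / fhd)  # 4.0

# In practice the cost is lower (the post's 2-3x), since only
# pixel-dependent work scales with resolution.
```

So a 4K-targeting SKU needs roughly 2-3x the GPU of its 1080p sibling just to stand still visually, which is exactly the split being argued here.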
I'm personally not even sure we'll see 12TF in the most expensive next-gen console. In the previous/current generation, AMD's most advanced GPU was the ~13.5TF Vega, and the most powerful console didn't even get half of that (6TF in the X1X) - and that was the biggest Polaris GPU released at the time; even PC didn't get a Polaris GPU with that many cores until the 590 arrived later. It's no different this time around: the Radeon VII offers ~14TF, AMD themselves have already stated it will remain their flagship GPU for a while, and Navi is indeed aimed at the low-to-mid-end segment. So the question is: how much less powerful will Navi be compared to the Radeon VII? Because 12TF next to that big-ass, hot, power-hungry GPU sounds like a wet dream to me. So I'm very skeptical about next-gen console GPU power - at the end of the day Navi will still be GCN-based (the GPU after Navi is the one labeled "next-gen" on the roadmap), and there's not much more juice that can be squeezed out of the GCN architecture.
PS. People need to stop with those "X times" comparisons already. It used to be a good marketing/PR line back in the day, but it simply doesn't hold up anymore. Sure, going from 8GB of RAM to 16GB would be "just" 2x, compared to, say, the 16x jump from the PS2's 32MB to the PS3's 512MB - but I'm more than certain developers would gladly take an additional 8GB with open arms, as opposed to less than half a gig more. Same for the GPU: 3x sounds like a tiny jump, but in plain numbers that's an extra ~2.7TF of computing power to play with at the same 1080p resolution. To put it in perspective: PS3 to PS4 was a ~1.6TF jump, some of that power had to go towards bumping the resolution from 600-720p to FHD, and we all know how great the games look.
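The "multipliers vs. absolute gains" point can be sketched as plain arithmetic. The ~1.35TF baseline below is an assumption (roughly an Xbox One-class GPU) chosen so that a 3x jump yields the post's ~2.7TF of extra compute; it's an illustration, not a confirmed spec:

```python
# Multiplier vs. absolute gain, using the RAM figures quoted above
ps2_ram_mb, ps3_ram_mb = 32, 512
print(ps3_ram_mb // ps2_ram_mb)        # 16 (x), yet under half a GB gained
print(16 - 8)                          # 8 GB gained from a "mere" 2x jump

# GPU side: assumed ~1.35TF baseline (hypothetical, Xbox One-class)
base_tf = 1.35
print(round(base_tf * 3 - base_tf, 1)) # 2.7 extra TF from a "small" 3x jump
```

The takeaway is the same as the post's: as the absolute numbers grow, a shrinking multiplier can still mean a much bigger real-world gain than the headline-grabbing multipliers of older generations.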
So yeah, 4TF vs 12TF seems like a fair and balanced split between FHD and 4K. I'm not so sure about 12 vs 16GB of RAM though - unless the bigger model has a few additional GB of low-performance RAM reserved for the OS, OR Navi uses HBCC technology, which can decrease VRAM usage dramatically. I could see that happening.