I'd take sparse rendering/checkerboarding at 4K over native 4K any day of the week, considering the small impact on IQ and of course the large cost of shading all 8+ million pixels.
Now, sure, if I owned two GTX 1080s and a 4K monitor I'd pick full rendering, but I only own one.
My monitor is 3440x1440, ultrawide, and not quite as pixel-dense as a full 4K TV (thank goodness, because at 4K it's hard to hit ultra settings at 60 FPS in some games even on a 1080).
Watch Dogs 2 with sparse rendering turned on looks nearly identical to running without it on my PC. The main issues are barely noticeable artifacts, especially around edges/shadows when the camera is moving, and a small loss of sharpness. But it cuts down the shader load on my GPU enough that I can run the game at near-ultra settings at 60 FPS.
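To illustrate why those motion artifacts happen, here's a toy sketch of the general checkerboard idea (my own simplification, not Ubisoft's actual implementation): each frame only shades one half of an alternating pattern of 2x2 pixel quads, and the other half is reconstructed from the previous frame — which is exactly the data that goes stale around moving edges.

```python
# Toy checkerboard pattern: which pixels get freshly shaded this frame?
# (A sketch of the general technique, not any engine's real implementation.)
def shaded_this_frame(x, y, frame):
    # 2x2 pixel quads alternate in a checker pattern; the pattern
    # flips every frame so both halves get refreshed over two frames.
    return ((x // 2) + (y // 2) + frame) % 2 == 0

# On any single frame, exactly half the screen is freshly shaded; the
# rest is filled in from the previous frame. Stale reprojected pixels
# around moving edges are where the visible artifacts come from.
frame0 = sum(shaded_this_frame(x, y, 0) for x in range(8) for y in range(8))
frame1 = sum(shaded_this_frame(x, y, 1) for x in range(8) for y in range(8))
print(frame0, frame1)  # 32 32 — half of the 64-pixel toy grid each frame
```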
When you're talking about sitting 6+ feet away from a TV, I really don't see the point of shading every one of a 4K frame's pixels every frame. Odds are, aside from the artifacts, that you wouldn't notice the difference in softness unless you had a really big TV or were sitting very close (or had very, very good eyesight - lucky bastard).
In pretty much any PC game that offers it, where my GPU can't hold 60 FPS at max settings, I'll be turning it on.
I think the confusion over image quality on the PRO and sparse rendering/checkerboarding stems from MANY Pro games using the technique while NOT actually targeting a 4K frame. Many are running at 1800p, or lower, or using dynamic resolutions on top of sparse rendering.
This is going to significantly soften the frame, as you are losing image detail, and in some instances it might make artifacts more pronounced. You have to remember that the main performance boost of checkerboarding is a lower requirement on the GPU compute/shaders (IIRC), but the GPU still needs to do a lot of other things. That 4.2 TF GPU has its limitations.
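The back-of-the-envelope pixel math makes the point. Assuming checkerboarding shades roughly half the pixels per frame (the common simplification), a 1800p checkerboarded frame is shading about a third of the pixels a native 4K frame would:

```python
# Rough per-frame shaded-pixel counts; my own arithmetic, assuming the
# simple "checkerboard shades half the pixels" model.
def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)        # full 4K: every pixel shaded
cb_4k = pixels(3840, 2160) // 2       # 4K checkerboard: half per frame
cb_1800p = pixels(3200, 1800) // 2    # many Pro games: 1800p + checkerboard

print(f"native 4K:          {native_4k:,}")   # 8,294,400
print(f"4K checkerboard:    {cb_4k:,}")       # 4,147,200
print(f"1800p checkerboard: {cb_1800p:,}")    # 2,880,000
```

So a Pro title at 1800p checkerboarded is shading ~2.9M pixels per frame versus ~8.3M for native 4K — a big win for a modest GPU, but also why those frames look noticeably softer than true 4K.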