See, I do not think 4k is high enough to achieve a good IQ (especially in games that lack a good TAA implementation, in which case they have awful IQ even at 5k). Both 1440p and 1080p are so far below 4k (a resolution I do not consider as high at all); trying to find a difference between the two is basically splitting hairs. If a game has a lot of shimmering/flickering/pixel crawling at 1080p, rest assured that it will also have a lot of shimmering/flickering/pixel crawling at 1440p. 4k is the bare minimum where you start to see a difference.
You should perhaps learn the difference between the words "good" and "perfect".
4K is more than enough for good image quality. Without good anti-aliasing it is certainly not perfect, though.
Bad image quality is a lack of anti-aliasing, poor texture filtering, sub-native resolution rendering etc.
Just because the image is not entirely free of all forms of aliasing and does not look like an offline render does not mean that it has bad image quality - only that it does not have perfect image quality.
There is still a big difference between 1080p and 1440p, with the latter being almost double the resolution.
Just because it is nearly double the resolution and not 4x or 16x does not mean that there is hardly any difference. Doubling the resolution is a big difference.
It's like saying there is no difference between 720p and 1080p. Or 480p and 720p.
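To put rough numbers on those steps, here is a quick pixel-count comparison - just a minimal sketch in Python, and it assumes the usual 16:9 variant of each mode (including 854x480 for widescreen "480p"), not every panel out there:

# Rough pixel-count comparison for the resolutions mentioned above.
# Assumes the common 16:9 variants of each mode (854x480 for "480p").
resolutions = {
    "480p":  (854, 480),
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

for low, high in [("480p", "720p"), ("720p", "1080p"),
                  ("1080p", "1440p"), ("1440p", "4K")]:
    ratio = pixels[high] / pixels[low]
    print(f"{high} has {ratio:.2f}x the pixels of {low}")

That works out to roughly 1.78x the pixels going from 1080p to 1440p, and roughly 2.25x for each of the other steps - the same ballpark jump as 720p to 1080p.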
I'm not saying that 4K or 8K are worthless; I hope that it won't be long before those resolutions are actually playable on a single GPU.
However, I still don't consider a 1080 Ti to be enough for single-GPU 4K gaming today, because I want to keep framerates above 60 FPS.
I'm really curious to know what kind of display you're using, since you're so dogmatic about this.
Based on your comments, I assume it's the smallest 4K panel you could find.
And image quality is not just about resolution:
https://timothylottes.github.io/20161114.html