Question about frame rates
I know next to nothing about this, but recently I was able to listen in on a discussion between people much more knowledgeable than I am about the difficulties of maintaining very high framerates for VR. For the purpose of this question, let's say we're rendering at a resolution of 1080p per eye, and we want a frame rate that averages somewhere between 90 and 120 fps, ideally with a minimum frame rate of 90 fps.
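To put those targets in time terms, here's a minimal back-of-envelope sketch (my own illustration, not from the discussion I overheard) converting frame rates into the per-frame time budget a renderer has to hit:

```python
# Convert a target frame rate into the time budget available per frame.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at the given rate."""
    return 1000.0 / fps

for fps in (60, 90, 120):
    print(f"{fps:3d} fps -> {frame_budget_ms(fps):.2f} ms per frame")
# 60 fps leaves 16.67 ms; 90 fps leaves 11.11 ms; 120 fps leaves 8.33 ms.
```

So moving from a typical 60 fps target to a 90 fps minimum cuts the budget by a third, before any VR-specific overhead.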
Obviously tradeoffs are necessary in terms of geometric complexity, lighting, shading, etc., but sometimes I get the feeling that part of the difficulty with VR may stem from developers and consumers setting the graphical bar a bit too high. This is where I'd appreciate it if GAF could help educate me on the matter.
What's bothering me is that I seem to remember computers from well over a decade ago achieving ridiculous framerates (300+ fps when uncapped) running Quake 3 or similar titles. Yes, the resolutions back then were no more than roughly 1024x768, whereas two 1080p eye buffers add up to 3840x1080, about five times the pixels per frame. Yes, Quake 3-level graphical complexity would be totally unacceptable even for a mobile game today. But if I extrapolated that linearly to account for the drastic improvements in performance since then, I could conclude that a machine today should be able to run a version of Quake 3 properly optimized for current hardware, at that doubled-up resolution, at an insane speed, somewhere between 500 and 1000 fps, or 1-2 milliseconds per frame. (Which sounds rather ridiculous, I know.)
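The raw pixel arithmetic behind that extrapolation can be sketched as follows. This is a deliberately naive model, my own illustration: it assumes the old benchmarks ran at 1024x768 (an assumption; actual resolutions varied) and that rendering cost scales linearly with pixel count, ignoring shading complexity, memory bandwidth, CPU limits, and everything else that actually matters:

```python
# Naive pixel-throughput comparison: Quake 3 at 300 fps then,
# versus two 1080p eye buffers at 90 fps now.
old_pixels = 1024 * 768        # one frame at an assumed 1024x768
vr_pixels = 2 * 1920 * 1080    # two 1080p eye buffers per frame

old_throughput = 300 * old_pixels  # pixels per second at 300 fps
vr_throughput = 90 * vr_pixels     # pixels per second needed at 90 fps

ratio = vr_throughput / old_throughput
print(f"VR target needs {ratio:.2f}x the raw pixel throughput")
# ~1.58x: under this crude model, the 90 fps VR target asks for only
# about 58% more pixels per second than Quake 3 at 300 fps already pushed.
```

Of course, per-pixel cost has grown enormously since then, which is exactly the kind of thing I'm asking about below.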
I remember a talk John Carmack gave a while ago (before he joined Oculus) in which he said that even Doom 3 felt amazing in VR, and that the most complex graphics aren't necessary to feel immersion and presence. And with VR adapters coming to smartphones, it would appear others in the industry agree.
So basically my question is this: Has there been some fundamental change in rendering pipelines over the last 5-10 years that has made it that much more difficult to achieve such high framerates without significantly cutting back on important graphical features? As in, maybe many popular graphical features today are much more expensive than popular graphical features were a decade ago?
Or perhaps the bottleneck isn't in the graphics department but in other areas such as game logic, AI, I/O, etc.? Because I've never heard of a PC running Crysis at anywhere near the framerates people were getting in Quake 3 just 3-4 years after Quake 3's release, and Crysis came out 8 years ago.
Please help me understand, GAF. Your assistance is appreciated!