
AMD's Raja Koduri says that we need 16,000x16,000 at 240Hz for "true immersion" in VR

http://www.overclock3d.net/articles..._need_16k_at_240hz_for_true_immersion_in_vr/1

AMD's Raja Koduri, the head of the Radeon Technologies Group, has said for a long time that VR is going to be a driving force behind advancements in GPU performance for many years to come, and that he won't be happy until we can run games at 16K at 240Hz within his lifetime, saying that this would be the point where we achieve "true immersion that you won't be able to tell apart from the real-world".

When I set the goal, I said, "We need to get here in our lifetime." We can't do that with Moore's law and hardware alone. We have to unleash software on this problem. We've been working with developers on all of these ideas. How can we get 16K by 16K displays refreshing at 240Hz with the picture that you want to draw? Developers want more control, on their side. They want console level GPU access on the PC.

What they've been able to achieve on consoles in the current generation, versus the current high-end PC... The current high-end PC specs are at least four to eight times faster than current consoles. The Fury X is an eight teraflop machine. The PS4 is a two teraflop machine. It's four times more compute in that single Fury. You can build a dual Fury PC. But PC doesn't give you that much better an experience with cutting edge content, because they can extract more performance from a console. They're also investing a lot of IP into that architecture. They're doing some really clever things that are not possible on the PC yet.

Raja Koduri says that we will have to rely on a lot more than Moore's Law in order to get into the "VR Era", where resolutions and refresh rates will be much higher than those of today's Oculus Rift or HTC Vive headsets. Large and intelligent changes in both GPU and software architectures will be the key to driving up performance while lowering power consumption, which is exactly what AMD's Radeon Technologies Group is working towards.

Really neat article. I hope HBM2 can help achieve what developers are looking for - it'd be a wonderful sight to see these modern GPUs pushed to their theoretical limits in games.

The thought of a 16K x 16K display, let alone a VR headset, running at 240Hz... makes me shiver thinking about the pure cost of that.

overheat if old
 

ekim

Member
If VR really takes off, then the current HMDs are to us what 3D graphics were in 1996. Cool and all, but in 20 years...
 

Malygos

Member
Where is he getting 240Hz from? I thought Oculus did psychological studies that found ~88Hz was where people stopped unconsciously noticing flickering. 16K x 16K per eye sounds about right though for "true" VR.
 

Pif

Banned
Also, wearing huge nerdy bricks on your face is not cool.

Can't wait until stylish, slim VR headset options get developed so there's social acceptance for wearing them in public.
 

Snafu

Neo Member
Palmer Luckey mentioned 8K per eye two years ago.
http://arstechnica.com/gaming/2013/...esolution-per-eye-isnt-enough-for-perfect-vr/

"To get to the point where you can't see pixels, I think some of the speculation is you need about 8K per eye in our current field of view [for the Rift]," he said. "And to get to the point where you couldn't see any more improvements, you'd need several times that. It sounds ridiculous, but HDTVs have been out there for maybe a decade in the consumer space, and now we're having phones and tablets that are past the resolution of those TVs. So if you go 10 years from now, 8K in a [head-mounted display] does not seem ridiculous at all."

This is nothing new. AMD showed this slide at the Fiji launch event last year. Replace "tomorrow" with "20 years".
 
10 years from now I'll bump this thread from my VR headset and post L-O-L 16K

It's all about 24k

It's a lot further out than 10 years. 4K isn't standard yet (if it ever will be), and then there's 8K still out there. For gaming, we're probably another GPU generation out (the one after this upcoming generation, that is) before a single GPU can handle 4K at 60FPS with max settings (and at a reasonable price it might take even longer), let alone at 120, 144 or 240Hz.
 

Locuza

Member
AMD is in for a rude awakening when they get on Intarwebz for the first time and learn about foveated rendering.
As if it would be something new for them.

I've heard about this before, but forgot to follow up. What's the quick and dirty explanation?
You render the point the human eye is looking at in very high resolution, and decrease the resolution as you move away from that focus point. The viewer doesn't notice, and you don't have to render the whole scene at native resolution, which obviously saves a ton of processing resources.
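A toy Python sketch of how the pixel budget works out (the ring sizes and scale factors here are made-up numbers for illustration, not from any shipping implementation):

# Toy model of foveated rendering's pixel budget.
def render_scale(angle_from_gaze_deg):
    # Fraction of native resolution to shade, by angular distance
    # from the tracked gaze point (illustrative thresholds).
    if angle_from_gaze_deg < 5:      # foveal ring: full detail
        return 1.0
    elif angle_from_gaze_deg < 20:   # near periphery: half res per axis
        return 0.5
    else:                            # far periphery: quarter res per axis
        return 0.25

# (area share of screen, representative angle from gaze) per ring -- assumed.
rings = [(0.01, 2), (0.15, 10), (0.84, 40)]
shaded = sum(area * render_scale(angle) ** 2 for area, angle in rings)
print(f"pixels shaded: {shaded:.0%} of native")   # ~10% in this toy model

Even with rough numbers like these you end up shading about a tenth of the pixels, which is why eye tracking matters so much for those 16K targets.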
 

DieH@rd

Banned
16K by 16K was the same figure Michael Abrash mentioned years ago when he talked about what our eyes can actually perceive. As for framerate, he said that the low-persistence flicker rate will have to increase with the resolution bump, so 240Hz does not surprise me.

AMD is in for a rude awakening when they get on Intarwebz for the first time and learn about foveated rendering.

Why would they be worried about this? Foveated rendering will work on all rendering hardware as soon as eye-tracking sensors are introduced to VR headsets.
 

PGamer

fucking juniors
Where is he getting 240Hz from? I thought Oculus did psychological studies that found ~88Hz was where people stopped unconsciously noticing flickering. 16K x 16K per eye sounds about right though for "true" VR.

Just because people don't notice flickering anymore doesn't mean there isn't still room for improvement otherwise. 90 Hz isn't the end game for frame rates.

I've heard about this before, but forgot to follow up. What's the quick and dirty explanation?

More or less, the image is only rendered at full resolution where the user is looking, and the resolution is lowered as you move farther from that point. Since the eye only sees full detail where it is focused, not in its peripheral vision, you can save processing power without perceivable quality loss.
 

Kieli

Member
AMD is in for a rude awakening when they get on Intarwebz for the first time and learn about foveated rendering.

I'm sure several of the brightest computer scientists and engineers with innumerable PhDs and thousands of years of experience among them do not know the concept of foveated rendering.
 

rambis

Banned
Übermatik said:
Everyone's going crazy but I don't think it's far-fetched to say we'll see this level of tech in 8 years.
Not a chance if we're talking single GPU. If we consider a 980 Ti as sufficient for 4K at 60Hz (hint: it's not), then we would need at least 16x the power. And that's just a rough, game-agnostic estimation. Real-world requirements are usually much more unforgiving.
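For a sense of scale, a quick back-of-the-envelope in Python (assuming "16K" means a 16384 x 16384 buffer per eye, which nobody has actually spelled out):

# Raw pixel throughput: 16K x 16K per eye at 240Hz vs. one 4K display at 60Hz.
vr = 16384 * 16384 * 2 * 240   # two eyes, 240 frames per second
pc = 3840 * 2160 * 60          # single 4K monitor at 60Hz
print(f"{vr / pc:.0f}x the raw pixel rate")   # ~259x

So 16x is very much a floor; on raw pixel rate alone it's two orders of magnitude, before foveated rendering claws any of it back.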
 

rrs

Member
By that time, why not just get a chip in my brain and surf the VR world like in all the cyberpunk things ever?
 
AMD is in for a rude awakening when they get on Intarwebz for the first time and learn about foveated rendering.

Sure, foveated rendering will help a lot, but screen pixel density is still going to be an important part of creating a convincing image in VR. You are still going to see the pixels of the screen when it's pressed up close to your face.
 

blastprocessor

The Amiga Brotherhood
Where is he getting 240Hz from? I thought Oculus did psychological studies that found ~88Hz was where people stopped unconsciously noticing flickering. 16K x 16K per eye sounds about right though for "true" VR.

So why are Sony delivering 120Hz? There are also 144Hz monitors.
 

ItIsOkBro

Member
Where is he getting 240Hz from? I thought Oculus did psychological studies that found ~88Hz was where people stopped unconsciously noticing flickering. 16K x 16K per eye sounds about right though for "true" VR.

Well 1 Intel hertz is about 3 AMD hertz so
 

tokkun

Member
he won't be happy until we can run games at 16K at 240Hz within his lifetime, saying that this would be the point where we achieve "true immersion that you won't be able to tell apart from the real-world".

It will take more than high resolutions and refresh rates to make it indistinguishable from the real world.
 

Chumpion

Member
Why would they be worried about this? Foveated rendering will work on all rendering hardware as soon as eye-tracking sensors are introduced to VR headsets.

I was just pointing out how full of shit AMD is when they pretend foveated rendering doesn't exist. Because that's where the >1 PFLOPS figure comes from.
 
Where is he getting 240Hz from? I thought Oculus did psychological studies that found ~88Hz was where people stopped unconsciously noticing flickering. 16K x 16K per eye sounds about right though for "true" VR.

Flicker-free and lag-free are two different things. There's going to be noticeable latency at 88Hz.
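Quick frame-time math on that:

# Per-frame budget at each refresh rate; higher Hz shrinks the
# worst-case wait for a fresh frame.
for hz in (88, 90, 120, 144, 240):
    print(f"{hz}Hz -> {1000 / hz:.1f}ms per frame")
# 88Hz -> 11.4ms, 240Hz -> 4.2ms

That ~7ms gap per frame is the kind of motion-to-photon slack people can still feel even when the flicker is gone.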
 

vpance

Member
16K by 16K was the same figure Michael Abrash mentioned years ago when he talked about what our eyes can actually perceive. As for framerate, he said that the low-persistence flicker rate will have to increase with the resolution bump, so 240Hz does not surprise me.

Apparently the fovea only resolves around 8MP, which is basically 4K. Too bad our damn FOV is so large, lol. If we want the full 220° FOV, the res of the screen needs to be massive.
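Rough math behind that, using the commonly cited ~60 pixels per degree for 20/20 acuity:

# Horizontal pixels needed to hold foveal acuity across the whole FOV.
pixels_per_degree = 60   # common approximation of 20/20 visual acuity
fov_deg = 220            # near-full human horizontal field of view
print(pixels_per_degree * fov_deg)   # 13200 -- same ballpark as the 16K figure

Which is pretty much where the 16K-per-axis number comes from.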
 

CariusD

Member
These slides ignore foveated rendering for a good reason. AMD can't bet on the tech maturing at any specific point in time.
 

WolvenOne

Member
So, we're talking about 500 PS4s duct-taped together?

....I'm going to need more tape.

Edit: for context, my money is on the next round of consoles being roughly 12x as powerful as the PS4. That'd be overkill for 1080p visuals, but not so much for 4K and VR.
 