
Limitations of VR - a virtual reality check

Krejlooc

Banned
Regarding Sim Sickness for anybody who needs a primer:

So the term for general VR sickness, sometimes erroneously called motion sickness, is actually an umbrella term for several similar sicknesses caused by different things.

Some people get vertigo from playing VR - technically not a fault of the tech (it's acting as intended) but rather poor software design. I can easily induce vertigo in a demo by placing someone high up, just as you would get vertigo in real life doing this. Which is to say, doing things in VR that would make you sick in real life will probably make you sick in VR as well. You can't expect to be Solid Snake in Metal Gear Solid: The Twin Snakes - part of the reason we don't do backflips left and right all over the place is that doing so, for an average person, would end in vomit. It's no different in VR. Some early demos do stupid shit like making you run at the IRL equivalent of 70 mph through rotating corridors, and people wonder why they get sick. With Half-Life 2 VR, we've actually spent significant amounts of time adjusting walking and running speeds and redesigning areas to cut back on crazy jumps you need to make - because expecting players to do that is unfeasible. We're not superhumans IRL.

Some experience sickness because they can perceive the latency between their head moving and the world updating - low-persistence strobing OLED has effectively solved this. By strobing the display with essentially black frames fractions of a second after each image scans out, we take advantage of a natural phenomenon our brains use to "fill in" the gaps of the missing pieces of animation. Hence, by simply not drawing anything to the screen at all for most of the frame, our brains do the missing work for us, which winds up feeling much more comfortable.
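To put very rough numbers on it (these are illustrative assumptions, not the spec of any particular panel), the display is only lit for a small slice of each refresh:

```python
# Illustrative numbers only: actual panel persistence varies by headset.
refresh_hz = 90.0
persistence_ms = 2.0                 # assumed time the panel is lit per frame

frame_ms = 1000.0 / refresh_hz       # ~11.1 ms between refreshes at 90 Hz
dark_ms = frame_ms - persistence_ms  # time the panel spends showing black

print(f"frame time: {frame_ms:.1f} ms")
print(f"panel lit:  {persistence_ms:.1f} ms ({persistence_ms / frame_ms:.0%} of the frame)")
print(f"panel dark: {dark_ms:.1f} ms - the brain fills in the gap")
```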

Some experience vestibulocochlear disconnect, where the fluid in their inner ear isn't moving the way their eyes say they are. This is essentially unsolvable at the moment and depends on your personal limits. There are two schools of thought on how to solve this:

1) electrical stimulation of your inner ear to make you feel like your cochlear fluid is actually moving (good luck getting a guinea pig for that - zapping your brain with an electric charge)

Or 2) actually make the person move IRL to cause harmony between their cochlear fluid and what they see.

In terms of vestibulocochlear disconnect, not all motions are created equal. Cardinal translation isn't bad - moving forward, backwards, strafing - we use parallax cues to figure out the expected motion and our bodies adjust, only feeling discomfort at the start of the motion, and mild at that. Rotation is the killer, so the solution is to either place the player in a swivel chair so they can physically rotate, or use an omnidirectional treadmill.

Our studies have uncovered an interesting phenomenon, however - expected motion severely limits your discomfort. To conceptualize this, we built a demo using positionally tracked hands. In the demo, you can reach out and grab the world by closing your hand, at which point the movement of your hand translates the world around you. In essence you are grabbing and shaking the world... and nobody gets sick. Play back the same translation without using your hands, and people get sick to the gills.

With that in mind, we're toying with a hand-operated method of locomotion where you basically "swim" through the environment without using your feet, and it's producing neat results.
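A minimal sketch of the idea, assuming a hypothetical engine hook that hands you the hand pose and grip state each frame (none of this is taken from a real SDK):

```python
# Hypothetical "grab the world" locomotion: while the grip is closed, the
# player moves opposite to the hand's motion, so pulling your hand back
# drags the world past you - self-initiated, expected motion.

def grab_world_locomotion(player_pos, hand_pos, prev_hand_pos, grip_closed):
    """Return the player's new position for this frame."""
    if grip_closed and prev_hand_pos is not None:
        hand_delta = [h - p for h, p in zip(hand_pos, prev_hand_pos)]
        player_pos = [p - d for p, d in zip(player_pos, hand_delta)]
    return player_pos

# Simulated "swim" stroke: the hand sweeps 10 cm backward over a few frames.
player, prev = [0.0, 0.0, 0.0], None
for hand_z in (0.30, 0.27, 0.24, 0.21, 0.20):
    hand = [0.0, 1.2, hand_z]
    player = grab_world_locomotion(player, hand, prev, grip_closed=True)
    prev = hand
print(player)  # the player has been pulled 10 cm along +Z by the stroke
```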

Still other people get sick because they can perceive the screen flicker - 90 Hz is where it becomes imperceptible pretty much universally. At 75 Hz on the DK2, I can perceive it in my periphery, but it doesn't make me sick. 120 Hz is preferred over 90, however, because 60, 30, 24, etc. divide evenly into 120, allowing for native-frequency playback of, say, standard television or movie content within the context of a virtual screen. Samsung can actually drive Gear VR up to 70 Hz, but since it has a focus on media playback, 60 Hz was deemed a better refresh rate because it could natively play back movies at their natural frequency.
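The divisibility point is easy to sanity-check (the rates below are just the common video frame rates):

```python
# Which display refresh rates can show common video frame rates natively,
# i.e. every source frame is held for a whole number of refreshes?
content_rates = [24, 30, 60]
for refresh in (60, 75, 90, 120):
    native = [rate for rate in content_rates if refresh % rate == 0]
    print(f"{refresh:>3} Hz -> native playback of {native if native else 'none'}")
```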

EDIT: Forgot one - some people got sick with DK1 simply because the tracking wasn't close enough to their IRL position, and we have strong proprioception in our head and hands (the ability to know where we are in 3D space without visual cues). Our heads would say we were moving into one position, and our eyes would say another. The biggest offender was the lack of any positional tracking in DK1 at all - only pitch, yaw, and roll, no X, Y, or Z. DK2 does X, Y, and Z across a forward 180 degrees. CV1 and Morpheus will do X, Y, and Z in full 360 degrees.

This is often something people miss - the need to match our proprioception as closely as possible is incredibly, massively important, and something that didn't really approach acceptable levels until the last 5 or so years. You hear it repeated often - we need sub-millimeter accuracy with our positional tracking, or else most people can tell that it's "off."
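One way to picture the DK1 vs. DK2/CV1 difference is just the shape of the tracked pose (the field names here are purely illustrative):

```python
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    """DK1-style tracking: orientation only."""
    pitch: float = 0.0
    yaw: float = 0.0
    roll: float = 0.0

@dataclass
class Pose6DoF(Pose3DoF):
    """DK2/CV1/Morpheus-style tracking: orientation plus position."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

# With only a Pose3DoF, leaning or crouching in real life changes nothing on
# screen, so your proprioception and your eyes disagree - hence the sickness.
lean = Pose6DoF(yaw=10.0, x=0.05, z=-0.02)   # a 5 cm lean is now representable
print(lean)
```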

In short, nobody can tell you if your girlfriend will get sick without knowing why she gets sick playing FPS games in the first place. She'll just have to try it. My sister-in-law got violently ill from DK1 for over a day, yet she could do Gear VR for hours. Apparently the low persistence solved her sim sickness.

EDIT: I typed this on my phone, I'm going through and correcting all the mistakes.
 

Octavia

Unconfirmed Member
I'm a bit confused. People are playing modded Skyrim fine with the Rift. That game still looks pretty good modded today, and it isn't at all optimized for VR.
 

Maggots

Banned
Virtual Boy is nothing like VR. It was like strapping a really shitty 3DS to your face.

Fatty Cell Phone is nothing like iPhone. It was like holding a really Heavy Brick to your face.


It was just a joke man chill out

still fixed
 

Buggy Loop

Member
The actual design side? Oh man, so much experimentation at the moment. I've tried dozens of methods trying to solve the problem of lateral rotation - when you rotate using an analog stick you encounter vestibulocochlear disconnect that makes many sick. I've tried all sorts of wacky implementations, and none of them really work. What I've resigned myself to is the idea that VR games need to be played in swivel chairs where people can actually rotate when they need to turn, so they don't get sick. But who knows - maybe someone will fix that problem in the near future. I would welcome a great solution with open arms.

I encountered that exact problem in Half-Life 2 (with the DK1, which does not help). I'm kinda sad that there has been no real solution since then, but I'm not surprised at the same time.

Did you work with Michael Abrash for Valve's VR btw?
 

Krejlooc

Banned
Yes, and that's just the start. Morpheus launch is still far away. It's not like you need 4xMSAA to make a great VR game.

Uh, MSAA is actually extremely important for virtual reality. Aliasing is hell in VR, it creates shimmers and sparks in your eye. Valve's VR design bible recommends using 8xMSAA if at all possible as a minimum.
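For a sense of why "if at all possible" is the operative phrase, here's rough arithmetic on what an 8xMSAA eye buffer costs in memory. The resolution is an assumed example (a ~1.4x-scaled 1080x1200 eye buffer), with uncompressed 32-bit color and 32-bit depth per sample:

```python
# Rough, assumption-heavy memory estimate for an 8x MSAA render target.
width, height = 1512, 1680        # assumed per-eye render target size
samples = 8
bytes_per_sample = 4 + 4          # RGBA8 color + 32-bit depth/stencil

per_eye = width * height * samples * bytes_per_sample
print(f"per eye:   {per_eye / 2**20:.0f} MiB")
print(f"both eyes: {2 * per_eye / 2**20:.0f} MiB")  # before any compression
```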
 

Krejlooc

Banned
People are playing modded Skyrim fine with the Rift

They aren't playing Skyrim "just fine." As an example, to get Skyrim running at all well in VR, you need to disable all shadows from the get-go. Getting Skyrim working well in VR is very hard to do.
 

Peltz

Member
Krejlooc, you have already mentioned playing games like F-Zero GX in VR.

I'm really not concerned about limitations when something like that is already possible. That is already good enough to prove satisfying to me and many other gamers.
 
Uh, MSAA is actually extremely important for virtual reality. Aliasing is hell in VR, it creates shimmers and sparks in your eye. Valve's VR design bible recommends using 8xMSAA if at all possible as a minimum.
Yeah, that's been my biggest issue with the resolution of the 3DS's screen. The 3D effect can cause intense shimmering without AA, and the worst part about shimmering in 3D is that it results in edges, particles, and thin geometry being displayed at screen-surface depth, since that's what happens when little details are only seen by one eye.
 

Krisprolls

Banned
Uh, MSAA is actually extremely important for virtual reality. Aliasing is hell in VR, it creates shimmers and sparks in your eye. Valve's VR design bible recommends using 8xMSAA if at all possible as a minimum.

Whatever... My point is that the VR (Morpheus or Rift) demos we already have running have fine graphics, and we don't need more for good VR games. What matters here isn't the graphics, but the scale / immersion / presence.
 

Krejlooc

Banned
I encountered that exact problem in Half-Life 2 (with the DK1, which does not help). I'm kinda sad that there has been no real solution since then, but I'm not surprised at the same time.

Did you work with Michael Abrash for Valve's VR btw?

I got to spend time with him at Dev Days, but no, I don't actually work for Valve. By the time I started talking with their VR team seriously, he had already left. Most of my team's communication has been through Joe Ludwig, Paul Kirschbaum, and Agusta Butlin.

But Cevat Yerli did contact us directly about employment at Crytek.
 

Afrikan

Member
So is EVE: Valkyrie run by magic? Because it looks a hell of a lot better than HL2.

no, mostly tricks.

this should handle it fine.

[image]



kidding people... :D
 

QaaQer

Member
I played Half-Life 2 with a DK2 - man, the opening to that game in VR is incredible. What got me was the woman holding onto the chain-link fence at the beginning of the train station. That was a sense of true immersion for me. Walking around, looking at other people just idling was mesmerising, in a weird way; you'll obviously know exactly what I mean since you've played with it, but people who haven't won't fully understand.

What is new becomes old. Seeing people experience VR for the first time is great; when it is the thirtieth time, not so much. After a certain point, it comes down to how interesting the game is and not the novelty factor of experiencing presence for the first time.
 

viveks86

Member
Anyone care to comment on this post? How would a game like Dying Light be playable in VR maxed on a single 970? Not the best looking game out there, but it still qualifies as a pretty complex game, right? I see that the framerate is not consistent, but the problems mentioned in this thread sound far more insurmountable with current hardware.

I dunno if GTA 5 would be super demanding; probably not gonna be able to do ultra. I was able to run Dying Light using the console-enabled DK2 support the devs had in there, which averaged around 60-70 maxed on a single 970. Then again, that's a lower resolution than Vive/Oculus CV1, plus an inconsistent framerate. It will definitely be a challenge, but I expect things like DX12, LiquidVR, and whatever Nvidia is calling their VR latency pipeline to help out. I'm optimistic.
 
So would we be willing to say, comfortably, that the PS4 could power a game like Wind Waker, at 60 Hz, in Morpheus's FOV with little issue? These are the kinds of graphics that I'm mentally preparing for.
 

Elsolar

Member
Uh, MSAA is actually extremely important for virtual reality. Aliasing is hell in VR, it creates shimmers and sparks in your eye. Valve's VR design bible recommends using 8xMSAA if at all possible as a minimum.

This seems counter-intuitive to me, since modern engines don't benefit very much from MSAA to begin with. Why waste so much computing time on 8x MSAA when it's only going to affect geometry edges? Most of the aliasing I see in modern games comes from shaders (specular aliasing in particular is getting nasty at 1080p) and alpha textures. Wouldn't someone developing for VR want to improve the overall sample count as much as possible?

So would we be willing to say, comfortably, that the PS4 could power a game like Wind Waker, at 60 Hz, in Morpheus's FOV with little issue? These are the kinds of graphics that I'm mentally preparing for.

I don't think 60 Hz is enough for VR; every company that's working on the tech has said 90 Hz is their bare minimum. And 90 Hz is extremely hard to pull off in an open-world scenario on a PS4. I think this is the point that needs to be driven home. Yes, of course you could make the game look like GLQuake and it would run at 4K 90 Hz on the GPU side, but mixing high frame rates with open-world game structures is extremely CPU-intensive. There's no guarantee that even the games of last gen will work at 90 Hz on PS4 without tangible design trade-offs.
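The arithmetic behind that is blunt (nothing engine-specific here, just frame budgets):

```python
# Time available for the entire CPU + GPU frame at common VR refresh rates.
for hz in (60, 90, 120):
    print(f"{hz:>3} Hz -> {1000.0 / hz:.2f} ms per frame")
# 90 Hz leaves a third less time than 60 Hz for the same simulation work,
# and VR renders two eye views on top of that - miss the budget and you drop
# a frame, which is far more noticeable in a headset than on a monitor.
```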
 

Krisprolls

Banned
So would we be willing to say, comfortably, that the PS4 could power a game like Wind Waker, at 60 Hz, in Morpheus's FOV with little issue? These are the kinds of graphics that I'm mentally preparing for.

Definitely. It will be able to do a lot more than that, actually; Wind Waker was originally a GameCube game, if I'm not mistaken. The Heist Morpheus demo already has far more demanding graphics than Wind Waker.
 

klaushm

Member
Certainly, VR game design practices are the wild west here. I can split this up into several parts of development, however. From a purely visual standpoint, a lot has, thus far, been centered on doing things that prevent people from getting sick. This, as an example, is where the utility of low-persistence displays came from - originally it was assumed that merely an astronomical framerate would be enough to eliminate sickness when, by stepping back and examining a bit more closely how our own eyes deal with motion, it was realized that by basically strobing the display we could reduce some sickness. But the original requirement for high framerates didn't exactly disappear.

The visual side of things - what needs to be done is pretty clear. The ways to reduce or solve those problems are also clear. It's sort of a holding pattern where what needs to happen is that these technologies need to get faster, cheaper, and more accurate. That happens through iteration. I see this all the time with augmented reality - I try telling people that it's likely several years out, but we can demonstrate the entire process right now. There's a difference between "being able to do something" and "being able to do something well enough," and that threshold for these visual technologies is when people stop getting sick.

The actual design side? Oh man, so much experimentation at the moment. I've tried dozens of methods trying to solve the problem of lateral rotation - when you rotate using an analog stick you encounter vestibulocochlear disconnect that makes many sick. I've tried all sorts of wacky implementations, and none of them really work. What I've resigned myself to is the idea that VR games need to be played in swivel chairs where people can actually rotate when they need to turn, so they don't get sick. But who knows - maybe someone will fix that problem in the near future. I would welcome a great solution with open arms.

As I assumed, most of this was unknown to me, hehe. I know you are pioneers who need to do a lot of tests and build the first code libraries. Motion sickness must be a pain in the ass just from the fact that the camera is free to move in any direction and you have to adjust the sensitivity - that part alone would be hard work enough. Reading what you said, the complete picture is far worse.

VR film in particular is an incredibly open canvas. All the rules of editing have sort of been reset. I like the demo Nuren by ViRT because it proved that jump cuts work in VR - many believed they were unworkable. Fox did a small demo of Wild that I thought was really cool - a problem with VR is that the audience can look in any direction, so how do you, for example, frame shots or make sure people are looking in the right direction? Fox's Wild threw those rules out: it was a single continuous take that was filmed in multiple segments. Things happened all around you. And depending on which direction you were looking, the program would cue up additional film snippets. So, for instance, you could make characters not appear at all by never looking where they would walk into frame, which sort of changed the story. In that regard, it feels kind of like a very advanced FMV game... but it's way more film than game.

Wow. That is an amazing piece of work.
I have to admit I thought it was a bit funny reading it at first. When you said "Fox's Wild threw those rules out (...)", I thought you were going to say they give the person a choice: look at what's important, or go fuck yourself.
They made you the center point of their movie's world. It is really interesting. The simplest explanation for this is old RPG books where you choose a path and need to jump to page X to see what comes next - but translated into a movie, immersed in it with VR, controlled only with your sight. If a person is not aware of this, they wouldn't even notice it. AMAZING.

Other directing techniques are lifted from theatrical plays - doing things like dropping lights or using visual and audio cues. Positional audio technology (as opposed to simply binaural audio) will provide another cue to hone in on.

The process of, like, directing VR games/applications/film is super open and experimental. I think everyone really knows where the hardware is going; it's just a waiting game. It's the software that's pretty exciting. And, again, some of these limitations will also drive design. I think we won't see many FPS games in general, to be honest, but we might see a rise of, say, ATV games, because, as weird a distinction as it is, riding a virtual ATV is less sickening than walking in a virtual land.

Thank you for your response. I will gladly wait for its development, and try to be part of it.
Right now it's very hard to be part of it...
 

ban25

Member
Micromanaging the Mono GC is something you have to do for any non-trivial game on Unity. That doesn't mean VR is automatically a problem in other engines.
 
Definitely. It will be able to do a lot more than that, actually; Wind Waker was originally a GameCube game, if I'm not mistaken. The Heist Morpheus demo already has far more demanding graphics than Wind Waker.

Oh definitely, I know that much lol. I'm more referring to dynamics of scale, specifically regarding things like geometry and draw distances and such. I imagine Morpheus in its current state could muscle a game like Wind Waker with no problems. The on-screen enemy count isn't terribly high, and the designs are simple.

I've always thought art was going to take precedence over raw power output for the earliest stages of VR. I agree that people need to get their expectations in check, but expecting pure garbage just seems like a false conclusion.
 

camac002

Member
but mixing high frame rates with open-world game structures is extremely CPU-intensive. There's no guarantee that even the games of last gen will work at 90 Hz on PS4 without tangible design trade-offs.

People just need to forget about open worlds in the beginning. That's what they should prepare for.
 

arter_2

Member
VR and game engines will have lots of limitations, but they can get optimized, like Epic's UE4 Morpheus 'Showdown' demo that runs at 60 fps on the PS4: http://www.roadtovr.com/epics-showd...-on-morpheus-after-ue4-optimizations-for-ps4/

The Unity engine seems like it's going to be left in the dust when it comes to VR, but time will tell.

http://www.neogaf.com/forum/showpost.php?p=165009894&postcount=103
http://www.neogaf.com/forum/showpost.php?p=165010596&postcount=106

I'm just saying I completely agree many people are going to need to get their expectations in check, especially if the Morpheus is for the PS4.
 
http://www.neogaf.com/forum/showpost.php?p=165009894&postcount=103

I'm just saying I completely agree many people are going to need to get their expectations in check, especially if the Morpheus is for the PS4.

To be fair, Sony instructed devs to target 60 FPS, not 90 :p So we'll see how that pans out. Many people who've tried the 60 FPS demos that were scaled to 120 Hz couldn't tell much difference from the native 120 Hz games, but they were different demos, so it's not exactly a fair comparison.
 

Bsigg12

Member
Tomorrow should be a good check for people. With Oculus getting out and showing off the CV1 and presumably the games and experiences that will be available at launch, we'll see what kind of graphics Oculus is expecting early in this VR marathon. You can then extrapolate from what is shown what the PS4 may be capable of.

It's exciting times right now. All the changes in development to really focus heavily on performance should lead to some interesting developments across the board.
 

arter_2

Member
To be fair, Sony instructed devs to target 60 FPS, not 90 :p So we'll see how that pans out. Many people who've tried the 60 FPS demos that were scaled to 120 Hz couldn't tell much difference from the native 120 Hz games, but they were different demos, so it's not exactly a fair comparison.

In my experience 60 is not fast enough. If you go with 60 you need it to be locked down; any sort of slowdown easily produces motion sickness.
 
In my experience 60 is not fast enough. If you go with 60 you need it to be locked down; any sort of slowdown easily produces motion sickness.

Definitely, and while it's not indicative of an actual game, the demos Sony was showcasing, like The Heist, were running at 60, scaled to 120 Hz, and people were blown away. So we'll see how well Sony's solution works, and whether that 60 FPS floor becomes a problem.

I think building games around a 120 Hz display helps Sony immensely; 60 scales well into it, and the timewarp becomes a constant.
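A toy sketch of the idea - render new frames at 60 Hz, but fill every 120 Hz vblank by re-presenting the last frame rotated to the newest head pose. The head motion and the "warp" here are stand-ins, not any vendor's actual reprojection:

```python
def head_yaw(t):
    return 30.0 * t   # pretend the head turns at a steady 30 degrees/second

display_hz, render_hz = 120, 60
rendered_yaw = 0.0
for vblank in range(6):
    t = vblank / display_hz
    if vblank % (display_hz // render_hz) == 0:
        rendered_yaw = head_yaw(t)   # a brand-new frame this vblank
        label = "fresh render"
    else:
        label = "reprojected "       # reuse the last frame, rotate it to now
    shown_yaw = head_yaw(t)          # the warp always targets the latest pose
    print(f"vblank {vblank}: {label}  scene from {rendered_yaw:5.2f} deg, "
          f"presented at {shown_yaw:5.2f} deg")
```

Rotation stays locked to the head every vblank; it's the positional and animation content that only updates at 60, which is why this works best when the renderer reliably hits that 60.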
 

Zaptruder

Banned
1. Krej likes to talk like he's *the* authority on VR. He's not - he's a VR developer, and developers aren't in lock step in their opinions and findings. He's well informed - but he's been wrong in the past, even while waxing lyrical like he has here.

2. VR is quite limited from a hardware and performance perspective if you must absolutely respect the latency minimization and highest frame rate ethos.

3. People can acclimatize to less than optimal VR solutions.

4. DX12 will make huge differences to PC-centric VR performance by significantly optimizing draw calls.

Which isn't to say that I disagree with Krej - but rather, it's not quite as dire as he makes it sound - at least not once we start getting into the real consumer era of VR (as opposed to this current opening dev era of VR).

The best VR experiences will be those that design for VR with all of its limitations in mind... and while I wouldn't expect an open-world VR game on par with GTA5, Witcher 3, or even Skyrim right off the bat, we can still expect a variety of visually stunning experiences - because we've already seen them.

To be fair, a lot of them are terribly optimized for most PCs at this point in time... but that's the beauty of PC development in many ways. We get to sample things that we really shouldn't be able to run, but can, because no one's going to tell you you *can't* just because it's 'only' running at 40-60 fps.

With that being said... I think the VR industry should establish a rating scale for VR immersiveness/motion sickness early on. It'll signal to consumers what good VR experiences are, it will reduce the incidence of people unintentionally feeling motion sickness, and it will provide clear guidelines to developers on the kinds of things needed for a great VR experience - where violating them sends your game to a lower rating standard.

On the flipside, some genres and mechanisms and interaction methods will necessarily cause a game to be less than optimal. Contrary to repeated wisdom on the matter - cockpit games do still induce vection and vestibular mismatch. The trick is the user can shift their motion frame of reference to the cockpit (similar to how we can keep our motion frame of reference to our room even when we're playing a video game with a lot of motion in it), although this isn't perfect and doesn't work for all users. As a result, cockpit games premised on the idea of movement (you're in a car, you're in a mech, you're in a plane - you're going somewhere fast) when your body isn't will never meet the top tier of VR motion sickness reduction.

But that's OK - as long as people know, it just means we can have a range of experiences in VR, ranging from 'no worries' to 'I hope you've got a cast-iron stomach' and everything in between.
 

Dr. Kaos

Banned
Krejl00c is not saying anything controversial or unknown among VR enthusiasts.

Yes, today, classic rendering pipelines are slow and framerate drops are tolerated. All that goes out the window with VR, which means...

...Longer development times for VR games.

Devs have to debug and optimize code, work harder on level design, object design, etc., to make sure the game is structured to have no frame drops 99.999% of the time; they have to work harder to make the game look pretty because of a drastically limited polygon/rendering time budget, and so on.

Then, you have all the non-technical issues. How to make sure the players are engaged and excited without making them sick. How to mitigate the low visor resolution for text or interactive elements. How to make the controls as intuitive, precise and robust as they need to be, etcetera.

In 10 years, we'll likely have foveated rendering figured out, giving us a huge boost in performance. Engines will have mature, rich VR-optimized pipelines for another boost in performance. These boosts will be necessary to drive the 16K OLED HDR headsets that will be sold then. Hand input will have been figured out, and people will wear gloves.

In 20 years, it'll be virtually impossible to tell the difference between headset 360 videos and the real world. Same thing with sounds. Headsets will be wireless, connecting at ultra high speed with your light bulb using pulses of colored light. Game graphics will still not be photorealistic, but they'll be close enough that people won't care. Input will only require wrist bracelets and some neural readers hidden in the headset.

In 30 years, the headsets will be contact lenses or glasses, and graphics can be made to look indistinguishable from the real world. Progress will come from tactile feedback, force feedback, and smell. Taste is going to be trickier to figure out. People will routinely hang out in VR, thanks to ubiquitous 10 Gbps connections and *slightly* reduced latencies.

In 40 years, the first sentient strong AI will appear. 10 years later, it's the Matrix and we're all in VR, all the time. Sweet.
 

FleetFeet

Member
http://www.neogaf.com/forum/showpost.php?p=165009894&postcount=103
http://www.neogaf.com/forum/showpost.php?p=165010596&postcount=106

I'm just saying I completely agree many people are going to need to get their expectations in check, especially if the Morpheus is for the PS4.

I think you might have overlooked the most important part of the news in that article, which I feel is that the optimizations to UE4 for VR also translate into gains for PM.

The great news about this? The enabling factor is general optimizations to UE4 on PS4, rather than specific optimizations to the Showdown demo, Hoesing tells me. That means an increase across the board for UE4’s performance on the PS4. This will be a boon for VR developers that hope to deploy UE4 projects cross-platform between Morpheus and other VR headsets.
 

arter_2

Member
I've ripped that demo apart technically; it is still a lot of hacks. Yes, it is great that they have made optimizations, but that doesn't take away from the fact that much of what makes that demo great is hacked in. It's completely on rails and fails to use many modern lighting and shadowing techniques. It is fine to post, but it is a disingenuous look at true VR.
 
Anyone care to comment on this post? How would a game like Dying Light be playable in VR maxed on a single 970? Not the best looking game out there, but it still qualifies as a pretty complex game, right? I see that the framerate is not consistent, but the problems mentioned in this thread sound far more insurmountable with current hardware.

Maybe Dying Light wasn't the best example. Out of curiosity I searched r/oculus for Dying Light and found reports of judder, problems with nausea due to performance, etc. It definitely wasn't running like butter for me, but it wasn't unpleasant either - certainly playable.

Alien Isolation scaled much better for me, though it's not the greatest comparison to GTA as it's not open world.

Wish someone had figured out how to get injected geometry 3D support working for GTA V by now, even though performance is usually poor using this method :/ I had a blast using TriDef on certain games like Sonic Generations and Metro Last Light. This was on DK1 though, and perf was still iffy.
 

PGamer

fucking juniors
Uh, MSAA is actually extremely important for virtual reality. Aliasing is hell in VR, it creates shimmers and sparks in your eye. Valve's VR design bible recommends using 8xMSAA if at all possible as a minimum.

Out of curiosity, is Valve's VR design bible publicly available? I'd be interested in reading it. I went through the Oculus VR best practices guide, but it's pretty limited.
 

FleetFeet

Member
I've ripped that demo apart technically; it is still a lot of hacks. Yes, it is great that they have made optimizations, but that doesn't take away from the fact that much of what makes that demo great is hacked in. It's completely on rails and fails to use many modern lighting and shadowing techniques. It is fine to post, but it is a disingenuous look at true VR.

Forget the demo for a second. Are you just going to ignore the claim that UE4 is seeing a benefit due to those optimizations?

Edit: Now when you say you ripped the demo apart technically, are you talking about the PC version running on CV1? Because that is the only version they have released, as far as I'm aware.
 
Krejl00c is not saying anything controversial or unknown among VR enthusiasts.

Yes, today, classic rendering pipelines are slow and framerate drops are tolerated. All that goes out the window with VR, which means...

...Longer development times for VR games.

Devs have to debug and optimize code, work harder on level design, object design, etc., to make sure the game is structured to have no frame drops 99.999% of the time; they have to work harder to make the game look pretty because of a drastically limited polygon/rendering time budget, and so on.

Then, you have all the non-technical issues. How to make sure the players are engaged and excited without making them sick. How to mitigate the low visor resolution for text or interactive elements. How to make the controls as intuitive, precise and robust as they need to be, etcetera.

In 10 years, we'll likely have foveated rendering figured out, giving us a huge boost in performance. Engines will have mature, rich VR-optimized pipelines for another boost in performance. These boosts will be necessary to drive the 16K OLED HDR headsets that will be sold then. Hand input will have been figured out, and people will wear gloves.

In 20 years, it'll be virtually impossible to tell the difference between headset 360 videos and the real world. Same thing with sounds. Headsets will be wireless, connecting at ultra high speed with your light bulb using pulses of colored light. Game graphics will still not be photorealistic, but they'll be close enough that people won't care. Input will only require wrist bracelets and some neural readers hidden in the headset.

In 30 years, the headsets will be contact lenses or glasses, and graphics can be made to look indistinguishable from the real world. Progress will come from tactile feedback, force feedback, and smell. Taste is going to be trickier to figure out. People will routinely hang out in VR, thanks to ubiquitous 10 Gbps connections and *slightly* reduced latencies.

In 40 years, the first sentient strong AI will appear. 10 years later, it's the Matrix and we're all in VR, all the time. Sweet.

Oh hai Kurzweil.

All of that does sound very cool, though; predicting the advancement of VR and input is fun.
 

viveks86

Member
Out of curiosity, is Valve's VR design bible publicly available? I'd be interested in reading it. I went through the Oculus VR best practices guide, but it's pretty limited.

This is not the design bible, but I found this quite insightful:
http://media.steampowered.com/apps/valve/2015/Alex_Vlachos_Advanced_VR_Rendering_GDC2015.pdf

Maybe Dying Light wasn't the best example. Out of curiosity I searched r/oculus for Dying Light and found reports of judder, problems with nausea due to performance, etc. It definitely wasn't running like butter for me, but it wasn't unpleasant either - certainly playable.

Alien Isolation scaled much better for me, though it's not the greatest comparison to GTA as it's not open world.

Wish someone had figured out how to get injected geometry 3D support working for GTA V by now, even though performance is usually poor using this method :/ I had a blast using TriDef on certain games like Sonic Generations and Metro Last Light. This was on DK1 though, and perf was still iffy.

Metro Last Light sounds like a good benchmark as well, if it worked. What about Crysis?
 

arter_2

Member
Forget the demo for a second. Are you just going to ignore the claim that UE4 is seeing a benefit due to those optimizations?

I really cannot speak to that, honestly; UE4 4.8 was just released today and I haven't had the chance to use temporal reprojection, so it remains to be seen. My issue with VR is that current CG pipelines, and the efficiency built into them, tend to break down with VR. I always see this demo brought up as a triumph for VR fidelity, but I have seen it many times, including on a prototype build of UE4 4.8 at GDC, and it really didn't run well. Like I said in the previous thread, there needs to be a fundamental shift in the way assets are created for VR. This has not happened, and until then many games are going to be hacked together like this demo or will be smaller in scope and experience. I have developed and published UE4 Oculus experiences, and I am not discounting that we have come a long way since UE4 4.4.
 
Metro Last Light sounds like a good benchmark as well, if it worked. What about Crysis?

Don't recall trying Crysis. Couldn't try Crysis 3 as TriDef didn't support DX11 at the time.

Obviously a gen behind in complexity, but the original BioShock gave me very good performance and was a very immersive experience in VR. I'd like to see devs really up their art design and world building for VR titles, as it does make a world of difference towards immersion.
 

Z3M0G

Member
Seems like a reality check for mobile and console VR, though that should always have been obvious.

Things like Colosse should be what devs at that tier look towards, rather than trying to half-ass photorealism:
[image]

This is the kind of stuff I expect this gen, and I'm very happy with it.

Would it be very taxing to play 360-degree video? Perhaps we will see the return of PSX-era-esque FMVs for major story sequences.
 

viveks86

Member
Don't recall trying Crysis. Couldn't try Crysis 3 as TriDef didn't support DX11 at the time.

Obviously a gen behind in complexity, but the original BioShock gave me very good performance and was a very immersive experience in VR. I'd like to see devs really up their art design and world building for VR titles, as it does make a world of difference towards immersion.

Neat... I always believed in the future of VR, but as a prospective consumer, I was super pumped for its present as well. My confidence was shaken quite a bit today. Somehow I'm feeling better after reading all these posts.
 

Bsigg12

Member
This is VR's saving grace. The presence factor more than offsets the downgrade in graphics.

As long as they maintain performance. Some Gear VR game jam games are nauseating when they start dropping frames. Even simple things can chug if not developed properly.
 

orioto

Good Art™
This is VR's saving grace. The presence factor more than offsets the downgrade in graphics.

I was trying a nice little piece of software yesterday where you can browse FFXI models at 1:1 scale, with the DK2. The way those pretty simple models come to life. Each little texture, polygon, edge has so much presence. Those little bats, omg... they are not even animated, they are just hanging there in the air, and they have 10 times more impact on you than any million-poly beast in any game, because they are fucking there and you can look at them as if you could touch them.
 

Lathentar

Looking for Pants
So I have a question about third-person character control using a VR headset. Most third-person games have the main character's movement be relative to the camera's orientation. With VR, is the head stable enough to keep a consistent control reference frame? I imagine most players are quick enough to make subtle adjustments to the controls as the camera moves. Is that the case?

I could imagine VR being a bit nauseating when using the right analog stick to rotate the camera around the player. It seems like Lucky's Tale has the camera on a spline or surface, and the VR input is just additive on top of that position.
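For what it's worth, the usual trick is to build the control frame from the headset's yaw only, ignoring pitch and roll, so small head wobbles barely move the reference frame. A quick sketch (a made-up function, not from any engine):

```python
import math

def move_direction(stick_x, stick_y, head_yaw_deg):
    """Map 2D stick input to a world-space XZ direction relative to where
    the player is looking, using yaw only."""
    yaw = math.radians(head_yaw_deg)
    forward = (math.sin(yaw), math.cos(yaw))   # world-space XZ forward
    right = (math.cos(yaw), -math.sin(yaw))    # world-space XZ right
    dx = stick_x * right[0] + stick_y * forward[0]
    dz = stick_x * right[1] + stick_y * forward[1]
    return (round(dx, 3), round(dz, 3))

print(move_direction(0.0, 1.0, 0.0))    # facing +Z, push forward -> move along +Z
print(move_direction(0.0, 1.0, 90.0))   # facing +X, push forward -> move along +X
```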
 