
Oculus Rift DK2 Thread

Zaptruder

Banned
Just tried this out and out of all of the dancing demos this certainly takes the cake. It's really well made, with a nice sense of scale, and her dancing isn't janky like some of the other demos I've seen.

It still doesn't pull me in though. Like I see that "I'm in there," but I think it's more the content that's preventing me from really getting into it.

It might surprise you, but I'm a big Miku fan and I like a lot of the songs that have come from her. I think my biggest problem with these dance demos is more about where you, the user, are situated. It doesn't feel "right" to me that I'm at most two or so meters from her while she's dancing and singing; it just doesn't feel natural. I almost see it as if I'm a ghost and she's doing a practice performance, but she doesn't know I'm there, which bothers me I guess. lol

This is where I can see tracking making a big difference: while she's dancing, she occasionally gives you a glance.

There's a free movement version of that demo that allows you to move around the arena (and out of it if you want to). Q and E to control height. I think it's in the second link. Not sure (I originally downloaded it from another location - I googled those links for your benefit).

Also in that same version, Unity-chan does look at you while she dances (she tracks your movement and will even twist her neck backwards if you walk behind her :p).

It's possible to position yourself in an audience location, although she's much smaller... and while it's novel, and a kinda cool 'been there' sensation, you also quickly realize there's little point chilling too far away from her.

I just like that she moves all throughout that space, so you need head tracking to follow her movements completely, which justifies the use of VR over traditional presentation methods.
 

Soi-Fong

Member
There's a free movement version of that demo that allows you to move around the arena (and out of it if you want to). Q and E to control height. I think it's in the second link. Not sure (I originally downloaded it from another location - I googled those links for your benefit).

Also in that same version, Unity-chan does look at you while she dances (she tracks your movement and will even twist her neck backwards if you walk behind her :p).

It's possible to position yourself in an audience location, although she's much smaller... and while it's novel, and a kinda cool 'been there' sensation, you also quickly realize there's little point chilling too far away from her.

I just like that she moves all throughout that space, so you need head tracking to follow her movements completely, which justifies the use of VR over traditional presentation methods.

Oh, I didn't see the Free Movement version. I'll try that one out. Anyway, are you planning on sticking with the 980 Ti for the Vive and the Oculus?

I'm really fighting with myself over whether to stick through the Vive launch with my 780 Ti and wait for Pascal.
 

Zaptruder

Banned
Oh, I didn't see the Free Movement version. I'll try that one out. Anyway, are you planning on sticking with the 980 Ti for the Vive and the Oculus?

I'm really fighting with myself over whether to stick through the Vive launch with my 780 Ti and wait for Pascal.

I'll upgrade to Pascal when that comes out... the way I figure it, with DX12 and multi-GPU scaling, SLI and X-Fire are going to become significantly more flexible.

And I'd expect that VR games and game engines are going to take advantage of multi-GPU scaling as much as possible in the next year and beyond.

What I didn't want to do was wait 6-8 months for new vid cards to release while getting frustrated at launch VR titles. :p
 

dark10x

Digital Foundry pixel pusher
AltspaceVR is fascinating.

Had a weird "cyberpunk" feeling moment in there. Loaded up the "space theater", wandered into the main auditorium, and stumbled across a lone person standing in there. He was audibly speaking Chinese while surfing Baidu on a giant 100ft screen.

I didn't have a mic plugged in so I couldn't communicate. He attempted to talk to me, though, eventually switching to English, saying "You know Star Citizen? It great game!" and then leaving.

So I follow him to the desert where a floating screen sits between two palm trees playing back a Vice video. I look around trying to find the guy and stumble across him and some other person standing behind a shack trying on hats in front of a mirror talking to each other.

It's just basic "virtual chat room" stuff, of course, but when you throw it into VR the whole thing feels completely unreal.
 

viveks86

Member
AltspaceVR is fascinating.

Had a weird "cyberpunk" feeling moment in there. Loaded up the "space theater", wandered into the main auditorium, and stumbled across a lone person standing in there. He was audibly speaking Chinese while surfing Baidu on a giant 100ft screen.

I didn't have a mic plugged in so I couldn't communicate. He attempted to talk to me, though, eventually switching to English, saying "You know Star Citizen? It great game!" and then leaving.

So I follow him to the desert where a floating screen sits between two palm trees playing back a Vice video. I look around trying to find the guy and stumble across him and some other person standing behind a shack trying on hats in front of a mirror talking to each other.

It's just basic "virtual chat room" stuff, of course, but when you throw it into VR the whole thing feels completely unreal.

Fascinating indeed!
 
I'll upgrade to Pascal when that comes out... the way I figure it, with DX12 and multi-GPU scaling, SLI and X-Fire are going to become significantly more flexible.

And I'd expect that VR games and game engines are going to take advantage of multi-GPU scaling as much as possible in the next year and beyond.

What I didn't want to do was wait 6-8 months for new vid cards to release while getting frustrated at launch VR titles. :p

My plan is to wait it out with my 6950 until the Vive comes out, then get that and whatever $400 card I can.
 

zeioIIDX

Member
So I thought of an idea I'd love to see come to fruition one day. Not entirely sure what applications this can be used for apart from viewing landmarks and interesting locales in real time or military reconnaissance with an air/land drone.

Basically, someone should create a smoothly rotating 3D camera with the same field of view as the Oculus Rift.

It would be networked so the Rift can connect to the camera which is in a remote location.

When the user moves his/her head up, down, left, or right, the 3D camera mimics those movements.

Gives the illusion that the user is actually in another location in real time rather than simply viewing static images captured previously.

Binaural microphones on the camera will capture 3D sound and enhance the experience especially when the user wears headphones.

A downside is that the experience might be laggy since calculations have to be performed over a network and it probably won't be very close to 1:1 as far as the time it takes for the 3D camera to mimic your movements and send the real-time video back.
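To make the plumbing concrete, here's a minimal sketch of the sending side, assuming a hypothetical pan/tilt rig that accepts yaw/pitch/roll packets over UDP. The address, packet format, and read_head_orientation() helper are made up for illustration, not taken from any real SDK or product:

import socket
import struct
import time

CAMERA_ADDR = ("camera.example.net", 9000)  # assumed address of the remote rig
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def read_head_orientation():
    """Placeholder for whatever the HMD SDK reports: yaw, pitch, roll in degrees."""
    return 0.0, 0.0, 0.0

while True:
    yaw, pitch, roll = read_head_orientation()
    # Timestamp each packet so the camera end can drop anything that arrives stale or out of order.
    packet = struct.pack("!dfff", time.time(), yaw, pitch, roll)
    sock.sendto(packet, CAMERA_ADDR)
    time.sleep(1 / 60)  # roughly match the camera's update rate

The receiving end would just unpack the angles, ignore out-of-date packets, and drive the gimbal, while the video comes back over a separate stream.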
 
So I thought of an idea I'd love to see come to fruition one day. Not entirely sure what applications this can be used for apart from viewing landmarks and interesting locales in real time or military reconnaissance with an air/land drone.

Basically, someone should create a smoothly rotating 3D camera with the same field of view as the Oculus Rift.

It would be networked so the Rift can connect to the camera which is in a remote location.

When the user moves his/her head up, down, left, or right, the 3D camera mimics those movements.

Gives the illusion that the user is actually in another location in real time rather than simply viewing static images captured previously.

Binaural microphones on the camera will capture 3D sound and enhance the experience especially when the user wears headphones.

A downside is that the experience might be laggy since calculations have to be performed over a network and it probably won't be very close to 1:1 as far as the time it takes for the 3D camera to mimic your movements and send the real-time video back.

Maybe I'm misunderstanding, but we already have 360-degree cameras capturing video that you can then view on the Rift, moving your head freely.
 
Has anyone tried watching entire movies via the Rift in one of the virtual theater applications out there like VR Cinema 3D, Riftmax, Cineveo, or even LiveViewRift?

If so, what was the experience like? Did it really feel like you were in a movie theater? How much did it strain your eyes and neck?

I can't help but imagine that with the right setup (sound system, chair, etc.) it'd be the closest you could get to an authentic theater experience in your own home!
 

Appleman

Member
Has anyone tried watching entire movies via the Rift in one of the virtual theater applications out there like VR Cinema 3D, Riftmax, Cineveo, or even LiveViewRift?

If so, what was the experience like? Did it really feel like you were in a movie theater? How much did it strain your eyes and neck?

I can't help but imagine that with the right setup (sound system, chair, etc.) it'd be the closest you could get to an authentic theater experience in your own home!

I've done it in the virtual drive-in, but I found I preferred watching via MaxVR. It's pretty darn cool, but the resolution could be much better for that type of thing; the number of pixels that actually end up used for the virtual screen is easily sub-HD.
 

Zaptruder

Banned
So I thought of an idea I'd love to see come to fruition one day. Not entirely sure what applications this can be used for apart from viewing landmarks and interesting locales in real time or military reconnaissance with an air/land drone.

Basically, someone should create a smoothly rotating 3D camera with the same field of view as the Oculus Rift.

It would be networked so the Rift can connect to the camera which is in a remote location.

When the user moves his/her head up, down, left, or right, the 3D camera mimics those movements.

Gives the illusion that the user is actually in another location in real time rather than simply viewing static images captured previously.

Binaural microphones on the camera will capture 3D sound and enhance the experience especially when the user wears headphones.

A downside is that the experience might be laggy since calculations have to be performed over a network and it probably won't be very close to 1:1 as far as the time it takes for the 3D camera to mimic your movements and send the real-time video back.

The latency would be too much for the end user. Depending on distance, even at the speed of light and with no interchanges, you're still dealing with up to 100ms of latency (and with all the interchanges, a lot more).

Much better to just use a 360 degree camera and stream all of it to the user... at high latency, but then they can look around freely without latency (because the data is already there while they're turning their heads).

Future telepresence machines will likely have to incorporate that sort of 360 degree functionality.
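For anyone who wants the back-of-the-envelope numbers behind that (rough distances, not measurements):

# Speed-of-light floor on a head-tracked remote camera, ignoring routing and encoding.
SPEED_OF_LIGHT_KM_S = 300_000   # vacuum; in fibre it's closer to ~200,000 km/s
distance_km = 10_000            # e.g. a continent or two away

one_way_ms = distance_km / SPEED_OF_LIGHT_KM_S * 1000
round_trip_ms = 2 * one_way_ms  # head motion out, video back
print(f"one-way: {one_way_ms:.0f} ms, round trip: {round_trip_ms:.0f} ms")
# one-way: 33 ms, round trip: 67 ms -- already several times the ~20 ms
# motion-to-photon budget usually quoted for comfortable VR, before any real-world overhead.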
 

Soi-Fong

Member
I've done it in the virtual drive-in, but I found I preferred watching via MaxVR. It's pretty darn cool, but the resolution could be much better for that type of thing; the number of pixels that actually end up used for the virtual screen is easily sub-HD.

Just got the Gear VR, and at least with the Gear's resolution watching movies is plenty doable.
 
How does the Oculus experience compare to something like Google Cardboard? I know it might sound like a silly question, but I've never tried any serious VR devices. Is it like comparing dial-up modem to Google Fiber or something? I'm already pretty impressed by cardboard despite its flaws, so I've really started to consider buying a real VR set.
 
How does the Oculus experience compare to something like Google Cardboard? I know it might sound like a silly question, but I've never tried any serious VR devices. Is it like comparing dial-up modem to Google Fiber or something? I'm already pretty impressed by cardboard despite its flaws, so I've really started to consider buying a real VR set.
Dramatically better. Two key features missing from Cardboard: low persistence and positional tracking. Low persistence means the image suffers from far less motion blur, which goes a long way to convincing your eyes that you're looking at something real. Cardboard only has rotational tracking, so you have to turn your head in a precise way in order to maintain the illusion. Positional tracking allows you to move more naturally as you look around. It's not just about being able to lean in large motions; it's the subtle changes each time you shift in your seat that are translated 1:1, so it's much more comfortable overall.

But before you go and buy a DK2, consider that the consumer headsets will be just as much of a step forward once again. The Vive/Rift compared to the DK2 will be like the DK2 compared to Cardboard. I'd recommend waiting.
 
How does the Oculus experience compare to something like Google Cardboard? I know it might sound like a silly question, but I've never tried any serious VR devices. Is it like comparing dial-up modem to Google Fiber or something? I'm already pretty impressed by cardboard despite its flaws, so I've really started to consider buying a real VR set.
Also, the Oculus DK2 supports up to 75Hz refresh, but Google Cardboard gets 60Hz. This means the DK2 responds much more quickly to the tiny motions your head makes, which makes a large difference in how "real" it feels (as well as feeling better for people who get VR sickness). The final consumer Oculus Rift is going to support 90Hz and have a higher resolution and improved optics, so it will be a LOT better.
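To put rough numbers on that (plain arithmetic, nothing headset-specific), the refresh rate caps how long each displayed image lingers before it can be replaced:

# Frame budget at the refresh rates mentioned above.
for name, hz in [("Cardboard (phone)", 60), ("DK2", 75), ("consumer Rift", 90)]:
    print(f"{name:18} {hz} Hz -> {1000 / hz:.1f} ms per frame")
# prints 16.7 ms (60 Hz), 13.3 ms (75 Hz) and 11.1 ms (90 Hz) per frame

Less time per frame also means less time for the GPU to render each frame, which is part of why the recommended specs for the consumer headsets keep climbing.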
 

LaneDS

Member
Hadn't spent any time with the DK2 since last August, so I dusted it off and messed with a bunch of demos that I hadn't tried yet, including:

Windlands
Colosse
I Expect You to Die
Tatooine
Vox Machinae
Birdy Land
Welcome to Oculus

Listed Windlands first because it blew me away and felt like the first game I actually played to completion in VR. That's maybe more meaningful because I think I played uninterrupted for close to three hours and didn't have any problems with motion sickness (there's always some level of disorientation or a weird feeling in my stomach when you fall from great heights in VR though). Concept is simple but works really well (first person platformer where you can swing to certain spots using a left and right grappling/climbing hook) and was just a nice experience.

Colosse- short little non-interactive demo with a nice enough visual style. Not much to say here aside from decent sense of scale, and I disliked the whole "look at a certain part of the scene which may be behind you to carry on the scene" concept, or at least the implementation here.

I Expect You to Die- Need to spend more time with this, but it's a really neat tongue-in-cheek spy-themed first person adventure/puzzler. Mouse controls work well enough, but it's a game like this that really highlights the need for a better input method for VR.

Tatooine- From the posts in here, this is actually what I tried first when I hooked up the DK2. Kind of cool but overshadowed by some of the other things in this post. Official VR experiences will probably be very cool just to explore familiar (or unfamiliar) worlds in VR.

Vox Machinae- Awesome. I remember the thread saying as much a while back. It has the same feeling you get when you play Elite Dangerous of feeling like you're in a cockpit with working gadgets, and looking out from it really conveys the feeling of being 30 feet off the ground in some walking tank. Loved the idea of aiming with head tracking, and think it controls really well. If a developer ever wanted a good excuse to reboot Mechwarrior, man, this would be it.

Birdy Land- Cute little super-French on-rails experience. One of the better put together ones that I've seen.

Welcome to Oculus- neat demo to show off some of the tricks that VR can do that I'll probably show off to friends and family as an initial primer.

All in all, it was a good reminder of how amazing VR can be. Not sure if I'll be getting a CV1 or a Vive yet, but I'm still very sold on VR and am excited for what's ahead.

Any other demos/games I should probably check out? There's a lot to dig through online.
 

Soi-Fong

Member
Posted in the Vive thread but here's some impressions from me.

Finally got to try. Goddamn.. Vive is the real deal. SDE was there, but they've done some filtering to where in some demos I literally did not see the pixels.

For glasses people like myself: glasses do fit, at least my average-sized pair did, and the headset itself is definitely way more comfortable than the DK2 or the Gear. I can honestly say that after a minute, I literally FORGOT I had a headset on my face. It was that comfortable.

Expectedly, wires are gonna be a problem. I stepped on them a few times and I hit the wall once or twice. Durante's suggestion of wiring through the ceiling is looking mighty attractive after demoing the Vive.

Latency-wise... I can honestly say there was no latency whatsoever. And I'm talking about the headset and the controller. I'd honestly be surprised to hear of motion sickness with this hardware unless said hardware is struggling. As for the controller: very precise, it did not lose tracking, and I was moving around too. In Job Simulator, I dropped a few eggs and vegetables, and picking them back up from the floor to put in the pot, I never lost tracking and it was precise.

Demo-wise, you guys already know the demos so I won't rehash. I can honestly say, though, that hardware is not the only factor in presence; content and how your environment reacts to you matter too. An example of a simple but big presence killer for me was a part in Job Simulator. Usually when I put things in a microwave, I tend to slam the door closed. In Job Simulator this was missing: I tried to do a familiar action, but the door would just stop immediately after you let go. It's those small things that can take you out of the moment.

Moving onto the other demos, the Blu was alright but it's hard for me to be impressed now with scale after having been constantly awed by scale in Elite Dangerous in the DK2. The Portal demo did not disappoint. Writing, content, DETAILS!!! Oh god, the DETAILS!!!! Atlas was so intricately modeled. Valve really cares about their franchises and it shows. This was a production ready scene and screams Portal immediately at you.

For VR developers like myself, after trying the Vive, this is seriously AMAZING hardware. But I noticed, going through these experiences, that the more easily the brain gets tricked into believing what VR is showing you is real, the easier it is to tear through that fabric of reality. Case in point, me anally pointing out my issue with Job Simulator.

It seems even the people working the booth didn't realize this, swinging the headset around and such. It just makes me appreciate more what Valve and HTC have done with VR.

I'm a veteran of VR with the DK2 and I can confidently say my mind has been blown once again by this medium. I cannot wait for release!
 

LaneDS

Member
Was it the HMD itself that made the Vive so much more impressive, the VR-focused inputs, or (easy answer) both? Curious how much of a difference we can expect with the next generation of HMDs (following the DK2).
 
Tatooine demo is pretty cool, get to use a landspeeder across a fair bit of terrain, explore inside the Cantina. Decently populated with various Star Wars aliens and props.

https://share.oculus.com/app/tatooine
Oh yay. I'm not a fanatic of Star Wars, but this is a seriously well done demo, and open-top cockpits are just perfect for VR imo.

Posted in the Vive thread but here's some impressions from me.

...

I'm a veteran of VR with the DK2 and I can confidently say my mind has been blown once again by this medium. I cannot wait for release!
Thanks for the write-up; I can't get enough of these. I want to try the Vive badly, because the SDE in the DK2 is a killer for immersion for me most of the time.

There's a question I've been interested in asking ever since I built a to-scale digital replica of a room in my house for VR calibration, and now have some incredibly surreal false-memories from the experience.

I've had the DK2 for a long time, and so the physical act of putting it on and wearing it seems to have become too mundane for my brain to bother storing as a memory. My latest VR experiences, particularly in content that I've personally developed, are recalled as if they were real memories, or memories of vivid dreams where the VR experience is 'isolated' from the act of wearing an HMD and being in a physical space. When I first experienced VR, I feel that the excitement of wearing the hardware and being aware of it possibly became infused with the 'virtual' memory, so when I think back to my initial VR experiences, I have very strong memories of being aware of the real world at the time (probably helped by the fact that I had to concentrate on setting up the Rift correctly each time, and often had friends over and stuff).

Lately, I can recall virtual memories independent of my wearing the headset, sometimes where the virtual memory doesn't have an 'associated physical memory' - for example - I remember vividly standing in the middle of my half furnished room, examining a test object that I don't own in real life, but I can't recall that particular time that I wore the HMD. The virtual memory and real memory seem to have been un-linked, and the real memory trashed for being insignificant.

So my question is, how do you guys remember your VR experiences? For example, some of you who've recently demo'd the Vive: is your memory of the experience some kind of hybrid of being in a demo room with a headset on, and being in the VR space? Is there a separation of the two 'realities'? For people who have owned hardware for longer and the novelty of the hardware itself has gone, how are the memories of VR experiences when you recall them?

I'm seriously interested in this topic, but I haven't seen it brought up anywhere. I'm interested in how it will become in the near future, especially once VR is relatively mainstream and the novelty of VR has worn off enough. I'm especially interested in children's VR memories and how those false memories might be recalled as adults. If anyone has any personal insights, or knows of any links to studies, I think it would be a great talking point.
 

Zaptruder

Banned
Anyone here with a Gear VR?

Can someone download this render

http://render.otoy.com/forum/viewtopic.php?f=97&t=49330

And let me know how it looks? (You'll need to download that image and Otoy's ORBX media viewer on the Galaxy VR app.) I've done what I could on my end to make sure the image looks good.

Unfortunately a slip of the finger caused the render to get cancelled. I've tried to use Photoshop to edit out most of the most obvious fireflies, but there are still some patches of sparkliness... just want to make sure they look OK.

Only have a day left to rerender and resubmit...

Cheers
 

Alo81

Low Poly Gynecologist
I had a long post about the Vive in the Vive thread, but I figured it might be relevant here as well. I went with my SO who doesn't really play games much, so I thought it was an interesting contrast in experiences.

________________

Me and my SO arrived around 9:30. We ended up getting our demo in around 12:30 and we were done by 1.

It was my first experience with real VR (I've tried cardboard before) and it was just fantastic. My SO was hesitant to try it, but I convinced her, even though she wasn't stoked about it. I'll recount my experience as well as hers as she told me about it.

You go into the truck and find yourself in a decently sized, all-black, air-conditioned room. You put the headset on, they put headphones on you, then they hand you the two controllers. I'm not sure what I expected, but it was interesting to first put it on and truly see nothing. You're completely isolated from the outside world, and I was staring into empty blackness for a while because initially my headset wasn't displaying anything. They had to fix that, but when it finally was working I saw myself in a big white room with screens all around me that held different VR demo names. In each of my hands was a paddle that looked like the controller I was actually holding.

The left controller had a color wheel on it. You could move your thumb on the touchpad, select a color, then squeeze the handle, and a balloon would inflate and shoot out. The thumb touchpad felt really finicky, and assuming it's the same way on the Steam Controller, I hope it gets improved. It didn't seem accurate enough, or perhaps the demo itself wasn't coded well. A tip for anyone else who tries it: the balloon will shoot off in whatever direction the controller is pointing, even though the balloon fills straight up. So if you want it to shoot up and then try to hit it, make sure your controller is pointed up.

My SO noted that if after making a bunch of balloons you look up, you can actually see them all Edit: floating away above you which is a nice touch.

________________

After this, they load you onto a sunken wooden ship. You're underwater and can see fish and manta rays around you, as well as the broken ship itself. You can walk around the deck and it's a neat little experience. I remember looking up, seeing the sun peering through the water, and having this strange switch in how my body felt, almost as if there was a small pressure around me. Here is where I had one of my first really impressive moments, walking to the edge of the ship and peering overboard. Looking down gives that genuine feeling in your stomach of "I'm really high up" and it was very striking. Down below, overboard, was the wreckage from a plane as well, which was a fun little surprise. Here I had the thought "I wonder what my legs look like", followed by looking at where my legs should be, seeing nothing, and getting an uncomfortable feeling in my mind. I think this was my first time feeling that disconnect of "this isn't right" and it felt strange. It's manageable, but it's a really distinct and unique feeling I don't think I've ever felt before. The Lighthouse tracking is magnificent. It really is 1:1. Every step felt right. My hand movements felt right. Reaching out towards a fish and watching it swim away gave me a big grin. Peering and leaning and crouching all felt perfect. I was probably a bit too excited, because more than a few times I ended up bonking a wall because I would move from place to place too swiftly, eager to explore. I got a little too into the world I guess!

At this point, I heard a creature behind me, turned around, and saw a huge whale coming closer. As it got closer, the true size of it became apparent, and that was humbling. It really helps you appreciate how large these creatures really are. I've never seen a real blue whale up close, but assuming it's accurately sized, it seems like a real behemoth.

My SO exploring the ship ended up on the left side peering over (I looked over the right side) and said the coral reef looked Edit: Cool, and the colors matched the rest of the demo well. When she heard the whale she turned and saw it, and when it got close she said she got really freaked out. It got near and she looked at how big its eye was and said for the rest of the time she turned away because she was too scared of the whale. A really cool thing she noticed is that when she turned away, she could still tell where the whale was because of the way it sounded in her headphones. So she was really impressed by the positional audio, as well as the massive size of the whale.

Also, my SO said she heard more than a few *thunks* against the wall next to her, so I guess she could hear me bumping!

________________

The next demo was a Job Simulator, which was a cooking game. They start you out and say "Make this soup" and list 4 ingredients, a tomato, a mushroom, salt, and a bottle of sauce. You look down and you can see two blocky cartoonish hands, and you're in a cartoony kitchen.

When my SO played, she said she turned to the fridge, opened it, grabbed the tomato, mushroom, salt, and sauce - then placed the first two into the pot and poured the second two in. The ingredients poofed away, turned into a can of soup. She grabbed the soup and put it on the delivery tray where it got picked up and taken away. Then she went to the next demo.

When I played, I went to the fridge, grabbed the tomato and mushroom, and brought them to the cutting board. Then I grabbed an egg and threw it at the wall, which made it *SPLAT* and turn into a sunny-side-up egg on the spot. I picked up a rolling pin and started waving it all over the table, and stuff didn't react as much as I'd hoped. The soup pot didn't have any physics, so the rolling pin just sort of got stuck against it. I tossed it over my shoulder. There was a plate with bread on it next to the soup pot, which I flipped upwards, causing the bread to fly off onto a tray; the tray lifted up and presumably was taken to a customer. That was a good laugh. When I looked behind my shoulder, I found there was another countertop with other supplies on it, as well as a knife. I picked it up and held it in front of my ~VIRTUAL EYE~ because it's not a real knife so I can do that! It was cool holding it inches from my face, then slowly pushing it into myself, because obviously I could never do that for real. I put the knife down on the counter, and wanted to pick it up to slice with, which was a bit confusing. The default hand position you use to hold the remote is palms inward, and thus the in-game model is also palms inward. But when grabbing stuff, you simply reach forward and press the trigger, so there was this disconnect where I would reach forward to grab it, but now I was holding the knife sideways, so I had to actually think it through, flip the knife, then grab it to be holding it in a proper cutting position. I went and sliced down on the tomato, and the knife instantly shattered like glass into a dozen pieces, which was a laugh.

Here is where I had my second really uncomfortable experience, this time intentionally. I looked down, and I reached my hand through the table. My brain knows I SHOULDN'T be able to do that, but alas I was, and I felt this strange sensation in my wrist because of it. Again, it wasn't game breaking or anything, but just an odd sensation I'd never had before.

I was worried they might just skip the demo and move on, so I decided to finish up the recipe, throwing the ingredients into the pot. I started pouring some salt in, then I decided to just throw the whole salt shaker + sauce bottle in. It turned into a soup can, which I transferred to a delivery tray, which they took and delivered.

________________

Next up was a 3D painting demo. You see a colored line start to appear in front of you, and watch as it turns into a flower. You can move around the flower and see the full size and depth of it. You look down and your right hand is a paint brush, with your left hand holding all your tool options. It has 3 sides, one showing a color wheel, one showing a list of paint types (brush, rainbow, leaves, stars, that type of thing), and the third which I don't remember.

I drew a little bit adding to the flower which was interesting for a short amount of time. I wanted to change the color, which appears in your hand as a color wheel, with the wheel being the same shape and position as the touchpad on the controller. When holding the controller in the default position though, you aren't facing any single side. If you swipe on the touchpad, you can see it rotate, but as soon as your finger comes off it snaps to default position. I asked "How do I change color?" and they said you point to the color you want with your paint brush.

At this point, I turned the left controller so the color wheel was facing me, then I pointed the right controller at it and saw a little pointer that it would eyedropper the color from, and I said out loud "That is just brilliant." So I started painting things directly in front of and around my head. I ended up hitting myself in the face a few times, but no biggie~ It was cool just moving around in and through these paint strokes. After a short time, we moved on to the next demo.

My SO did not have a good time with this demo at all. She spent a long while struggling with getting the colors working, and trying to use the touchpad to select the color. She said the person explained how to change color but they didn't do it really well. After drawing a small amount, trying to change color to better suit the flower and being stuck trying to change colors for a while, the person moved them to the next demo. I think this is actually reasonable, and I think it was a bad idea for them to have the touchpad actually cause the tools to move. It was really confusing and if I had not asked and had the person tell me, I would likely not have figured it out for a while.

________________

The last demo was the Aperture VR demo. It is a fairly scripted demo that you can probably find detail of elsewhere, so I'll just talk about the parts that stood out to me.

It was very lightly interactive, and it would have been nice to have more stuff to just randomly fool around with in the room.

The moment when the service door opens and Atlas walks through was truly stellar. He is emitting sparks and moving in a very herky-jerky manner. I crouched down and got really close to the door and to Atlas, and every time he shifted forward I jumped back. It looked really real and impressive. Me jumping back was also reflexive, which was a nice surprise.

At this point you basically pull out Atlas's innards, which expand outward into a really long extension, something like 8 feet out, of all mechanical inner workings. They give you a lot of time to look at it and make it move. So I made it spin so all the machinery was moving like clockwork, then I went to the very tip and moved my head through it, actually putting my vision directly inside and through all of the machinery. It was insane. There was so much detail, and it looked so complex.

The parts eventually all fall from suspension to the floor, and the floor drops out beneath them. Where I was standing was on the last line of tiles still intact, and the depths beneath felt truly cavernous; I felt a slight worry. I saw the tiles beneath my feet start to bend downward and immediately jumped back.

This was probably the most impressive part for me. The side wall comes off and GLaDOS lowers down. She is MASSIVE! Oh my god! Was this section designed to scale? In the games you really don't appreciate just how large and intimidating GLaDOS truly is. I know it sounds strange, but the scale here impressed me more than the whale from the first demo, because I felt like I had context here. I've never seen a whale up close in person (nor GLaDOS, obviously), but I have experience with GLaDOS. I've played and enjoyed the Portal games and I had never realized her true scale. I think the really fine detail combined with the familiarity made this part the most impactful for me. It felt scarily intense. I loved it.

My SO did this demo also and was really impressed by it. The complaints she had were that she didn't really understand what to do because the instructions weren't clear enough. She didn't know what the charge station was where she was supposed to charge her controllers. She didn't know where the drawers were. She didn't know where the service latch was. Also, she said when Atlas entered she got really close because she wanted to read the text on his body, but it was too blurry to read up close. I had attempted similar and had the same problem. I don't know if it is a limitation of the Vive, a limitation of the stretched FoV, or simply a not-high-enough-resolution texture. Either way, that was a very minor issue, and she still thought it was awesome. When Atlas's parts fall, she wasn't looking at the floor, and when she eventually looked down she saw she was standing over the gaping hole and got really freaked out and jumped to land. Some real Looney Tunes shit right there lol.

________________

I'm really grateful that we ended up in rooms right next to each other, because while I was playing I did hear my SO talking to the guide person, laughing, and sounding really enthralled by the stuff she was doing and seeing.

My SO finished a bit before me so she was waiting outside, but when I met her out there she told me she was really, really impressed, loved it and wanted us to get one. She said the whale one was her favorite and she would love to just put a rift on before bed every night and fall asleep just getting to look around at new beautiful nature worlds. She was saying she'd love for there to be more ocean ones, a jungle one, a desert one, and just all kinds of nature. She is really into the idea of us getting an apartment with an extra room that we can use exclusively for VR. I was really surprised but she went from mostly disinterested to 100% sold on it and really excited. We're talking about Oculus Rift and HTC Vive and considering which we should try to get, and when.

It was an awesome experience and I really wholly recommend that if you get the chance you take it.
 

Jimrpg

Member
For people who have tried both the Oculus and the Vive, which would you get if you could only get one? Interested to know your thoughts as it stands now. It looks to me like the Vive provides the stand-up-and-walk-around experience as well as the sit-down one, plus it's backed by Steam; however, I'd expect the Oculus to be a very good experience too, considering they've been working on their product the longest.
 
Any other demos/games I should probably check out? There's a lot to dig through online.

Here's a few recent ones I tried that I thought were pretty good:

Trafalgar St Tunnel: A 3D-scanned tunnel with graffiti on it; really good immersion in this one, and it includes controls for switching the look of the tunnel from when the graffiti first began to its current form. Includes audio tours from different people, including artists that added their own work to the tunnel. You can really see the potential for future museum exhibits built for VR with this one.

https://share.oculus.com/app/trafalgar-st-tunnel

Neptune Flux: Underwater submersible game that's one of the better "cockpit" VR demos I've tried in a while, thanks in part to the large viewport you're given with which to see the environment. Lighting really complements immersion here, as distant reefs and objects can be made out with light shadowing. Definitely a "mystery of what lies beyond" feeling going on. Sort of an object-collecting and salvage-oriented style of gameplay with light story elements.

https://share.oculus.com/app/neptune-flux

Coelacanthe: Another underwater demo, but stationary, no movement, though I believe there is Leap support. Kind of a nice fishtank-style demo with different fish lazily swimming by you and lingering long enough for you to lean in and look at the details on them. The focus is a fish called the Gombessa which "is seen as the ‘transition animal’ from backboned fish to the earliest four-legged vertebrate land animals; and with its lobe fins and ‘primitive lung’, this fish is the longed-for living proof of early life’s transition from water to land, which took place 370 million years ago." So that's pretty cool.

https://share.oculus.com/app/coelacanthe

FlyInside: FSX and Prepar3D Oculus support that includes asynchronous timewarp, and boy, what a difference that makes. I don't get very good fps on a single 970 in either, but the timewarp gives a rock-solid 75 throughout. You can tell it's not as effective with sharp head turns, but it's still a great feature to have when performance isn't great.

http://flyinside-fsx.com/

edit: Okay, I gotta throw in one more recommendation, as this one I just tried is just fun as hell. Probably the first VR metal music video, for a band called Gigatron. Nothing special assets-wise, but it runs great and the set pieces and music make it worthwhile. Give it a whirl:

https://share.oculus.com/app/gigatron-hell-ride-song
 
Oh yay. I'm not a fanatic of Star Wars, but this is a seriously well done demo, and open-top cockpits are just perfect for VR imo.


Thanks for the write-up; I can't get enough of these. I want to try the Vive badly, because the SDE in the DK2 is a killer for immersion for me most of the time.

There's a question I've been interested in asking ever since I built a to-scale digital replica of a room in my house for VR calibration, and now have some incredibly surreal false-memories from the experience.

I've had the DK2 for a long time, and so the physical act of putting it on and wearing it seems to have become too mundane for my brain to bother storing as a memory. My latest VR experiences, particularly in content that I've personally developed, are recalled as if they were real memories, or memories of vivid dreams where the VR experience is 'isolated' from the act of wearing an HMD and being in a physical space. When I first experienced VR, I feel that the excitement of wearing the hardware and being aware of it possibly became infused with the 'virtual' memory, so when I think back to my initial VR experiences, I have very strong memories of being aware of the real world at the time (probably helped by the fact that I had to concentrate on setting up the Rift correctly each time, and often had friends over and stuff).

Lately, I can recall virtual memories independent of my wearing the headset, sometimes where the virtual memory doesn't have an 'associated physical memory' - for example - I remember vividly standing in the middle of my half furnished room, examining a test object that I don't own in real life, but I can't recall that particular time that I wore the HMD. The virtual memory and real memory seem to have been un-linked, and the real memory trashed for being insignificant.

So my question is, how do you guys remember your VR experiences? For example, some of you who've recently demo'd the Vive: is your memory of the experience some kind of hybrid of being in a demo room with a headset on, and being in the VR space? Is there a separation of the two 'realities'? For people who have owned hardware for longer and the novelty of the hardware itself has gone, how are the memories of VR experiences when you recall them?

I'm seriously interested in this topic, but I haven't seen it brought up anywhere. I'm interested in how it will become in the near future, especially once VR is relatively mainstream and the novelty of VR has worn off enough. I'm especially interested in children's VR memories and how those false memories might be recalled as adults. If anyone has any personal insights, or knows of any links to studies, I think it would be a great talking point.


This is incredibly interesting. I can't speak on it at all as I've still not tried modern VR let alone owned a headset, but I would love to hear some thoughts on this.

Thank you for taking the time to write that out.
 

Yowza, haven't heard that one before. I have wondered, once laymen get their hands on some rudimentary room-scanning software that turns their living room/PC room into a virtual space, whether these kinds of disconnects will occur; I had no idea people like yourself had already experienced it. I'm surprised you're getting that on the DK2, I would imagine you'd need the full 90Hz for that sort of effect, but then again I've had my own weird experiences with VR, though most of them in the form of affecting my dreams. Before consumer VR was a thing I'd have dreams thinking I was in some hyper-realistic simulation and would marvel at all the details in the world around me. Since using the real thing I have a meta-dream where I think I'm wearing an HMD and seeing a VR world even though the details are far higher than what's possible today. It's kind of a leaping-off point for lucid dreams in a way.

Most people invested in VR tend to dismiss the Magic Leap CEO, but one of the things he said has always stuck with me, and that's that VR "leaves footprints in the mind": basically, once you're out of it your mind isn't done with the experience; it's trying to reconcile real world vs. simulation and god knows what other loose ends it's trying to tie up elsewhere. VR is pretty weird territory for us imo and I think some people are going to have some pretty insane effects from long-term use.
 
Endless possibilities.

How about a biofeedback demo?

Edit, or even more up my street: music visualisers such as milkdrop, or my old favourite - the one on the PSX dinosaur demo disc.

The one thing I've never understood is why they don't fix the lag with these things. Every visualiser I've ever seen is always a microsecond or two behind the beat, which makes them kind of dull. I don't really get why crisp & confident visualisation isn't possible.
 

Blizzard

Banned
Endless possibilities.

How about a biofeedback demo?

Edit, or even more up my street: music visualisers such as milkdrop, or my old favourite - the one on the PSX dinosaur demo disc.

The one thing I've never understood is why they don't fix the lag with these things. Every visualiser I've ever seen is always a microsecond or two behind the beat, which makes them kind of dull. I don't really get why crisp & confident visualisation isn't possible.
Nitpicking comment, but microseconds should be both imperceptible to the listener and impossible to control deterministically on a PC.

Milliseconds are another story though. It might also be because playing very low-latency audio can be tricky.
 
The whole issue is way beyond my technical insight. Apologies for using terminology so lazily. My point is simply that I've always felt disappointed, and don't get why the software can't just look ahead in the track! Probably a result of being plugins for existing players, rather than built from the ground up. But the experiences could be potentially transcendent if done right, especially in VR.
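For what it's worth, the look-ahead idea is doable once the visualiser owns playback rather than tapping another player's output. A toy sketch of the scheduling side, where the onset detection and the latency figure are placeholders, not from any real plugin:

LOOKAHEAD_S = 0.5        # analyse this far ahead of what the listener currently hears
OUTPUT_LATENCY_S = 0.05  # assumed audio output latency to compensate for

def detect_onsets(samples, start_s, end_s):
    """Stub: return beat/onset times (in seconds) found between start_s and end_s."""
    return []

def schedule_visuals(track_samples, playhead_s, fire_event):
    # Scan audio the listener hasn't heard yet, then fire each visual at (or just
    # before) the moment the corresponding onset actually reaches their ears.
    for onset_s in detect_onsets(track_samples, playhead_s, playhead_s + LOOKAHEAD_S):
        fire_event(at_time=onset_s - OUTPUT_LATENCY_S)

A plugin riding on top of an existing player only ever sees audio as it's being played, which would explain why they always end up reacting a hair late.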
 

LaneDS

Member
Here's a few recent ones I tried that I thought were pretty good:

Trafalgar St Tunnel: A 3D-scanned tunnel with graffiti on it; really good immersion in this one, and it includes controls for switching the look of the tunnel from when the graffiti first began to its current form. Includes audio tours from different people, including artists that added their own work to the tunnel. You can really see the potential for future museum exhibits built for VR with this one.

https://share.oculus.com/app/trafalgar-st-tunnel

Neptune Flux: Underwater submersible game that's one of the better "cockpit" VR demos I've tried in a while, thanks in part to the large viewport you're given with which to see the environment. Lighting really complements immersion here, as distant reefs and objects can be made out with light shadowing. Definitely a "mystery of what lies beyond" feeling going on. Sort of an object-collecting and salvage-oriented style of gameplay with light story elements.

https://share.oculus.com/app/neptune-flux

Coelacanthe: Another underwater demo, but stationary, no movement, though I believe there is Leap support. Kind of a nice fishtank-style demo with different fish lazily swimming by you and lingering long enough for you to lean in and look at the details on them. The focus is a fish called the Gombessa which "is seen as the ‘transition animal’ from backboned fish to the earliest four-legged vertebrate land animals; and with its lobe fins and ‘primitive lung’, this fish is the longed-for living proof of early life’s transition from water to land, which took place 370 million years ago." So that's pretty cool.

https://share.oculus.com/app/coelacanthe

FlyInside: FSX and Prepar3D Oculus support that includes asynchronous timewarp, and boy, what a difference that makes. I don't get very good fps on a single 970 in either, but the timewarp gives a rock-solid 75 throughout. You can tell it's not as effective with sharp head turns, but it's still a great feature to have when performance isn't great.

http://flyinside-fsx.com/

edit: Okay, I gotta throw in one more recommendation, as this one I just tried is just fun as hell. Probably the first VR metal music video, for a band called Gigatron. Nothing special assets-wise, but it runs great and the set pieces and music make it worthwhile. Give it a whirl:

https://share.oculus.com/app/gigatron-hell-ride-song

Will be checking these out soon, thanks for the recommendations!
 

la_briola

Member
Just got mail.

Upcoming Oculus PC SDK 0.7 Compatibility Changes

tl;dr: 0.7 of the Oculus PC SDK will launch on August 20th, and introduces architecture changes that bring increased stability, performance, and a new ‘Direct Driver Mode’ developed in collaboration with NVIDIA and AMD.

However, as a result of these updates, the 0.7 runtime won’t support applications built with 0.5 or earlier, including all content built with Unity 4.x. This means the majority of existing Rift games and applications will need to be updated to 0.7 (or at least 0.6.0.1) to work with the new 0.7 runtime.

If you’re a developer with questions about updating to 0.7, please reach out to us through the Oculus Developer Forums. We’re here to help!
Oculus PC SDK 0.7 launching August 20th
As we prepare for the launch of the Rift, one of the key milestones is shipping 1.0 of the Oculus PC SDK. We’re making good progress on 1.0, and on August 20th, we’ll be releasing 0.7 publicly.


0.7 is a major release of the PC SDK: it includes architecture changes that bring increased stability and more reliable low-latency performance across recommended hardware through a new, more robust ‘Direct Driver Mode’.

However, as a result of these underlying changes, the 0.7 runtime won't support applications built with SDK 0.5 or earlier, including all current content built with Unity 4.x. This means the majority of existing Rift-ready games and applications will need to be updated to 0.7 or 0.6.0.1 to work with the new runtime.

We’ve outlined these changes, along with new information on how we’ll be handling updates to the SDK in the run up to 1.0, in more detail below.

Direct Driver Mode

As part of 0.7, we’ve removed ‘Extended Mode’, which commonly suffered from additional latency, and we’ve replaced it with a new ‘Direct Driver Mode’ that we’ve developed in collaboration with NVIDIA and AMD.


Direct Driver Mode is the most robust and reliable solution for interfacing with the Rift to date. Rather than inserting VR functionality between the OS and the graphics driver, headset awareness is added directly to the driver. As a result, Direct Driver Mode avoids many of the latency challenges of Extended Mode and also significantly reduces the number of conflicts between the Oculus SDK and third party applications. Note that Direct Driver Mode requires new drivers from NVIDIA and AMD, particularly for Kepler (GTX 645 or better) and GCN (HD 7730 or better) architectures, respectively.


Runtime and SDK Compatibility through 1.0

We’re targeting a November release for the Oculus PC SDK 1.0. Future updates to the runtime post-1.0 will continue to support games and applications built using 1.0 (or any later release).


However, until 1.0 is available, each new release of the runtime will only guarantee support for the previous version of the SDK. This allows us to more rapidly evolve the software architecture and API on the path to shipping 1.0. In the case of 0.7, the runtime support will be limited to applications based on 0.6 and 0.6.0.1.
Upgrading to 0.7
We realize that there are a significant number of games and applications based on older versions of the SDK, and we’re working hard to make the path to 0.7 as smooth as possible.

For Unreal developers, the current UE4 integration is based on 0.6 and we’ll ship a 0.7 integration alongside the core SDK.


Developers working with Unity 5.x can leverage the direct VR support built into Unity, which uses 0.6. We’re working with Unity to update Unity 5.x to 0.7 so that Unity 5.x users don’t have any additional work.

For developers that need to remain on Unity 4.x, we're releasing a 0.6.0.1-based plugin, and we’ll continue to provide basic support for Unity 4 with future SDKs. However, we recommend Unity 5.x for the best SDK support and development experience.

If you have your own engine and you’re running into difficulty upgrading to 0.6 or 0.7, we’re happy to assist directly; please reach out through the Oculus Developer Forum. We’re here to help!

Thanks, and we look forward to seeing you soon in the Rift!

-- The Oculus team
 
Wicked. Hopefully all of our favorite software gets updated in a timely fashion.

I'd still love to see a 0.6.2 with W10 support show up before then, but I'm thinking that might be unlikely now.
 
Gimme Windows 10 and give me Elite updating their implementation of it. >_<

Still wish Elite worked well in direct mode, instead of the lame Extended mode.
 
So will this new runtime support Windows 10 then? I already miss having access to the DK2. :(

Although not stated, I'm sure it will be. They've already said they're working as quickly as possible to make it happen. Ideally W10 support will come sooner, but 0.7 is a big project, so if it doesn't I'd understand. It would be a waste of resources for just a few weeks more of 0.6.
 