
Arstechnica hands-on Vive VR [3D controllers standard, no market fragmentation]

tuxfool

Banned
You've completely missed my point. A camera that could, in future, track accurately at any room scale without the need for separately purchased Lighthouses is a better solution for that scenario, or even for going outdoors under the assumption of wireless headsets. How Lighthouse scales into headsets or to going wireless/outdoors is entirely unclear, and from what we know about how it works, it won't work in such a way. That is what I mean by there being merit in what Oculus is doing with optical tracking solutions now, looking five years on.

The problem with camera tracking is one of scalability. To track larger areas you're going to need one of two things or possibly both.

1) The camera will need to scale in resolution and light sensitivity. The further away you go, the greater the probability of running into aliasing issues. To combat that you increase the resolution, but increasing the resolution greatly increases the computational load required to detect the constellation patterns (see the back-of-envelope sketch after this list). Adding more cameras also becomes very difficult, as they all need centralized communication, and no matter how good a solution you have you still run into occlusion problems.

2) The LED power on the headset will need to increase for larger areas/outdoors, as IR can easily be lost on hot days or under bright lights. This particular issue is also the main one faced by Lighthouse (but the LEDs are in a place where it is easier to accommodate more power).
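
As a rough back-of-envelope (the marker size, field of view and resolution here are made-up numbers for illustration, not Constellation's actual specs), you can see how quickly an LED's footprint on the sensor shrinks with distance:

```python
import math

def marker_pixels(distance_m, marker_mm=5.0, fov_deg=100.0, h_res=1280):
    """Approximate horizontal pixel footprint of an LED marker.

    Simple pinhole model: the field of view spans 2*d*tan(fov/2) metres at
    distance d, and the sensor divides that span into h_res pixels.
    """
    span_m = 2 * distance_m * math.tan(math.radians(fov_deg) / 2)
    return h_res * (marker_mm / 1000.0) / span_m

for d in (1, 2, 5, 10):
    print(f"{d} m: ~{marker_pixels(d):.1f} px")

# The footprint shrinks linearly with distance, so holding the same accuracy
# at 10 m needs roughly 10x the horizontal resolution needed at 1 m.
```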

Image processing algorithms certainly are useful, and their research is invaluable for the general use case; however, they are a worse solution if you just want to track a known object in three-dimensional space. Typically computer vision is applied when you need general image recognition and you cannot easily determine what object you want to track, or the object isn't of fixed geometry. Valve started in the same place as Oculus with their QR-code rooms, but their final solution is far more elegant.

I'm not sure how having a wireless headset changes things; the main problems remain the same.
 

Tain

Member
Really curious to see what will be available at launch for the Vive. That GIF is really, really impressive.
 

EVIL

Member
I mean I get how it will lead to crazy game experiences... but I really don't want to get up and walk around to play or experience games.

I'd have to move my PC to have the space, that's for sure.

It's pretty much the only way to fully eliminate sickness in VR.
 

kyser73

Member
Vive is definitely the ultimate form of VR going forward and will probably become the de facto standard on PC. Oculus have just sat on their hands, seemingly not doing all that much even with the huge financial backing of Facebook, and their strategy will do more damage to VR than help it...

I'm going Morpheus as my gateway into VR. I have faith in Sony, and there will be great fun software on there, and hopefully all the VR cinema / 360º video kind of experiences as well. If I ever manage to save the money for Vive, or when it becomes more and more financially accessible, I'm definitely getting the spare room in my flat equipped. The future is near!

Me too. I've actually got the required space to set Vive up, but with 2 still young kids (18mths and 5 years) I don't have the time or money to devote to a full PC rebuild and VR room creation (I'd want to do it properly with a ceiling cable manager and suchlike).
 

EVIL

Member
It still baffles me why all of these VR companies seem to just be stuck on using "controllers" instead of a glove.

Why is it that this is the one thing Nintendo was actually on the right track with back in the day, and yet all of the big players seem to be trying to avoid anything remotely near that design?

Gloves suck for the following reasons:

* Sensors are still too large to make gloves skin-tight.
* Gloves need to be powered.
* Gloves can irritate the hands (sweat, itch).
* To be comfortable, gloves need to be custom-made for your hand.
* You need to be able to wash the gloves.

It seems like such a great idea... but it's not the best solution right now.
 

kyser73

Member
Re: controllers. Isn't it still the case that, even with the additional movement allowed by Vive, you're still going to suffer from sim-sickness when you try and strafe?

From my very limited time using a couple of DK2s, it wasn't an issue when I was in a mech suit, but as soon as my representation was a standing human, any left-right movement I initiated without first slightly re-orienting my body resulted in a sensation of sickness, though not normal motion sickness. It felt as if it were coming from the base of my spine rather than my stomach; it was low-level but noticeable, and it persisted once I took the headset off.

How would a VR-specific controller resolve this?
 
No idea on the specifics, but whatever they do, it works.
lol Right on.

That's exactly what the sync pulse does: sync up the timer running on the headset. Yup, the scan pulse delay will be known (I doubt the speed of light enters into consideration in a 20 m radius).

I assume the native coordinate system of Lighthouse is polar, so the arc distance (azimuth) is determined by the time at which the scan beam sweep first reaches the headset after the sync. The distance from the pole (radius) is determined by the time between the individual beams sent as part of the entire scan (the laser is split into a series of beams using some sort of grated mirror). As you get further away from the pole (base station), each beam spreads out, increasing the time between individual beams.
Hmm. That might make sense. So rather than a single x-pass as shown in the GIF, there will be multiple passes on the x plane, and the timing between them determines the distance from the beacon? Yeah, that could work, if that's really what they're doing.
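
For the azimuth part at least, the timing maths is simple enough to sketch - this is just the principle as described above, with an assumed rotor speed rather than anything from Valve's documentation:

```python
ROTOR_HZ = 60.0   # assumed sweep rate: one full rotation every 1/60 s

def sweep_angle_deg(t_sync, t_hit, rotor_hz=ROTOR_HZ):
    """Angle of a photodiode from the sweep's starting position.

    t_sync: time the omnidirectional sync flash was seen (seconds)
    t_hit:  time the rotating laser plane crossed the photodiode (seconds)
    The rotor spins at a fixed rate, so the delay maps linearly to an angle.
    """
    return 360.0 * rotor_hz * (t_hit - t_sync)

# A diode hit 2.5 ms after the sync flash sits at 360 * 60 * 0.0025 = 54 degrees.
print(sweep_angle_deg(0.0, 0.0025))   # 54.0
```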

There are two lasers in each station, and each defines a separate coordinate system rotated by 90º. Thus the intersection of the two determines the coordinates in 3D space. I imagine the attitude of the headset is determined by the visibility of the beam to each photodiode.
Can you explain how the coordinate system works?

The stations don't need coordination between each other. Only the headset needs to know which is which. The signals are all easily separated, as they're all determined according to the sync pulse (which is probably what tells the headset which base station the scan beams are coming from).
But if every beacon emits identical sync pulses, and at more or less random times, how does the headset know which beacon emitted which pulse?


Could be as simple as having a channel selection dial on each base station. Just set each station to the next available channel as you set them up.

Have the station dials set to 1 and 2 out of the box, problem solved. People would only need to adjust them if they wanted to buy a bunch to cover larger spaces.
That would make sense, but it seems like kind of a 20th Century solution. Why not just give the beacons the ability to communicate? Just so you can say they don't need to? =/


Re: controllers. Isn't it still the case that, even with the additional movement allowed by Vive, you're still going to suffer from sim-sickness when you try and strafe?

From my very limited time using a couple of DK2s, it wasn't an issue when I was in a mech suit, but as soon as my representation was a standing human, any left-right movement I initiated without first slightly re-orienting my body resulted in a sensation of sickness, though not normal motion sickness. It felt as if it were coming from the base of my spine rather than my stomach; it was low-level but noticeable, and it persisted once I took the headset off.

How would a VR-specific controller resolve this?
It can't, really. The reason strafing makes you feel sick is that it's actually an incredibly unnatural movement. You don't really strafe in real life. Mostly, you just walk or run forwards, and in a pinch, you can clumsily shuffle backwards, or even to the side a bit.

One type of movement in VR that doesn't produce much-if-any sickness is something called ratcheting, which controllers like Move, Touch, or whatever the Vive controller is called can help with. Basically, you just reach out, grab yourself a big fistful of air, and pull yourself towards it. Imagine being in a zero-gravity environment, holding a handrail attached to the wall. You can move yourself up and down, forwards and backwards, or left and right. Now imagine the same thing, but instead of being restricted to the handrails, you can reach out and grab "nothing" instead. It seems like a bizarre way to move yourself around, but it's actually really comfortable, because there's no abstraction in the movement; your body is moving exactly as it should, based on the movement of your hand.
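
In code, the grab-and-pull idea basically boils down to keeping the grabbed point fixed under your hand each frame. A minimal sketch, with made-up names and plain 3-vectors rather than any particular engine's API:

```python
class Ratchet:
    """Grab-and-pull ('ratcheting') locomotion sketch. While the grip is held,
    the player rig is translated opposite to the hand's motion, so the bit of
    air you grabbed stays fixed under your hand - like pulling yourself along
    a handrail in zero-g."""

    def __init__(self):
        self.last_hand_pos = None    # tracking-space hand position last frame

    def on_grip_down(self, hand_pos):
        self.last_hand_pos = list(hand_pos)

    def on_grip_up(self):
        self.last_hand_pos = None

    def update(self, hand_pos, rig_origin):
        """Call once per frame; returns the new world position of the rig."""
        if self.last_hand_pos is None:
            return rig_origin
        delta = [h - p for h, p in zip(hand_pos, self.last_hand_pos)]
        self.last_hand_pos = list(hand_pos)
        # Move the rig opposite to the hand's motion.
        return [o - d for o, d in zip(rig_origin, delta)]
```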
 

tuxfool

Banned
Can you explain how the coordinate system works?

I don't know if they convert to Cartesian coordinates or if they remain polar. But consider horizontal polar coordinates given by an arc (theta_x) and a radius (r_x). Now you have another plane, oriented vertically, with theta_y and r_y. For this to work r_x = r_y, but theta_x and theta_y can be different. Thus you have three variables that you can use to describe a position in 3D space relative to a point (in this case the base station).

Of course the API probably combines all the coordinates determined from the base stations as a single set of Cartesian coordinates.
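
As a sketch of the geometry (and not necessarily how Valve's solver actually works): each station's pair of sweeps gives two angles, which pin down a ray from that station, and rays from two stations (or from one station to several sensors with known spacing) can then be intersected for a 3D fix. The conventions below are assumptions for illustration:

```python
import numpy as np

def sweep_ray(theta_h_deg, theta_v_deg):
    """Direction towards a sensor as seen from one base station.

    theta_h comes from the horizontal sweep, theta_v from the vertical one.
    Assumed convention: the station looks down +z, with tan(theta_h) = x/z
    and tan(theta_v) = y/z.
    """
    d = np.array([np.tan(np.radians(theta_h_deg)),
                  np.tan(np.radians(theta_v_deg)),
                  1.0])
    return d / np.linalg.norm(d)

def triangulate(p0, d0, p1, d1):
    """Midpoint of the shortest segment between two non-parallel rays.

    p0/p1 are station positions and d0/d1 ray directions, all expressed in
    one common frame; measurement noise means the rays never meet exactly,
    so the midpoint of closest approach is a crude but usable 'intersection'.
    """
    a, b, c = d0 @ d0, d0 @ d1, d1 @ d1
    w = p0 - p1
    denom = a * c - b * b
    t0 = (b * (d1 @ w) - c * (d0 @ w)) / denom
    t1 = (a * (d1 @ w) - b * (d0 @ w)) / denom
    return ((p0 + t0 * d0) + (p1 + t1 * d1)) / 2
```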

But if every beacon emits identical sync pulses, and at more or less random times, how does the headset know which beacon emitted which pulse?
As I said, I don't really know, but I could see the sync beam that acts as a preamble to the laser sweep having a randomized pulse train. It could also be some other method.

That would make sense, but it seems like kind of a 20th Century solution. Why not just give the beacons the ability to communicate? Just so you can say they don't need to? =/
KISS principles apply. Why make it complicated?
 
I don't know if they convert to Cartesian coordinates or if they remain polar. But consider horizontal polar coordinates given by an arc (theta_x) and a radius (r_x). Now you have another plane, oriented vertically, with theta_y and r_y. For this to work r_x = r_y, but theta_x and theta_y can be different. Thus you have three variables that you can use to describe a position in 3D space relative to a point (in this case the base station).

Of course the API probably combines all the coordinates determined from the base stations as a single set of Cartesian coordinates.
Ah, okay. Yeah, the coordinates all being beacon-relative makes sense, but I wasn't sure how that information would be useful, given that you have no idea where the beacon actually is. But I suppose you can just look at where the headset is when the system comes online, and arbitrarily define that location as 0,0,0,0º,0º in your "standard" coordinate system.
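
That part is trivial to sketch, at least for position - record the first station-relative fix and report everything relative to it (orientation would be handled the same way; omitted here):

```python
class PlaySpace:
    """Define the headset's first reported position as the origin."""

    def __init__(self):
        self.origin = None

    def to_play_space(self, station_relative_pos):
        if self.origin is None:
            # First fix after the system comes online defines (0, 0, 0).
            self.origin = list(station_relative_pos)
        return [p - o for p, o in zip(station_relative_pos, self.origin)]
```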

As I said, I don't really know, but I could see the sync beam that acts as a preamble to the laser sweep having a randomized pulse train. It could also be some other method.
Yeah, that makes sense.

KISS principles apply. Why make it complicated?
To me, making the beacons communicative was the simple solution. lol

Anyway, thanks for the hand-holding. Much appreciated. <3
 

Stiler

Member
Gloves suck for the following reasons:

* Sensors are still too large to make gloves skin-tight.
* Gloves need to be powered.
* Gloves can irritate the hands (sweat, itch).
* To be comfortable, gloves need to be custom-made for your hand.
* You need to be able to wash the gloves.

It seems like such a great idea... but it's not the best solution right now.

This kind of technology already exists, and it's been used in motion capture work for some years, see:

https://www.youtube.com/watch?v=yX-6vwRCMnI
https://www.youtube.com/watch?v=BdCDOhD_7aA

Look at those videos: the gloves are nowhere near "bulky" at all, and on top of that the bottom is mesh and lets air in, so your hand doesn't get that hot or sweaty.

Also, they are powered via a controller you strap to your arm, and the data is transmitted (via Bluetooth or such) to the PC, so there are no wires to get tangled in or anything to keep you from moving however you want, and you can generally hot-swap the battery when it's low.

I could see Valve or Oculus easily adopting this kind of thing if they put work into making it work with their sensors and with how they track the headset - with Lighthouse in Valve's case.
 

kyser73

Member
It can't, really. The reason strafing makes you feel sick is that it's actually an incredibly unnatural movement. You don't really strafe in real life. Mostly, you just walk or run forwards, and in a pinch, you can clumsily shuffle backwards, or even to the side a bit.

One type of movement in VR that doesn't produce much-if-any sickness is something called ratcheting, which controllers like Move, Touch, or whatever the Vive controller is called can help with. Basically, you just reach out, grab yourself a big fistful of air, and pull yourself towards it. Imagine being in a zero-gravity environment, holding a handrail attached to the wall. You can move yourself up and down, forwards and backwards, or left and right. Now imagine the same thing, but instead of being restricted to the handrails, you can reach out and grab "nothing" instead. It seems like a bizarre way to move yourself around, but it's actually really comfortable, because there's no abstraction in the movement; your body is moving exactly as it should, based on the movement of your hand.

Yeah, I've seen that and the 'snap' rotation process (which mimics the head movement of a spinning ballerina or figure skater), but until there's a way of combining head movement with motion for directed movement, the issue will remain.

Still, I guess if you make someone believe they are on a hoverboard you could probably get round it!
 

Raticus79

Seek victory, not fairness
That would make sense, but it seems like kind of a 20th Century solution. Why not just give the beacons the ability to communicate? Just so you can say they don't need to? =/

Cheap, simple, reliable. The communication idea could work too of course - they could have a sensor on them so that each lighthouse could sense when it was receiving beams on the same channel it's sending, choosing a random new channel if a conflict is detected. That just requires some more logic to be built into the chip in the lighthouse. I'll be curious to see how they handle it.
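
Something like this tiny state machine would do it - purely speculative of course, with an assumed pool of channels:

```python
import random

CHANNELS = range(1, 9)    # assumed pool of selectable channels

class Lighthouse:
    """Sketch of the conflict-avoidance idea above (not Valve's firmware):
    if a station detects another station transmitting on the channel it is
    currently using, it hops to a different, randomly chosen channel."""

    def __init__(self):
        self.channel = random.choice(list(CHANNELS))

    def on_foreign_beam(self, incoming_channel):
        if incoming_channel == self.channel:
            # Conflict in the same room: pick any other channel.
            self.channel = random.choice(
                [c for c in CHANNELS if c != self.channel])
```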
 

Durante

Member
I really am a believer in lighthouse as a room-scale positioning technology. It's extremely fast, precise, and just plain neat. I hope they can also make it sufficiently cheap to build.

Cheap, simple, reliable. The communication idea could work too of course - they could have a sensor on them so that each lighthouse could sense when it was receiving beams on the same channel it's sending, choosing a random new channel if a conflict is detected. That just requires some more logic to be built into the chip in the lighthouse. I'll be curious to see how they handle it.
They actually stated that the system is easily extensible to more lighthouses, which would favour the interference-sensing idea I guess.
 

tuxfool

Banned
Wouldn't that require some level of coordination/communication? If every beacon needs to be on a different frequency or whatever, how does a given beacon know which frequency it's supposed to use, or which are unavailable? Once the beacons decide what they're gonna do, how does the system know which one is which?

Cheap, simple, reliable. The communication idea could work too of course - they could have a sensor on them so that each lighthouse could sense when it was receiving beams on the same channel it's sending, choosing a random new channel if a conflict is detected. That just requires some more logic to be built into the chip in the lighthouse. I'll be curious to see how they handle it.

This video answers that question:

https://youtu.be/xrsUMEbLtOs?t=371

For some reason I forgot about it when I first watched it.
 
This video answers that question:

https://youtu.be/xrsUMEbLtOs?t=371

For some reason I forgot about it when I first watched it.
Ha! I forgot that part too. So, the setup they had at GDC had all of the beacons synced with each other via wire, but you can also make them operate independently by tuning them to different timings and/or frequencies, probably with a knob on the beacon itself, as Raticus speculated.

Mystery solved!
 

Vilam

Maxis Redwood
I couldn't disagree more with their assessment that packed in motion controls is a good thing. That's far removed from the VR experience I'm looking for. I want already great games made more immersive via head controlled camera movement... not inherently limited motion control scenarios.
 

tuxfool

Banned
I couldn't disagree more with their assessment that packed in motion controls is a good thing. That's far removed from the VR experience I'm looking for. I want already great games made more immersive via head controlled camera movement... not inherently limited motion control scenarios.

That's great. You can use a gamepad then.

In case you need to point and aim in VR, you get the massive immersion of moving thumbsticks.
 

Vilam

Maxis Redwood
That's great. You can use a gamepad then.

In case you need to point and aim in VR, you get the massive immersion of moving thumbsticks.

Having the camera mimic my head movement is immersive - it's controlling an aspect of a game the exact same way I would in real life. Holding a motion controller in my hand is not, and doesn't mimic the way I interact with anything in real life. There's a huge difference between the two. I'll have a much better time with a traditional controller thanks.
 

Nzyme32

Member
Having the camera mimic my head movement is immersive - it's controlling an aspect of a game the exact same way I would in real life. Holding a motion controller in my hand is not, and doesn't mimic the way I interact with anything in real life. There's a huge difference between the two. I'll have a much better time with a traditional controller thanks.

Actually you'd be wrong on that, in a way; that is part of the reason why the Vive demos were so well received. It is extremely intuitive, since the controllers match your movements 1:1, unlike typical motion controllers, and using the controls they provide (in what has been shown so far) works as people expect. The buttons and touchpad mimic the actions you would do normally - squeeze to squeeze/hold, trigger for index-finger use, thumb pad for thumb use (used together to pick up objects). In the virtual world the controller changes to represent whatever needs to be used, and the thumb pads can show your thumb position 1:1 when you use them.
 

LilJoka

Member
I haven't followed everything closely, but I know I want a Vive.
Have most people started to dismiss Oculus? Do they have any upper hand that we know of?
 

tuxfool

Banned
I haven't followed everything closely, but I know I want a Vive.
Have most people started to dismiss Oculus? Do they have any upper hand that we know of?

All those exclusives they bought out, maybe?

Oh, also a major get in terms of Xbox One streaming.
 

hesido

Member
I couldn't disagree more with their assessment that packed in motion controls is a good thing. That's far removed from the VR experience I'm looking for. I want already great games made more immersive via head controlled camera movement... not inherently limited motion control scenarios.

I can't make sense of this at all. A standard of interaction is required at the very least, especially when you can't even see the controller in your hand. Devs can rely on you having that specific controller. It makes life easier for everyone involved.
 

Pie and Beans

Look for me on the local news, I'll be the guy arrested for trying to burn down a Nintendo exec's house.
Having the camera mimic my head movement is immersive - it's controlling an aspect of a game the exact same way I would in real life. Holding a motion controller in my hand is not, and doesn't mimic the way I interact with anything in real life. There's a huge difference between the two. I'll have a much better time with a traditional controller thanks.

This only makes sense if you don't have arms and hands I guess?
 

Nzyme32

Member
All those exclusives they bought out, maybe?

Oh, also a major get in terms of Xbox One streaming.

I really have to debate how much of a get Xbox One streaming for the Rift actually is. In theory the Vive or any SteamVR headset can do exactly the same thing, since the functionality is a part of Windows 10. All that needs to be done is to run the program as a non-Steam app, or via a virtual desktop feature, so it can be viewed in the SteamVR BPM.
 

tuxfool

Banned
I really have to debate how much of a get Xbox One streaming for the Rift actually is. In theory the Vive or any SteamVR headset can do exactly the same thing, since the functionality is a part of Windows 10. All that needs to be done is to run the program as a non-Steam app, or via a virtual desktop feature, so it can be viewed in the SteamVR BPM.

I was being facetious (demonstrated by the italics). The streaming thing is dumb as hell; it is XB1 streaming munged together with a tech demo. I could see people wanting to stream XB1 games to things like tablets, but to a computer capable of running Oculus?
 

Nzyme32

Member
I was being facetious (demonstrated by the italics). The streaming thing is dumb as hell; it is XB1 streaming munged together with a tech demo. I could see people wanting to stream XB1 games to things like tablets, but to a computer capable of running Oculus?

Yeah exactly, but I would keep in mind that the specs for doing such things could actually be really low. Palmer Luckey said they have experiences running on integrated graphics, and we have already seen the same "cinema" feature on GearVR - since the Xbox One streaming is just a video feed into that simplistic room, it may very well be possible on low end PCs.

However, ultimately this feature is pointless to anyone other than Xbox One users, and I'd take a guess they are going to be a real target going forward. So I suspect there will be some sort of Oculus capability for the Xbox One (insert PR blurb about it also being a Win10 device), and maybe even a separate lower-end Rift for the Xbox One and Win10 at a lower price, as well as a high-end one later bundled with the Touch controllers. Slam some Minecraft on there as an exclusive and they have an instant market.
 

Vilam

Maxis Redwood
Actually you'd be wrong on that, in a way; that is part of the reason why the Vive demos were so well received. It is extremely intuitive, since the controllers match your movements 1:1, unlike typical motion controllers, and using the controls they provide (in what has been shown so far) works as people expect. The buttons and touchpad mimic the actions you would do normally - squeeze to squeeze/hold, trigger for index-finger use, thumb pad for thumb use (used together to pick up objects). In the virtual world the controller changes to represent whatever needs to be used, and the thumb pads can show your thumb position 1:1 when you use them.

I'm not sure what you think 1:1 control is, but holding a controller that you wave around, press buttons, and use gestures on to mimic things you'd be doing in real life isn't it.


This only makes sense if you don't have arms and hands I guess?

By all means, strap some Vive controllers to your hands and see if you're able to get through your normal daily life. Holding an input device isn't analogous to how we approach real life. Until we have something that can accomplish that, I'd much prefer holding the next best thing - a standard controller.
 

tuxfool

Banned
By all means, strap some Vive controllers to your hands and see if you're able to get through your normal daily life. Holding an input device isn't analogous to how we approach real life. Until we have something that can accomplish that, I'd much prefer holding the next best thing - a standard controller.

This implies that a standard controller is the best thing for all applications. It isn't. It is an adequate thing for some applications; there are other controllers for other applications.

This can clearly be seen in the mouse/controller disconnect. Personally, I think controllers are garbage for FPSes and can't stand their inaccuracy; I don't like them for flight/space sims, and nobody attempts complex management/RTS/sim interfaces with a controller.

This is likewise true for dedicated VR controllers.
 

Krejlooc

Banned
I'm not sure what you think 1:1 control is, but holding a controller that you wave around, press buttons, and use gestures on to mimic things you'd be doing in real life isn't it.

1:1 refers to a linear mapping of internal VR space to an equivalent real-world space that never wavers.

These things don't use gesture tracking.

By all means, strap some Vive controllers to your hands and see if you're able to get through your normal daily life. Holding an input device isn't analogous to how we approach real life. Until we have something that can accomplish that, I'd much prefer holding the next best thing - a standard controller.

By all means, try getting through normal daily life using only your thumbs on a video game controller.
 

Vilam

Maxis Redwood
By all means, try getting through normal daily life using only your thumbs on a video game controller.

I never suggested you could, yet others here are tripping over themselves to rush to the defense of how much more immersive motion controls will be. Until there's a solution that doesn't require holding something, I prefer to stick with the control method that works best over the widest variety of games - a traditional controller, or a keyboard/mouse. That's going to be the best way to experience the types of games that I'm interested in playing with VR.
 

Krejlooc

Banned
I never suggested you could, yet others here are tripping over themselves to rush to the defense of how much more immersive motion controls will be. Until there's a solution that doesn't require holding something, I prefer to stick with the control method that works best over the widest variety of games - a traditional controller, or a keyboard/mouse. That's going to be the best way to experience the types of games that I'm interested in playing with VR.

Your suggestion is literally "the perfect is the enemy of the good"
 

kyser73

Member
I never suggested you could, yet others here are tripping over themselves to rush to the defense of how much more immersive motion controls will be. Until there's a solution that doesn't require holding something, I prefer to stick with the control method that works best over the widest variety of games - a traditional controller, or a keyboard/mouse. That's going to be the best way to experience the types of games that I'm interested in playing with VR.

Have you tried VR?

How do you know this?
 

Vilam

Maxis Redwood
Have you tried VR?

How do you know this?

Yes I have. While I haven't tried it in combination with motion controls, I can quite comfortably say that I've disliked motion controls with traditional game experiences that I've tried previously, and nothing about combining them with VR is going to change the aspect of them that I dislike.
 

Seiru

Banned
Yes I have. While I haven't tried it in combination with motion controls, I can quite comfortably say that I've disliked motion controls with traditional game experiences that I've tried previously, and nothing about combining them with VR is going to change the aspect of them that I dislike.

I feel as if you are vastly underestimating the huge difference between motion controls on a 2D screen with the Wii/PS3, and sub-millimeter accuracy motion controls within VR. One is clunky, the other feels like an extension of your body.
 

Armaros

Member
I feel as if you are vastly underestimating the huge difference between motion controls on a 2D screen with the Wii/PS3, and sub-millimeter accuracy motion controls within VR. One is clunky, the other feels like an extension of your body.

If it's not perfect we should take more steps backwards.
 
[Image: Lighthouse tracking volume test - controllers tracked well outside the marked play area]

http://www.reddit.com/r/oculus/comments/39r9s5/lighthouse_tracking_volume_irl/

Each of the lighthouses has a 120-degree horizontal and vertical range of projection (similar to the FOV of a GoPro set to wide with a 4:3 aspect ratio), so they can actually see beyond the orange square. Every location where you see the controllers in the image is tracking.

I think this is as far as we're going to go with measuring the limitations of the setup; we're developers, not a techblog. The tracking volume shown in that picture works well beyond what we actually need it to, and we wanted to share that with everyone on here since there has been a lot of misinformation about the tracking space.

I'm sure some journalists will get their hands on one soon and measure it all down to the millimeter to find out EXACTLY what the range of everything is (though I think it will act differently depending on a lot of different factors), but this setup works for us!

and a post by Alan Yates of Valve:

http://www.reddit.com/r/oculus/comments/39p6wa/its_a_bit_premature_to_judge_the_quality_of/cs5ptcn

You should also remember the volume is 3-dimensional, so it is a bit more complex than that. You can do crazy things like put a base station on a high ceiling pointed straight down. The overlap areas are where you have redundancy for occlusion, controllers, etc but each complete independent base station frustum still offers tracking, just higher probability of drop-outs. The recommendations in the developer edition set-up guide are just that; recommendations. They don't represent ultimate performance capabilities of the system. Engineering 101 is never to run systems at the edge of their performance envelopes anyway, you should always have a spare 3 dB or so up your sleeve.

I expect most users will put their base stations or cameras in convenient places, on shelves or tables, especially in the initial few hours after unboxing. Some will leave them there, some will install them more permanently. Some will dedicate spaces to VR and optimise them to their budget and taste. All of these configurations will work fine. If there is a Bermuda Triangle of crappy tracking in one corner people will avoid it or fix their setup if it interferes with the games they like to play. I expect tools for mapping tracking performance and orienting playing area will evolve quickly.

Tracked objects are also not isotropic, some track better in some orientations than others, we specifically design the sensor constellations to distribute performance as evenly as possible. In general it is extremely difficult to put a performance figure on triangulating tracking, it varies over the volume and is non-isotropic. RMS figures are often misleading without specified conditions of measurement. It is actually quite hard to measure tracking performance because it is a 6-dimensional field.

Also, the current base station implementation is not the only way to implement a Lighthouse. Each rotor can have essentially 360 degree azimuth coverage, the current sweep is limited to about 124 degrees by the housing. The constraint on elevation angles with respect to the rotor is largely optics and housing too, but there are ways to extend that to almost 180 degrees, making a base station basically omnidirectional. Lighthouse receivers have the capacity to work with different base station designs, we have specifically made them as generic as practical to allow rapid improvement in base station architecture in the future.

Ultimately tracking, Lighthouse or Constellation, is not the limiting factor on VR development. Developer imagination is. Everything we (Oculus and Sony too) are doing in the hardware space is to give developers the tools they need to enable awesome content. There will be a lot of experiments, most will fail. No one really knows how to use full volumetric entertainment yet.
 
Some more Yates posts:

They actually use a pair of CR123A rechargeable lithium batteries.

It was a joint effort, more than half the developer edition hardware Valve was responsible for supplying, including many sub-components hTC integrated. Almost all the tech in the Vive system is Valve R&D. But there is no question hTC is vastly more experienced at making beautiful quality hardware on impressively fast turns than Valve currently is, that's why they are great partners.

You absolutely need to have both Lighthouses mounted for it to work? Ummm, NO. They have feet, you can just place them on a table or shelf - but yes you can mount them up high looking down for best performance, just like a camera.

All I was saying is you don't NEED to mount base stations and you don't NEED two, any more than with cameras. Lighthouse's advantages are mostly scalability, embeddability and privacy. I'd argue it is also easier to set up with less wiring and a better choice for tracking self-contained mobile devices until natural feature tracking matures.

Holy crap, I'm impressed by that sword!

And that controller still tracking up on the mezzanine level...

The Vive release base stations will be at least as capable as the developer edition. The exact configuration shipped is up to hTC, but 120x120 degrees & 5 metres is a pretty good bet.
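
Taking the figures quoted above at face value, a rough point-in-frustum check for a single base station would look something like the sketch below; the station-centred frame and the forward-looking orientation are assumptions for illustration:

```python
import math

def in_station_frustum(point, fov_deg=120.0, max_range=5.0):
    """Rough check that a point (x, y, z), expressed in the base station's own
    frame (station at the origin, looking down +z), falls inside its coverage,
    using the ~120x120 degree / ~5 metre figures quoted above."""
    x, y, z = point
    if z <= 0 or math.sqrt(x*x + y*y + z*z) > max_range:
        return False
    half = math.radians(fov_deg) / 2
    return abs(math.atan2(x, z)) <= half and abs(math.atan2(y, z)) <= half

print(in_station_frustum((1.0, -0.5, 3.0)))   # True: well inside the cone
print(in_station_frustum((4.0, 0.0, 1.0)))    # False: ~76 degrees off-axis
```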
 

Mononoke

Banned
As someone who didn't even know the Vive was out in 2015, I'm 100% on board with Vive now. Appreciate all the info posted here. I am so excited for VR.
 

mrklaw

MrArseFace
The trick with Vive likely launching with the 3D controllers is that they get the best of both worlds.

Developers can develop on the assumption that everyone has those, so if a game really needs that motion tracking, no problem - you don't even need to code a fallback for an Xbox controller. Also, if your game works well on a normal controller, I'd argue you could make that mandatory too - enough people will have some form of controller, or can easily buy one (whether 360, XB1, DS4 or other), that you can rely on that too.

Oculus, meanwhile, are shipping as the 'standard' controller something so many people already have that it will be redundant for a high percentage of likely early adopters. And simultaneously, by not shipping their Oculus Touch, they ensure that no developers will be able to rely on it.
 

Business

Member
Maybe Microsoft should have upgraded the Bone pad with motion tracking now that they've updated it with the 3.5mm jack; it would have helped their partnership with Oculus.
 

Man

Member
Maybe Microsoft should have upgraded the Bone pad with motion tracking now that they've updated it with the 3.5mm jack; it would have helped their partnership with Oculus.
Would have been a fragmented implementation on the Xbox side of things.
 