Is this the future of consumer displays? Even if not, the development is fun to watch, and you can start at mozvr.com. If you're serious about learning about this project, you may want to read our interview transcript in addition to watching the video, because the transcript contains additional information.
Slashdot: So Rabimba, can you explain the system you're standing in front of here? It's a laptop with some unusual additions.
Rabimba Karanjai: Yep. What you see in front of you is an off-the-shelf Razer gaming laptop. The reason we're using it is that the VR video we're showing requires a fair amount of GPU power, and this laptop has a pretty good GPU; other than that, everything works on any setup. Attached to it is an off-the-shelf Oculus VR development kit, paired with some other hardware, for example a Leap Motion sensor, which does something cool that I'll show in a moment.
So right now you're seeing an immersive video. When I change the orientation of the Oculus Rift you can see how the view changes; you get a real-time, 360-degree view of everything around you. When somebody actually wears the headset, each eye sees its own image: the left eye sees this, the right eye sees this, from slightly different angles, and that slight difference is what makes it believable that you're really in the scene. Everything runs inside the browser. We also have a demo we're showing off called Rainbow Membrane: a virtual room with different walls, and a statue that looks a bit like Lincoln. That's my hand being tracked there, doing exactly what I'm doing; wearing the headset, I can actually touch the statue and play with it. So imagine wearing it and doing this yourself; you get a fully immersive experience.
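The "slight angle change" between the two eyes that Rabimba describes comes down to rendering the scene twice, once per eye, from two horizontally offset camera positions. A minimal sketch of that offset (the real demos use full WebGL view matrices; the 64 mm interpupillary distance here is a typical default, not a value from the demo):

```javascript
// Sketch: offset one camera position per eye to produce a stereo pair.
// An IPD (interpupillary distance) of 64 mm is a common default assumption.
var IPD = 0.064; // metres

// Given the viewer's head position [x, y, z], return the two eye positions.
function eyePositions(head) {
  var half = IPD / 2;
  return {
    left:  [head[0] - half, head[1], head[2]],
    right: [head[0] + half, head[1], head[2]]
  };
}
```

Rendering the scene from `left` and `right` in turn, into the two halves of the headset's display, is what produces the depth cue he mentions.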
Now, the reason we're doing this at Mozilla is to lower the barrier to entry, so that web developers and others can try out VR and generate more virtual-reality content without a steep learning curve. Right now this is running a Nightly build of Firefox, and you can go to mozvr.com right now and try it yourself. We have all these demos on GitHub, so you can run them on your own machine if you have an Oculus Rift. The demos are building blocks: people can grab the code, start hacking on it, and build their own things on top. The reason we're concentrating on this so much is that we see vast potential in it.
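From the developer's side, the Nightly builds he mentions exposed an early, experimental WebVR API for finding the headset. The interface names below (`navigator.getVRDevices`, `HMDVRDevice`) follow that early proposal and changed in later revisions, so treat this as an illustrative sketch rather than a stable API:

```javascript
// Sketch: pick a headset out of the device list the experimental WebVR API
// returns. The predicate is passed in so the selection logic stays testable
// outside a browser.
function pickHMD(devices, isHMD) {
  // Return the first device the caller's predicate identifies as a headset.
  for (var i = 0; i < devices.length; i++) {
    if (isHMD(devices[i])) return devices[i];
  }
  return null;
}

// In a VR-enabled Nightly of this era, usage looked roughly like this
// (guarded so the file also loads outside such a browser):
if (typeof navigator !== "undefined" && navigator.getVRDevices) {
  navigator.getVRDevices().then(function (devices) {
    var hmd = pickHMD(devices, function (d) {
      return typeof window !== "undefined" &&
             "HMDVRDevice" in window && d instanceof window.HMDVRDevice;
    });
    if (hmd) console.log("Found a headset");
  });
}
```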
Now, for this demo we're pairing the Oculus Rift with the Leap Motion sensor, which tracks my hand; those dots are the tracked points, and the infrared sensors tell us how far away my hand is, so we can work out how the interaction should behave. Imagine a virtual piano: wearing the headset, I could play it with my hands and hear the sound, which is far more immersive. One use case we thought about is a virtual classroom, where people from different parts of the world join a single room and interact with each other; imagine how immersive Udacity or other online classrooms would become. These are all things you can use this for, with no extra hardware or external software on the content side: Udacity and the others already have a web frontend, so they just have to plug this in and they're going. And we're hoping more browsers will follow suit and enable this kind of technology too.
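The virtual-piano idea maps naturally onto what the Leap's JavaScript library reports: each frame carries the tracked hands, with palm positions in millimetres relative to the sensor. The key layout below is an invented illustration (the key width, key count, and `playNote` are not from the demo):

```javascript
// Sketch: map a Leap Motion palm position to a virtual piano key.
// leap.js reports palmPosition as [x, y, z] in millimetres, with x = 0 at
// the centre of the sensor. The layout here is an assumed example.
var KEY_WIDTH_MM = 24;   // assumed width of one virtual key
var KEY_COUNT = 12;      // one octave

function keyForPalm(palmPosition) {
  // Shift x so key 0 starts at the left edge of the octave.
  var x = palmPosition[0] + (KEY_COUNT / 2) * KEY_WIDTH_MM;
  var key = Math.floor(x / KEY_WIDTH_MM);
  if (key < 0 || key >= KEY_COUNT) return null; // hand is off the keyboard
  return key;
}

// In the browser this would be driven from leap.js, roughly:
//   Leap.loop(function (frame) {
//     frame.hands.forEach(function (hand) {
//       var key = keyForPalm(hand.palmPosition);
//       if (key !== null) playNote(key); // playNote is hypothetical
//     });
//   });
```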
Slashdot: Now, besides the Leap Motion controller and the Oculus here, there's a camera on top. Can you explain what it does?
Rabimba Karanjai: That's the Oculus camera, which tells us how far away we are from it. Whether I'm here or further back, it helps build a model of the room, so the system knows how far that person is, and when I reach out it knows at which depth my hand should touch something. For example, in this demo my hand is out in front but not yet touching the statue; when I push it forward, it touches. That sense of how far things are comes from the camera. So we pair all these sensors, gather their data, and the application builds the whole virtual-reality environment for you.
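The "which depth to touch" decision he describes reduces to comparing the tracked hand's depth against the depth of the virtual surface. A minimal sketch, assuming depths in metres increasing away from the viewer and an invented 2 cm touch threshold:

```javascript
// Sketch: decide whether a tracked hand is "touching" a virtual surface.
// The depths would come from fusing the Oculus positional camera (head)
// and the Leap Motion (hand); here they are plain numbers along the depth
// axis, in metres, and the threshold is an assumed value.
var TOUCH_THRESHOLD = 0.02; // metres

function isTouching(handDepth, surfaceDepth) {
  // The hand touches once it comes within the threshold of the surface,
  // or has pushed past it.
  return handDepth >= surfaceDepth - TOUCH_THRESHOLD;
}
```

This is why the hand in the demo can hover in front of the statue without triggering anything until it is pushed forward.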
Rabimba Karanjai: There are. We talk to the Oculus Rift and the Leap Motion sensor through their own driver software, which is already installed. When I run this, what happens in the background is that a web server, which I started beforehand, serves everything from localhost for this demo. Web developers won't need that step, because when they put their content on the web the back-end server handles it. Beyond that, there are no other requirements. The only remaining piece is how the browser talks to the device drivers. For that we already have something called the VR extension, which is already enabled in this browser; if you go to mozvr.com you'll get complete instructions to download this Nightly and enable it, either by installing the add-on or by going to about:config and turning on the experimental preference. Once this moves out of the experimental stage, from Nightly to Beta and then to Stable, people won't have to do any of this.
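The about:config step he mentions amounts to flipping one preference in a VR-enabled Nightly. As a user.js fragment it would look like the following; the pref name matches the Firefox Nightly builds of this era but may differ in later releases, so check mozvr.com for current instructions:

```
// user.js fragment: enable experimental WebVR support in Firefox Nightly.
user_pref("dom.vr.enabled", true);
```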
Slashdot: So for somebody who already has the Oculus Rift, the attendant drivers, and the Leap Motion, there's no closed-source software that needs to be involved for them to start developing?
Rabimba Karanjai: Right. If somebody already has an Oculus Rift, and the device drivers that come with it, they can just go to mozvr.com and download this Firefox build. We have a GitHub page with all the demos we're showing today, source code included, so they can download those too; the readme explains how to deploy the server on their own machine, so they end up with the same setup I have here and can start playing with the demos immediately. Then, when they want to start tinkering, the demos are the building blocks: they can see how the interaction code works, how we calculate distance and everything, and from there work out how to build their own app.
Slashdot: Are any of your demos games?
Rabimba Karanjai: Not here, but we have web peer games at our other booth; they're doing games, though not with the Oculus Rift. This is the closest thing I have to a game, because it involves interaction, but it's not a full-blown game, just an interaction-based demo. We don't have a game. Probably we should, for future demos.
Slashdot: Probably by this time next year there will be.
Rabimba Karanjai: Yeah, maybe.
Slashdot: Alright. Okay. Thanks very much.
Rabimba Karanjai: Thank you.