When I saw the frankly mind-blowing HoloLens demo on Microsoft's stage at E3 this year, I had a strong suspicion that at least some of it must've been smoke and mirrors. I could not understand how it was working. But having actually tried the E3 Minecraft HoloLens demo in its entirety at Minecon this weekend, plus a little more that there evidently was not time for onstage, I can verify that it's not bullshit. It really works.
The panning, the zooming, all the interactions with the world: it all functions just as shown. I was playing live with another player on a Surface tablet, through a demo that was definitely not scripted (I know this because I kept trying random things out, much to the consternation of my demonstrator). To begin with, after the prototype HoloLens was lowered onto my head, I was playing Minecraft with an Xbox 360 controller on a virtual screen that the HoloLens was displaying on the blank wall in front of me. I could change the size of the screen by saying things like "small TV" or "huge TV" (which was just impractically enormous).
The E3 HoloLens demonstration
When you look around the room, the HoloLens displays Minecraft blocks falling on all the surfaces around you so that you can see if it is calibrated correctly. There is a cursor in the centre of your vision that you can move around by looking; turn your gaze towards any flat, horizontal surface, say the words “create world”, and the Minecraft world rises out of it. It is, in the literal sense of the word, incredible – though, as with the Oculus Rift and other virtual reality headsets that I've tried, you get used to the astonishing things happening in front of your eyes remarkably quickly.
There is one important difference, which has been noted elsewhere: where the E3 demo gave the impression that the HoloLens display fills your full field of vision, in reality (at least for now, on the hardware I was using) it's more like there is a rectangular area in the centre of your vision through which you can see the virtual world. The headset itself has to be calibrated quite extensively: I had the space between my eyes measured and the headset fitted specifically to my head. I imagine that the final model will be different, but right now it's like looking through a pair of dark sunglasses. You can still clearly see everything else in the room, but the virtual stuff looks much brighter than reality.
When you're looking at that world, panning left and right by bringing your thumb and forefinger together in front of your face and dragging, you don't feel so much like a player as like a god. You're an omniscient overseer, watching blocky little Minecraft characters walk around, able to see the caverns below them and the landscape around them. It's a little intoxicating. Your actual influence over the world beyond manipulating your view of it is limited, but also godlike: you can change the weather, send lightning strikes down upon the land, conjure beacons of light to guide the player.
The word “snowstorm” creates a blizzard, during which the falling flakes slowly gather on the ground. You can dispel it in an instant by saying the word “sunshine”. The voice commands are proximity-based and generally I found that the HoloLens would only respond to commands that I issued – even with a nine-year-old boy in the corner shouting “lightning strike!” enthusiastically every few minutes. The voice recognition is impressive; I had to repeat one or two commands, but even with my posh Scottish accent it had no trouble. It's certainly a hell of a lot better than Kinect has ever been at understanding what I'm saying.
It's better than Kinect at recognising gestures, too, though admittedly this demo didn't involve much gesticulation. The combination of voice and gesture control felt much more natural than gesture alone. It had me imagining how much better Kinect could have been if you had been able to say menu options in order to select them, rather than fannying around with your arms outstretched.
I greatly enjoyed the experience of walking around the physical table to get a different perspective on the virtual world that was sitting on it. You can physically peer closer or tell the HoloLens to zoom in to get a better view of things. You can indeed look inside buildings, which is a bit of a mindfuck because your brain thinks that your face is about to collide with something.
This image is also kind of bullshit.
The only experience comparable to this first demo of HoloLens was the first time I tried Oculus Rift, except that after that I felt really sick. HoloLens is a gentler introduction to the concept of virtual reality (or, technically, augmented reality). I haven't tried HTC Vive yet, but being able to walk around in actual space makes HoloLens feel much more natural than other VR/AR tech I've tried. Even within the confines of Minecraft, exploring the amazing, detailed worlds that its millions of players have created from this new perspective is an exciting prospect.
I'll be greatly interested to see whether the final model of the HoloLens works as well as this. I have not-so-fond memories of the first demonstrations of Kinect, which was subsequently downgraded: the first version of it that I ever saw could still recognise finger movements, whereas the Xbox 360's retail Kinect camera had trouble with limbs. If HoloLens follows the trajectory of the Oculus Rift technology, though, the final version might be considerably better, not worse.
Both Minecraft and HoloLens are Microsoft properties that go far beyond Xbox and far beyond gaming. Like Microsoft's operating systems and office software, they are tools whose appeal and usefulness could extend far beyond our little corner of the entertainment world; Minecraft is already being used by the UN in city planning, and the potential non-gaming applications for HoloLens are vast. I expect that the two of them make up a large part of Microsoft's plan for the future.