The Void, or something like it, is going to make a billion dollars.
Recently, I had the opportunity to actually fly out to Utah and see the experience that they’re developing. The Void is visibly unfinished, half broken, and one of the coolest, craziest things I’ve ever experienced.
Of all of my experiences in VR, this was the one that came closest to filling the hole in my heart left by the realization, at twelve years old, that I would never have a Holodeck.
The Void’s office sits in a sparsely populated industrial area a little outside of Provo, Utah. I arrived at about 1:30 in the afternoon, spotting the logo out the window of my Uber. Aside from the little logo in the window, there isn’t much to distinguish it from a dentist’s office or insurance company.
Through the door, I was greeted by a handful of engineers and artists, talking in the break room. After I introduced myself, one of the company’s founders, Curtis Hickman, took me aside while we waited for them to calibrate the tracking system. We talked briefly about the Void’s path to commercialization, and he showed me several 3D-printed prototypes — one of the Void’s HMD and one of the gun, which contains a solenoid and a vibrating motor for force-feedback.
He also showed me their original prototype, dubbed “Dark Helmet”, which was a massive, ungainly construction of wires, chips, black duct tape, and mesh.
I wasn’t allowed to take a picture of it, unfortunately, as the Void doesn’t want to show it off until they have the polished final version to compare it to. Curtis compared it to showing off the chubby “before” picture without yet having the trim “after” shot.
He also showed me the dummy 3D-printed version of the final VR helmet, dubbed the “Rapture”, that they used for the trailer, along with CAD drawings of the final headset design, which actually included space for all the guts. The 3D-printed dummy didn’t fit on my head.
“I have a large head,” I said.
“Wow, yeah you do,” said Curtis.
He thought about it for a moment, and then told me it would probably be fine.
Eventually, an engineer poked his head in to let us know that the demo was ready. Curtis led me through a door to a space in the back of the office. There was a sign warning against taking photographs or videos. Luckily, I was allowed to take a few shots anyway, although most of them didn’t come out due to the dimness of the room.
I stepped through a door, and found myself in an open room with a maze of chest-high walls. I was fitted with a backpack and a helmet, containing a small PC and a DK2 respectively, linked by a thick cable. The helmet appeared to be a regular foam bike helmet, covered in tracking dots, with a DK2 strapped to the front. The backpack was likewise unfinished, and didn’t bear much resemblance to the promotional images of the final product.
The ceiling was covered in hundreds of mocap cameras used to track the helmet, although Curtis asked me to emphasize that this is not the final tracking technology that they’ll be using.
The helmet did end up fitting on my head, barely, but my glasses were a complete no-go, so I took them off. I’m pretty blind without my glasses, so for accuracy’s sake, imagine the rest of this demo as being really, really blurry.
On the floor, someone had stenciled the numbers 1, 2, and 3, about shoulder-width apart. I was told to stand on ‘1’. I heard the sound of keys clicking, and then I was somewhere else. It wasn’t an incredibly exciting somewhere else – just a vague white and blue void. But, after a few seconds, a hole in space opened behind me, with pixelated edges.
Stepping through it took me into an Indiana Jones-inspired temple world. I could hear birds chirping and see the sun shining. I reached out to a nearby wall and touched it, watching a virtual version of my hand mime the motion, courtesy of the Leap Motion. The wall was there.
It felt like cheap foam, but it was there. I moved into a small courtyard with a little stone bench, enclosed by crumbling chest-high walls. A female voice in my ear told me to sit. I checked the bench with my hand first, then sat. It was solid. You can see the actual scene in the image below.
I moved on after some brief scene-setting dialog, venturing into a cavernous temple. At first, I moved cautiously, feeling like I was inching forward into the dark, hands in front of me. But after a few steps, I began to really understand that I could trust the virtual world, and I began moving confidently into the space. I’ve had lucid dreams a few times in my life. This felt a lot like that.
It was about this point where I started giggling.
This is real. This is really, really real.
A few steps into the dark, I came across a torch on the wall, and touched it to see if it was real. It was, and it came away in my hand, casting a light across the scene. It was glitchy when I held it in certain places, but it was amazing to be able to manipulate a virtual object like that. Curtis, who I think was following behind me, asked if I could feel the heat of the flames. I held my hand in the flames of the torch, and it did indeed feel warm, presumably thanks to some heating element in the hardware.
I moved around a few corners and sprinted across a room while the floor collapsed behind me. Before moving on, I ducked back and touched my toe to the missing floor, just to verify that it was still there. Moving on, I stepped out onto a narrow cliff ledge overlooking an open cavern. I could see water and stalactites, although it was tough to see details without my glasses.
I felt a cool, moist breeze on my face, giving a breathtaking sense of space. I crept along the ledge, and came to a rickety-looking wooden bridge. I’ve run across bridges like it a thousand times in a dozen games without even thinking about it. This was different. I found myself moving one careful step at a time, putting my feet down on the sturdiest-looking boards.
A rocky pillar stuck up beside the bridge, with a glowing handprint on it. I laid my hand down on it. Suddenly, the bridge began to rise, shaking the floor and lifting my hand off the pillar. That threw me for a moment. Were they really lifting the floor? I hadn’t seen an elevator. That couldn’t possibly be safe. It took a good several seconds for the mental Necker cube to flip.
Oh! They aren’t moving the floor. They moved the pillar.
When I walked the course later, without the VR gear, I found the pillar (and the “rickety bridge”), and got a better understanding of what had happened. You can see both in the image below.
The elevator brought me up through a shallow cavern. I could see plants and sunlight overhead, and another entrance to the tomb ahead of me. There was a cool breeze coming down from above. I stepped in, and the demo ended. When I lifted the goggles off, I was more than a little surprised to discover that I was standing back where I had started. According to my mental map of the world, I was at least fifty feet away from where I’d started — and twenty feet up.
When I walked the course afterwards, I was amazed by how small it really was: just thirty feet by thirty feet, two rooms and two hallways. In total, it was maybe a quarter or a fifth of the size I would have guessed before I took the headset off. Curtis asked me not to talk in detail about their approach to redirected walking, which they are seeking to patent. Suffice it to say that it’s clever, it has its own limitations, and it works impressively well.
Here’s a demo of a more traditional redirected walking technology. This was tested in a basketball court, to give you an idea of the standard approach. In short, redirected walking works by taking your path through the world, and warping it so that what seems like a straight path is really a circle, allowing you to explore spaces that seem much bigger than they are.
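One simple, well-known form of the trick is a rotation gain, where the virtual camera turns slightly faster than your real head. Here’s a toy sketch of that idea — my own illustration, with invented names and an exaggerated gain, not the Void’s patented method:

```python
# Toy sketch of rotation-gain redirected walking. This is NOT the
# Void's patented approach (which they keep private); the gain value
# and function name here are my own invention.
ROTATION_GAIN = 1.2  # hypothetical; real systems keep gains subtle

def redirected_yaw(real_yaw_degrees: float) -> float:
    """Virtual heading produced by a real head rotation."""
    return real_yaw_degrees * ROTATION_GAIN

# A 90-degree real turn reads as a 108-degree virtual turn, so walking
# what feels like a straight line actually bends you along an arc,
# keeping you inside the physical room.
print(redirected_yaw(90.0))
```

Keep the gain subtle enough and your vestibular system never notices the mismatch — which is exactly why a thirty-foot room can read as a sprawling temple.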
The next demo was a fairly generic sci-fi adventure, routing me through narrow corridors where I shot at spiders with a tracked gun, released an alien from a capsule, and played with its corpse by poking it with my gun.
This demo was a lot rougher around the edges, with the frame rate dropping precipitously in the alien’s chamber, and generally felt a lot less immersive. The most interesting thing about it is that Curtis went through it with me. Within the HMD, I saw him as a bulky, generic space marine. The implementation was not great — his feet slid wildly around on the ground, and his arms were only tracked in relation to the gun. However, the overall effect was still pretty neat.
Afterwards, I sat down with Curtis to discuss the technology behind the Void. After the shock and awe wore off, I had a lot of technical quibbles with the system.
- The tracking had significantly higher latency than the latest Rift prototype, or even the DK2.
- The tracking on props dropped out a lot, and the positional tracking had a number of periodic little hitches.
- The calibration on several walls was also off, allowing me to poke my hand several inches inside them.
- There were also a couple of frame rate drops, producing judder – something that should really never happen in well-designed VR content running on known hardware.
- Lastly, the Leap Motion hand tracking was iffy right off the bat, and broke down instantly when I picked up the torch – I suspect due to IR interference.
Curtis explained that most of this was due to the unfinished nature of the hardware. The numerous mocap cameras have a bunch of issues with calibration — even heat fluctuations over the course of the day will throw them off.
The Void is developing its own tracking technology, based on precisely timed radio pulses from many transmitters, which they claim will neatly solve most of the calibration and latency issues. This technology is similar to how the Kinect tracks your body — except using specific tracking points to increase accuracy, and using radio instead of IR light, to allow the signals to pass through walls.
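As a rough analogy for how timed pulses can yield a position (my own sketch, not the Void’s implementation): each base station’s pulse travel time gives a distance, and intersecting those distances pins down the tracker, much like GPS. In 2D, with three hypothetical station positions:

```python
import math

# Rough sketch of positioning from pulse travel times (my own
# illustration, not the Void's actual system): each station's timing
# gives a distance, and three distances pin down a 2D position.
C = 299_792_458.0  # speed of light, m/s

stations = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]  # hypothetical layout

def locate(times):
    """Solve for (x, y) given pulse travel times from each station."""
    d = [t * C for t in times]
    (x1, y1), (x2, y2), (x3, y3) = stations
    # Subtracting the circle equations pairwise linearizes the problem.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d[0]**2 - d[1]**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d[0]**2 - d[2]**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Simulate a tracker at (3, 4): compute true travel times, recover it.
target = (3.0, 4.0)
times = [math.dist(target, s) / C for s in stations]
print(locate(times))  # recovers approximately (3.0, 4.0)
```

The hard part, of course, isn’t the geometry — it’s measuring those travel times precisely enough, with multipath reflections and clock drift in the way.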
While I was there, I had the chance to meet “Krenzo,” a poster on MTBS3D who was hired by the Void to develop the RF tracking technology. He was working on an entire table full of circuit boards and wires, set across from three tripods with large boards on them.
The tripods were the base stations, and the table covered in circuitry was the tracker. He pointed me to a little box, about the size of a matchbox, covered with ports and a small air intake — an obsolete prototype tracker. He told me that over the next couple of weeks, he would turn the table full of electronics into one of those little boxes. Eventually, they hope to have custom microchips made, further reducing the size to something truly tiny. For now, the matchbox will do for head tracking.
When I asked about range and accuracy, he told me that the chips are accurate down to about a third of a millimeter, and have a range of about fifteen feet. Then he stopped and corrected himself, saying that fifteen feet is a limit set by the FCC to avoid interfering with nearby wireless devices. I asked, half-joking, if they’d considered building a Faraday cage around the facility, and he said that it was a possibility under discussion.
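To put that accuracy claim in perspective (my arithmetic, not theirs): if position comes purely from pulse timing, resolving a third of a millimeter means resolving roughly a picosecond of radio travel time.

```python
# Back-of-envelope check on the claimed accuracy (my own arithmetic,
# not the Void's numbers): how finely must the hardware resolve time
# to distinguish positions 0.3 mm apart via pure time-of-flight?
C = 299_792_458.0    # speed of light, m/s
accuracy_m = 0.3e-3  # claimed third-of-a-millimeter accuracy

timing_resolution_s = accuracy_m / C
print(timing_resolution_s)  # roughly 1e-12 s, about one picosecond
```

That’s an extraordinarily tight timing budget, which hints at why this is a custom-silicon problem rather than an off-the-shelf one.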
Later, Curtis told me that they hope to have a prototype of the radio tracking system working by the end of next month, making me wish I’d delayed my trip a little to see it.
I also asked about the Rapture HMD, and the PC that drives it, and was told that the Rapture will use two curved OLED screens. Curtis told me that they were very lucky to have access to the technology, due to the extremely small number of companies that make them. He also told me that the helmet, which is made of both carbon fiber and a magnesium alloy, is designed to distribute weight more evenly (to leave the center of mass unchanged), and to survive being dropped.
This is important, because it’s an expensive piece of electronics that will be handled by dozens or hundreds of people a day. If it can be broken, someone will break it.
The PC that drives the rig is a “backtop” – a backpack-mounted computer. It’s about the size of an old Walkman tape recorder, weighs three pounds, and has two fans on its surface, reflecting the fact that it was originally developed to run two GPUs in VR SLI.
This was deemed too complicated, so for the time being, the hardware is driven by a single GTX 980M and an i7. Curtis told me that they were in talks with a GPU manufacturer – “one of the big ones” – that was close to announcing a new, substantially more powerful mobile GPU, which would be capable of driving these sort of high-end VR experiences in real time.
The batteries are obviously a problem, and Curtis acknowledged as much. They limit the length of the experiences to half an hour, and currently weigh around three pounds — a substantial reduction from the batteries they were using before. All told, the battery, CPU, and haptic vest should weigh about six and a half pounds.
The prototype I used was hefty enough to be noticeable, but wasn’t particularly uncomfortable. I was able to move freely, and even kneel on the ground without disrupting it.
The Business Model
The company is looking to start commercial operations towards the end of summer next year. However, as Curtis says, “there’s a lot of question marks.” In other words, there are a lot of details, both about the technology and the practical business realities, that remain unsettled. Nobody seemed incredibly confident about that date, which is fair. This is a big business to launch, and it depends on inventing a lot of technologies from the ground up.
However, the team did seem confident that they could succeed. Curtis bragged a bit that one of the things separating them from other VR startups is that they have a well-defined business model that can be profitable from day one, rather than just diving into a futuristic technology without a clear idea of how to actually make money from it.
If nothing else, I’m convinced that the Void is offering a genuinely valuable experience, even to those who already own a high-end VR headset. There are experiences you can have in a warehouse that are just not possible in a living room, and I think people will be willing to pay for those experiences. The Void has a long way to go to solve its technical challenges, but the potential is overwhelming.
How about you? Are you interested in this kind of experience? How much would you pay for a half-hour experience? Let me know your thoughts in the comments!