The vast majority of the interview focuses on forward-looking predictions for brain-computer interface (BCI) technology, hardware that interfaces directly with your brain signals to detect emotional responses, feelings, and more.
“We’re working on an open source project so that everybody can have high-resolution [brain signal] read technologies built into headsets, in a bunch of different modalities,” Newell said. “If you’re a software developer in 2022 who doesn’t have one of these in your test lab, you’re making a silly mistake…software developers for interactive experience[s] — you’ll be absolutely using one of these modified VR head straps to be doing that routinely — simply because there’s too much useful data.”
Newell discusses two specific benefits VR stands to gain from this. For starters, BCIs could significantly enhance immersion, for example by dynamically increasing difficulty when a player is getting bored or feeling unchallenged, or, in a procedural game, by detecting whether you dislike or particularly enjoy a given randomized layout.
Newell then goes on to explain that in the future, BCIs will enable the creation of virtual worlds that far exceed our perceptions of reality, stating that, “the real world will stop being the metric that we apply to the best possible visual fidelity.”
Near the end there is also a discussion of how BCIs in VR could essentially solve VR motion sickness, that feeling of vertigo that leaves some users nauseated during particular types of artificial movement. The feeling can already be suppressed artificially. "It's more of a certification issue than it is a scientific issue," explains Newell.
Perhaps this is why Valve has been so quiet about its plans post-Valve Index — the company may be hard at work on the next generation of interacting with computers.