Stony Brook University, Nvidia, and Adobe are presenting at SIGGRAPH 2018 with their paper on infinite walking using Dynamic Saccadic Redirection. This is a really neat take on the age-old problem of redirected walking. The last “great” solution I encountered was presented at VRDC 17 by Mahdi Azmandian of the Mixed Reality Lab at the USC Institute for Creative Technologies. Regrettably, that approach still required roughly a 30′×30′ physical area to work. That is about the room-scale size offered by the new SteamVR Lighthouse tracking 2.0, although I have not had an opportunity to play with that tech yet.
The researchers at Stony Brook use an eye tracker embedded within the HMD to track saccadic eye movements. During a saccade, the environment can evidently be rotated incrementally to keep the participant within a confined physical space without them noticing the effect or becoming nauseated. I have no idea about the details of this project; the folks at Stony Brook are being hush-hush until SIGGRAPH. I suppose we’ll just have to wait until August to understand what this new technology entails.
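In the meantime, here is a minimal sketch of how saccade-triggered redirection could work in general terms: detect a saccade when gaze velocity spikes past a threshold, then inject a tiny, capped yaw rotation that steers the user back toward the center of the physical room. Every name, threshold, and steering rule below is my own assumption for illustration, not the actual method from the paper.

```python
import math

# Assumed values for illustration only; the paper's real thresholds are unknown.
SACCADE_VELOCITY_DEG_S = 180.0       # eye angular speed treated as a saccade
MAX_ROTATION_PER_SACCADE_DEG = 0.5   # small yaw injected while vision is suppressed


def detect_saccade(prev_gaze_deg, curr_gaze_deg, dt):
    """Return True if the gaze moved fast enough to count as a saccade."""
    velocity = abs(curr_gaze_deg - prev_gaze_deg) / dt
    return velocity >= SACCADE_VELOCITY_DEG_S


def redirect_during_saccade(world_yaw_deg, user_pos, room_center):
    """Nudge the virtual world's yaw so the user's future path bends back
    toward the center of the physical room. Purely illustrative steering."""
    dx = room_center[0] - user_pos[0]
    dz = room_center[1] - user_pos[1]
    desired_heading = math.degrees(math.atan2(dx, dz))
    # Rotate a tiny amount toward the desired heading, never more than the cap.
    error = (desired_heading - world_yaw_deg + 180.0) % 360.0 - 180.0
    step = max(-MAX_ROTATION_PER_SACCADE_DEG,
               min(MAX_ROTATION_PER_SACCADE_DEG, error))
    return world_yaw_deg + step


# Toy per-frame loop with made-up gaze samples (degrees) at 90 Hz.
if __name__ == "__main__":
    world_yaw = 0.0
    prev_gaze = 0.0
    for frame, gaze in enumerate([0.0, 0.2, 6.0, 12.0, 12.1]):
        if frame and detect_saccade(prev_gaze, gaze, dt=1 / 90):
            world_yaw = redirect_during_saccade(world_yaw,
                                                user_pos=(1.0, 2.0),
                                                room_center=(0.0, 0.0))
        prev_gaze = gaze
    print(f"injected world yaw after a few saccades: {world_yaw:.2f} deg")
```

The point of the sketch is simply that each individual rotation is tiny and applied only while the eye is mid-saccade, so over many saccades the system can accumulate enough redirection to keep the walker inside the tracked space without a perceptible jump.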