Last month, in December, iMyth demonstrated full-body tracking for its interactor using the Perception Neuron tracking suit. It was an impressive bit of engineering to get the system up and running and coordinated with the Vive and UE4. For the prototype this tech strategy worked well: the participant understood the interactor, and there were never any strong concerns about problems with the tech.
Regretfully, there were many problems:
- The Perception Neuron was "sensitive." Some days it wanted to come out and play; other days we could not get it to talk no matter what we did.
- IMU drift. Because the Neuron suit relied solely on IMU trackers, it never really knew its world-space location, and the interactor kept drifting away from where it actually stood.
- Glitching out. Occasionally the character would freeze up, jitter, or go a little crazy. For the prototype this was fine, but we need a more consistent solution.
- Finger tracking. Did the Perception Neuron actually track the interactor's fingers during the experience? Regretfully, we never tested this, and I suspect it did not work.
- The Perception Neuron suit was actually owned by one of our Technical Producers, Andrew. We are very grateful Andrew allowed iMyth to use it; regretfully, the suit left when he graduated.
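The IMU-drift problem above comes down to basic kinematics: position is recovered by double-integrating acceleration, so even a tiny sensor bias compounds into a large position error. A minimal sketch of that effect (the bias value and time step here are illustrative assumptions, not Neuron specs):

```python
def drift_from_bias(bias_mps2, dt, seconds):
    """Double-integrate a constant accelerometer bias into position error."""
    velocity = 0.0
    position = 0.0
    for _ in range(int(seconds / dt)):
        velocity += bias_mps2 * dt   # first integration: bias -> velocity error
        position += velocity * dt    # second integration: velocity -> position error
    return position

# A mere 0.05 m/s^2 bias, integrated for one minute at 100 Hz,
# puts the tracked body roughly 90 meters from where it started.
error = drift_from_bias(bias_mps2=0.05, dt=0.01, seconds=60.0)
print(f"Position error after 60 s: {error:.1f} m")
```

This is why IMU-only suits need periodic re-centering, and why an externally referenced system like Lighthouse, which measures absolute position, avoids the problem entirely.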
In mid-December, I also journeyed to Seattle for Vive Lighthouse training. Lighthouse is the tracking technology Valve uses to track all of the objects in the environment. With this technology we could track an almost unlimited number of objects and build our own Vive controllers. I immediately started thinking about how we could use it to easily track the interactor and participant.
It seems a couple of other companies were thinking about the same thing. Steve Bowler at CloudGate experimented with strapping Vive controllers to his feet, and he was able to get some pretty good results.
Steve's solution was a great start. However, it was a bit problematic, since it required an entirely separate Vive setup just to supply the extra controllers. He also did not have any waist tracking (important for capturing overall body movement). And controllers strapped to your feet are a bit awkward.
Now it seems the folks at IKinema have taken things one step further and created their own Vive tracker controllers, mounted on the feet and waist.
Their demonstration looks really sharp, and they claim there is no post-processing. I believe them. However, this is still not quite the iMyth solution. Project Orion may take some time to become commercially available, and their setup requires two hand-held controllers, one in each hand. For iMyth I am seeking a hands-free solution, so we would need tracking devices on the wrists and, eventually, some IMU-based finger tracking. IKinema is also a motion-capture company, so I would not be surprised if adopting their technology also meant purchasing a suite of retargeting tools, which would be overkill for what we want. I don't know. I'd like to get in contact with the IKinema folks and investigate their release plans.
So this leads us back to iMyth creating its own tracking solution. I just received an email from Reid at Triad Semiconductor saying they got the go-ahead to produce the boards needed to build custom controllers. Those should be available in eight weeks. This is exciting: iMyth will be able to track not only the interactor and participant but multiple other participants, set pieces, and props as well. Very exciting times!