BinaryVR will be producing an SDK that does limited facial tracking based on chin, mouth, and cheek motions.
This seems like a pretty good idea. The demonstrations I saw of this while it was still at USC were promising. However, there do seem to be certain limitations that make it impractical for iMyth. The first of these is that it is only for the Oculus, and the second is that the SDK is only available for Unity. If I had unlimited time I would definitely look into this now. However, I have other pressing priorities.
I don’t normally write about improvements in technology. I would rather focus on story and experience related topics. However, this is groundbreaking and will have a significant impact on the HTC Vive.
A Chinese company called TPCast has created an add-on device for the HTC Vive that makes it wireless. While some folks may balk at the $220 price tag on top of the $800 they already plunked down for the Vive, this may just be one of the key components needed to help get the VR Cade concept off the ground. This is especially true for iMyth.
One of the challenges for iMyth is that we have created a 20’x20′ play space area. Regretfully, this is larger than the actual headset cable can reach. We were planning to look into extending the cable. However, for the new year, this wireless solution will be much better. Since an extended cable would have been very expensive anyway, this wireless alternative will be a welcome addition to the iMyth arsenal.
Folks may not want to pay extra to have this wireless luxury at home. However, peripherals and accessories such as this may be just the hook to entice people to get out of their homes and have an iMyth experience!
All in all, the Polygon folks seemed very excited. The Void guys described their plans for world domination. This is very exciting. Is iMyth a wannabe? Sure! However, I have a hunch this is going to be big business. We may not be the first pioneers, but we can create great experiences!
I have not gone through all of the sub-links and articles, but there is a lot of talk of combining games with traditional media and not a whole lot of substance.
It looks like the cat is out of the bag. One of these VR web channels had an opportunity to talk with Charlie Hughes at UCF.
In the article they constantly refer to the use of interactors. More specifically, the article addresses the use of “smart” puppets for controlling multiple interactors at the same time. This is very similar to what we have proposed.
We are standing at the precipice of explorations into VR.
I was originally hoping to be able to exploit the giant MOCAP stage at FIEA. However, one of the most important elements, getting the tracking data into the game, has yet to be solved. I was hoping to use Vicon Pegasus to do the lion’s share of the work. However, a week has gone by and I still can’t get the UE4 plugin to work correctly. This has many implications.
The first of these implications is that we can’t use the tracking on the mocap stage. This would mean we would need to use the Vive’s own tracking capabilities. There are some unknowns about this. Out of the box, how large is the range? There is a video where the folks at Stress Level Zero experiment with roughly ten yards between the lighthouses:
This might work for the iMyth Prototype. However, we would not be able to use props and sets.
There is also another thought that the lighthouses can be hooked up to create a matrix of light emitters. This video interview with the lighthouse creator, Alan Yates, hints at some of the unlimited opportunity.
Based on this video, in concept, there could be an infinite number of lighthouses providing an infinite amount of coverage. Once again we have the same problem of not being able to use props. However, the video also hints that sensors can be placed on objects, and those sensors could report their own positions. This would take some “hacking” to figure out. However, I imagine some kind of prop/set system could be created.
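To make the prop/set idea concrete, here is a minimal sketch in Python of how tracked sensor IDs might be mapped to in-world props. Everything here is hypothetical: the tracker IDs, prop names, and update feed are invented for illustration, and this does not reflect any actual Lighthouse sensor API.

```python
# Hypothetical sketch: mapping tracked sensor IDs to physical props.
# Tracker IDs and prop names are invented; in a real system, the
# position updates would come from the tracking hardware's feed.

class PropRegistry:
    def __init__(self):
        self._props = {}  # tracker_id -> (prop_name, last_known_position)

    def register(self, tracker_id, prop_name):
        """Associate a physical sensor with a named prop."""
        self._props[tracker_id] = (prop_name, None)

    def update(self, tracker_id, position):
        """Record the latest reported position for a sensor."""
        if tracker_id in self._props:
            name, _ = self._props[tracker_id]
            self._props[tracker_id] = (name, position)

    def position_of(self, prop_name):
        """Look up where a prop was last seen, or None if never tracked."""
        for name, pos in self._props.values():
            if name == prop_name:
                return pos
        return None

registry = PropRegistry()
registry.register("sensor-01", "treasure_chest")
registry.update("sensor-01", (2.5, 0.0, 4.1))
print(registry.position_of("treasure_chest"))  # (2.5, 0.0, 4.1)
```

The game side would then query the registry each frame to keep virtual set pieces aligned with their physical counterparts.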
There is one final possibility which I know nothing about: we might be able to use the Vicon cameras as lighthouse surrogates. That would mean we already have an array of lighthouses in place. Once again, this implies that props and sets could not be tracked.
Right now, I’m thinking the best solution will need to be a home-brewed pipeline from Blade to UE4. We would need to abandon Pegasus and write our own solution. Could this be done? Vicon thinks it can. We now need to find an engineer to do the work for us.
Part of my research, and definitely a focus of iMyth, is to be a generator of immersive theme world experiences. These experiences are not games but ongoing interactions between the participant and the current theme world. The theme world evolves and adjusts to ensure a rewarding and enriching experience for the participant.
With that thought in mind, enter the Tosca AI engine created by a Canadian company, Evodant. I was turned on to them through the Gamasutra article, “How one studio is building game AI to replicate a human storyteller.” The article explains the goals and ideals of Evodant as they create the Tosca AI engine and their new game, Gyre:Maelstrom, to show it off. Ultimately, Tosca listens to the participant and watches their activities. From these behaviors, the engine creates subtext to influence the participant’s experience, uniquely tailored for that individual. This is just the sort of tool I believe iMyth is working towards. Combined with grammar-based universes, technology such as this will help shape the experiences of tomorrow.
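To illustrate the general idea of behavior-driven storytelling, here is a toy “drama manager” that tallies observed participant actions and picks the next story beat to match the dominant behavior. The behavior names, thresholds, and beats are all invented; this is only in the spirit of what the article describes and does not reflect how the Tosca engine actually works.

```python
# Illustrative sketch only: a toy drama manager that watches simple
# participant behaviors and nudges the experience accordingly.
# All behavior labels and beats are invented for this example.
from collections import Counter

BEATS = {
    "explore": "reveal a hidden passage to reward curiosity",
    "combat": "escalate tension with an ambush",
    "social": "introduce a talkative companion character",
}

def choose_next_beat(observed_actions):
    """Pick the beat matching the participant's most frequent behavior."""
    counts = Counter(observed_actions)
    dominant, _ = counts.most_common(1)[0]
    return BEATS.get(dominant, "continue the current scene")

actions = ["explore", "explore", "combat", "explore"]
print(choose_next_beat(actions))  # reveal a hidden passage to reward curiosity
```

A real engine would of course reason over far richer signals than action counts, but the core loop is the same: observe, infer intent, and steer the world in response.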