I figured I had better document iMyth progress as things move along. I suppose I have been lax in doing so:
4/25 – My Vive arrives via FedEx. No one is home to receive it :(.
4/26 – At the end of the work day, I high-tail it to the FedEx depot and pick up my Vive.
Yeah!
4/27 – After spending the bulk of the day with class and students, I start focusing on installing my Vive. It does not work. I put together an impassioned message to Vive Support explaining my dilemma.
4/28 – Muchos meetings all morning. After getting back from the meetings, I install the Vive on the Galley PC. It works! The Vive experience is pretty awesome!
4/29 – I try to apply the feedback I received from Vive Support. To make a long story short, my computer is not VR compatible. After all this time I thought it was :(. Boy, do I look silly. I guess that sort of explains why the Oculus refused to work on my machine as well. Bryant, the fellow from CAH, says my iMyth machine will be arriving from Alienware today. Gotta keep my fingers crossed.
We are standing at the precipice of explorations into VR.
I was originally hoping to exploit the giant MOCAP stage at FIEA. However, one of the most important elements, getting the tracking data into the game, has yet to be solved. I was hoping to use Vicon Pegasus to do the lion’s share of the work. However, a week has gone by and I still can’t get the UE4 plugin to work correctly. This has many implications.
The first of these implications is that we can’t use the tracking on the mocap stage. We would instead need to rely on the Vive’s tracking capabilities. There are some unknowns about this. Out of the box, how large is the range? There is a video where the folks at Stress Level Zero experiment with roughly ten yards of separation between the lighthouses:
This might work for the iMyth Prototype. However, we would not be able to use props and sets.
There is also the thought that multiple lighthouses could be linked together to create a matrix of light emitters. This video interview with the lighthouse creator, Alan Yates, hints at some of the possibilities.
According to the video, in concept there could be an infinite number of lighthouses providing an infinite amount of coverage. Once again, we have the same problem of not being able to use props. However, the video also hints that sensors can be placed on objects so the sensors could report their own positions. This would take some “hacking” to figure out, but I imagine some kind of prop/set system could be created.
There is one final possibility, which I know nothing about: we could use the Vicon cameras as lighthouse surrogates. That would mean we already have an array of lighthouses in place. Once again, this implies that props and sets could not be tracked.
Right now, I’m thinking the best solution will be a home-brewed pipeline from Blade to UE4. We would need to abandon Pegasus and write our own solution. Could this be done? Vicon thinks it can. We would now need to find an engineer to do the work for us.
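To make the idea concrete, here is a minimal sketch of what the receiving half of such a bridge might look like. Everything here is an assumption on my part, not anything Vicon or Epic actually specifies: I invented the packet layout (a null-terminated subject name followed by three little-endian floats for position in meters), the port number, and the function names purely for illustration. The UE4 side would consume these samples and drive a pawn or camera transform.

```python
import socket
import struct

def parse_sample(packet: bytes):
    """Parse one hypothetical tracking sample: a null-terminated
    subject name followed by three little-endian floats (x, y, z)."""
    name, _, payload = packet.partition(b"\x00")
    x, y, z = struct.unpack("<3f", payload[:12])
    return name.decode("ascii"), (x, y, z)

def listen(host="0.0.0.0", port=9870):
    """Receive tracking samples over UDP and print them.
    In a real pipeline, this loop would hand each sample to the
    game engine instead of printing it."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        packet, _ = sock.recvfrom(64)
        name, pos = parse_sample(packet)
        print(name, pos)
```

UDP is the natural choice here because a tracking stream is latency-sensitive and loss-tolerant: a dropped sample is immediately superseded by the next one, so there is no point in retransmitting stale positions.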
Hoo wee, I’m really excited about this post. This is all about Disney getting into the Escape Game business.
This opportunity does give further confirmation that iMyth is on the right track.
While iMyth is not chasing after the escape game market per se, we are pursuing the immersive theme world market which seems to be the evolutionary next step. Escape games 2.0 and beyond!
Disney has:
Immersive Physical experiences
Collaboration with multiple participants and groovy interactors
The only thing they don’t have is variability and randomness. According to their description, “Although this particular event was themed to the idea of ‘preserving time,’ The Escape Challenge can be completely customized and tailored to fit any group’s event theme, message or objective. The specially constructed set is fully mobile and transportable, meaning it can be built and installed in function space available onsite a Disney convention resort or theme park event venue.” This may be an indication that Disney is starting to work customization and variability into the experience as well. Whether or not they are setting the stage for emergent narrative remains to be seen.
Part of my research, and definitely a focus of iMyth, is to be a generator of immersive theme world experiences. These experiences are not games but ongoing interactions between the participant and the current theme world. The theme world evolves and adjusts to ensure a rewarding and enriching experience for the participant.
With that thought in mind, enter the Tosca AI engine created by a Canadian company, Evodant. I was turned on to them through the Gamasutra article, “How one studio is building game AI to replicate a human storyteller.” The article goes on to explain the goals and ideals of Evodant as they create the Tosca AI engine and their new game, Gyre: Maelstrom, to show it off. Ultimately, Tosca listens to the participant and watches their activities. From these behaviors, the engine creates subtext to influence the participant’s experience, uniquely tailored to that individual. This is just the sort of tool I believe iMyth is working towards. Combined with Grammar-based Universes, technology such as this will help shape the experiences of tomorrow.