Tag: iMyth

  • Disney Contributing to Location-Based Experiences

    I like to stay abreast of technological developments that advance not only location-based entertainment but interactive storytelling as well. This new announcement from Disney sounds interesting. Disney Research just filed a patent for a new head-mounted display (HMD) and a gizmo referred to as an “Air Flow Generator”. I have no information about the HMD, but the air-flow generator sounds interesting.

    This air-flow generator evidently produces gusts of directed air which can be used to simulate the haptic sensation of a moving virtual object, such as a sword swinging or an animal passing by. In addition, the generator can scent the generated gusts to simulate particular smells, such as soil or flowers.

    For sure there are many other air-field generators (fans) which can be actuated by trigger events within an experience. However, these have always produced very “low frequency, high amplitude” sensations, for lack of better terms. This generator sounds very localized and directed. There have also been other manufacturers of HMD attachments promising custom-generated scents. This gizmo, as an external generator, promotes more of a collaborative, shared experience. The Void produces similar sensations in their experiences, though I am unfamiliar with their technology.

    Based on a conversation I had with an Imagineer many years ago, I think Disney is onto a product which can really contribute to large-scale experiences. He told me exactly how a gizmo like this could be implemented. Maybe he decided to have Disney Research actually implement it?

    If this product is what I think it is, then I believe it could make a significant contribution to the location-based entertainment market. An apparatus such as this could not be marketed for home use. If location-based immersive experiences are to be bigger, bolder and more fantastic than home-based experiences, then this technology could help widen the gap between the two. Of course, there is a huge dependency on how reconfigurable this device is. However, if it is fully reconfigurable and “dynamic”, then it will contribute to an experience that is physical, collaborative and highly memorable (always unique) – just what the iMyth team ordered.

  • No Man’s Sky to fulfill my prophecy?

    Quite a few years ago, I was very excited about a project called No Man’s Sky, which I wrote about in “No Man’s Sky” and “No Man’s Sky Grammar Based Universe.” For me this represented the very first wide-open grammar-based universe. You may be asking, “What is a grammar-based universe?” If you follow the prior link you’ll be inundated by a long-winded answer. For those with shorter attention spans, a grammar-based universe is a fancy way of describing a procedurally-driven world. There is a lot of confusion between the terms “procedural” and “automated,” and there are some very big differences. Procedural means the implementation of a set of rules and expressions which complement an artist’s skills far beyond what she could create unaided. The important factor here is that without human input, there can be no procedurally created worlds. Automation is allowing the computer to create without human assistance. Without going into too much theoretical detail, automation cannot efficiently create realistic CGI worlds. Only when the processes are procedural can life be breathed into CGI creations.
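
    A toy example may clarify the distinction. The following sketch (entirely hypothetical rules, not No Man’s Sky’s actual grammar) shows the human-input half: the artist authors the rules, and the computer merely expands them from a seed, so the same seed always reproduces the same world.

```python
import random

# Hypothetical, artist-authored grammar rules. The rules are the human
# input; the computer only applies them.
RULES = {
    "WORLD": [["BIOME", "BIOME"], ["BIOME", "BIOME", "BIOME"]],
    "BIOME": [["lush", "FAUNA"], ["barren"], ["frozen", "FAUNA"]],
    "FAUNA": [["herbivores"], ["predators"], []],
}

def expand(symbol, rng):
    """Recursively expand a grammar symbol into a list of terminal strings."""
    if symbol not in RULES:            # terminal symbol: emit as-is
        return [symbol]
    production = rng.choice(RULES[symbol])
    expanded = []
    for part in production:
        expanded.extend(expand(part, rng))
    return expanded

def generate_world(seed):
    """The same seed always yields the same world: procedural, not random."""
    return expand("WORLD", random.Random(seed))

print(generate_world(42))
print(generate_world(42) == generate_world(42))  # True: fully reproducible
```

    Automation, by contrast, would mean the computer inventing the `RULES` table itself; here every production is human-authored.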

    No Man’s Sky represented the very first grammar-based universe: an entire alternative reality created by artists with the aid of a set (a very complicated set) of rules and expressions. Regretfully, when the game came out, the gamers were not thrilled. While the universe itself was impressive, the gameplay was not. I did not purchase the game myself so I cannot judge for myself. However, No Man’s Sky appears to be rising from the ashes and being re-released as a VR game. Not only has the game received a googol of improvements, it is now multi-player. This is even more exciting: not only is No Man’s Sky the first grammar-based universe, it is also the first grammar-based immersive experience theme world. While I am assuming it still maintains its procedurally driven origins, it now maintains the three fundamental rules of being an immersive experience theme world. A theme world is any collection of all possible human experiences related to a similar concept yet bound rigidly by one central story, theme or canon; the theme world heart.

    No Man’s Sky Is a Theme World. The entire experience is based on the spirit of exploration and discovery of the entire universe.

    No Man’s Sky Is Physical. Being shifted to VR projects the third-person, over-the-shoulder experience into the first person. Equipped with CGI hands, the participant is now empowered to explore the universe from a first-person perspective.

    No Man’s Sky Is Collaborative. The gameplay is now multi-player. Participants may now explore and “do things” in the universe either by themselves or with their buddies.

    No Man’s Sky Is Unrepeatable. The entire universe is open for exploration, and no two worlds need be the same. I am not sure of the exact number, but I believe there is not enough time in the history of the universe to explore every one of the worlds No Man’s Sky creates.
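
    For scale: the commonly reported planet count for No Man’s Sky is 2^64. Taking that figure as given, a quick back-of-the-envelope check bears the claim out.

```python
# The widely reported planet count for No Man's Sky is 2**64; treating that
# figure as given, check how long a full survey would take.
planets = 2 ** 64                          # ~1.8e19 generated worlds
seconds_per_year = 365.25 * 24 * 3600
universe_age_years = 13.8e9                # approximate age of the real universe

survey_years = planets / seconds_per_year  # visiting one planet per second
print(f"{survey_years:.2e} years at one planet per second")
print(f"{survey_years / universe_age_years:.0f}x the age of the universe")
```

    Even at one planet per second, a full survey would take roughly forty times the age of the universe.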

    No Man’s Sky is an at-home, non-location-based experience. However, I am very excited for the game to achieve popular status. I want the audience to develop a taste for massive world discovery. The exploration of an open world with occurring events is the spirit of iMyth, and I believe this type of experiential media will gain in popularity.

    As this type of media continues to grow in popularity, iMyth will continue to generate new grammar-based universes set in different theme worlds; some familiar and some new. With the iMyth arena, the exploration of these theme worlds can happen not only from home but in location-based facilities, where participants can enjoy the fullest immersive sensory experience possible.

  • Boom Town: The Ultimate Immersive Experience?

    I recently came upon this Facebook article from the UK about what they claim is the world’s most immersive experience: Boom Town.

    https://www.facebook.com/BuzzFeedUK/videos/2330089977077212/

    Before this posting I had never seen or heard of Boom Town. It sounds a bit like an English variation of Burning Man mixed with a music festival. Music festivals are wonderful immersive experiences. I love how Boom Town is a theme world integrated with the festival. I wonder if the theme world changes each year they put this festival together.

    As far as iMyth is concerned, I think it would be awesome to empower participants to step into an immersive theme world such as Boom Town. With the magic of the internet, participants may choose how they interact with the theme world: actively, semi-actively or passively.

    The most dramatic method would be active. Participants would be required to go to an immersive theme world arena, such as iMyth, put on the immersive equipment and jump into the fun. They would be able to physically act, react and interact with all of the sensory stimuli the theme world has to offer. Folks who participate in this method are the most adventurous and crave the most interactive of all experiences.

    Semi-active participants could join in the fun from their mobile devices or home computers. They would be able to experience the theme world through personal HMDs such as the Oculus, Vive, or even Magic Leap. The important aspect is that these participants would not be required to attend an immersive theme world arena. However, they would not be able to physically participate; they would contribute to the experience as interactors, or non-physical participants. Folks who participate at this level may not be able to make it to an iMyth arena, or they may simply wish not to interact as deeply; metaphorically wading into the water instead of diving in.

    The third option is to participate passively. Restricted once again to a mobile device or home computer, the participant would not be required to employ a visualization device (HMD) but could view the activities of the theme world from their phone screen or monitor. The view options of this passive perspective are unlimited. However, the amount of interactivity and immersion is also the least. This perspective is for folks who wish to watch from the sidelines and sample the experience before venturing in deeper.

    Festivals such as Boom Town are an inspiring vision of where immersive theme worlds can go. Since iMyth experiences can exist on all three levels of interactivity and immersion, they might actually be able to become something even larger. This of course will need to be explored further once immersive theme worlds start gaining traction.

  • Physical Experience

    Wow! It’s been a long time since I’ve contributed to the iMyth blog site. It’s time to fire up the furnace and get cooking again! This time it’s not about reporting on another article or another development but about supporting a concept that has driven iMyth from the very beginning.

    iMyth is built on three tenets: physical, social and unrepeatable. I’m writing this article to support the first and third tenets, physicality and non-repeatability. The support comes from an article written by Harry Baker for UploadVR, Two-Bit Circus Maze.

    In Harry’s article, he talks about the maze created by Treyarch/Ubisoft for the indoor amusement park of the future, Two Bit Circus. While this maze is nothing new or revolutionary, I am really drawn to two aspects of the article. The first is that the maze was created with two facades: a mine maze filled with skeletons and minotaurs, and a second filled with iconic Rabbids. The experience is basically the same, except it can be a little more tame or less frightening depending on the demeanor of the participant. iMyth has always supported an experience that changes dynamically to conform to the individual and provide them with a “rewarding” immersive experience. This will continue to be something iMyth focuses on. It is confirming to know that a funded immersive experience, Two Bit Circus, also stands for the same concept.

    The second aspect, physicality, seems like an unintentional element at Two Bit but really seemed to capture the attention of the article’s writer, Harry Baker. To quote his article: “For me, I find VR experiences that intersect with physical space and location really interesting. When I’m playing VR, immersion is everything for me. The more immersed I am, the more I enjoy the experience. To be able to walk through a physical space and feel the walls, the wind and feel like I’m in an elevator made the experience notably better. Had I completed the maze in an open-plan room with no walls or physical alterations, it just wouldn’t have been the same.”

    I see this inclusion as further confirmation that physicality is and will continue to be a significant contributor to the overall quality of an immersive experience. I myself would not call the sensation “immersion,” however, but rather “presence.” Presence, as defined by researcher Mel Slater, is the willful suspension of disbelief in the presence of an understood pretend situation. Presence will continue to be an iMyth objective, and physicality will continue to be one of its primary focuses.

  • “Chained” Keeps the iMyth dream Alive!

    It’s been over two years since iMyth performed its last presentation of “The Courier,” and I had not seen a location-based theme world experience like it since, until now. The folks at Madison Wells Media have created an amazing location-based, mixed-reality interpretation of Charles Dickens’ A Christmas Carol called Chained. I have not been able to participate in Chained myself; unless MWM decides to come to Orlando to give a few performances, I don’t think I’ll be able to partake. Nonetheless, I understand exactly what they are doing. I found out about the experience through the Verge article, “Chained mixes virtual reality and live actors to tell a dark Christmas tale.”

    Created by Justin Denton, Chained seems to be doing everything correctly. The experience takes place on a motion-capture stage complete with props and set pieces. Actors and actresses from immersive theatre don motion-capture suits and become interactors in the roughly 20-minute experience. I have to say “around 20 minutes” since each performance is an improvisation and every experience is unique, co-authored with each participant. “Combining a scripted show with on-the-fly moments of improvisation and customization allows the story to remain fixed, while still ensuring each participant’s individual experience will be unique. That approach also extends to the show’s pacing and structure. Rather than having the entire piece run on a timed loop, some individual scenes and transitions are triggered by an on-site stage manager, while others are activated by the way the participant handles certain props. When meeting the Spirit of Christmas Present, for example, I was handed an apple; placing that item on a table in the room triggered the next beat in the scene.”

    Within the experience, bony arms and hands reach out and touch the participant, pulling them deeper in and amplifying their presence. “Chained demonstrates how live performers can allow virtual experiences to become more personalized than they would if an audience member was just watching an automated digital character moving along programmed rails. The actors can change their performance, cadence, and approach based on participants’ behavior.”

    “Chained” is seen as a prototype event, very much like “The Courier.” The experience, as is to be expected, did have its flaws: “As with any production that’s experimenting and pushing boundaries, some moments that work better than others. At one point, I’m pretty sure I nearly stepped on Bates as he tried to secretly crawl away during a scene transition. At other points, it appeared a character was looking down at my chest rather than meeting my eyes.” As the iMyth experience demonstrated, these early immersive experiences do have their mistakes. However, having live interactors does an amazing job of anchoring the participant’s presence in spite of the goofs and less-than-perfectly executed sequences.

    The one key factor that “Chained” has that “Courier” didn’t is multiple performances with multiple participants. Running the experience through with hundreds of willing participants really helps iron out the wrinkles of a new concept: “When you run lots and lots of audience members through, you really learn a lot, and we’ll make it better and more seamless and more comfortable for people throughout the process.”

    I’m very excited for Chained and I hope it creates a strong media buzz. The more positive attention it brings, the more likely iMyth will be able to create more immersive theme world experiences!

    Update: 12/12/18

    I just found this CNET web article about Chained.

    From the web article it is a bit difficult to tell what the journalist’s perspective of the entire experience is. She does, however, seem impressed with the feeling of presence, of being there, even when she knows it is not real.

  • A New Potential in Full Body MOCAP

    I just found out today about a new company offering their solution for a full-body MOCAP suit: the new Enflux full-body MOCAP suit. This suit is interesting because, unlike the Perception Neuron with nodes you attach to your body, the Enflux suit has nodes embedded within the fabric of the suit. How the suit deals with the offset in armature scale is unknown. Hopefully they have solved that small issue.

    The suit is driven by 10 IMU sensors: five located in the pants and five located in the shirt. Evidently the electronics are easily removed to facilitate washing of the suit. Each node is rated to within plus or minus 2 degrees in roll, pitch and yaw. They are currently making developers’ suits available to the public for $500. There is also a $100 headband that can be used for head tracking. Currently the technology is available for Blender and Unity; there is no documentation discussing availability for UE4. Overall this looks like a cost-effective alternative to the Perception Neuron suit that can be easily applied and removed and is hygienic and easy to wash.

    Similar to the Perception Neuron, this could be used as a poor man’s MOCAP solution. At $500 less than the Perception Neuron, it may be the more cost-effective option.

    Regretfully, from an iMyth perspective, I am going to take a back seat on this technology until something a little more second-generation arrives. The first and foremost reason is that this is an IMU-driven sensing solution. IMUs are great at measuring relative accelerations and displacements, but without a world-space anchor they have a bad problem of “drifting” away. The drift is caused by an inherent flaw in the electronics’ calculations: as each node iterates over its solution, the amount of drift increases, somewhat randomly, over time. We found a solution to this by using an HTC Vive headset as the anchor point for all character calculations. While not perfect or optimal, it did provide a suitable way to keep the character in the same relative space. A better solution would be to use a Steam VR tracker at the waist, at the wrists and ankles, and on the head. If you are going to that extent, all the suit really offers is an economical solution for the elbows and knees. Two degrees of float in all of the calculations seems like a heavy price to pay. That will come across as a lot of float :(.
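
    A toy simulation (invented numbers, not Enflux’s actual error model) illustrates the anchoring trick described above: pure integration of a slightly biased measurement drifts without bound, while periodically snapping to an absolutely tracked anchor, as we did with the Vive headset, keeps the error small.

```python
import random

def simulate_drift(steps, anchor_every=None, seed=0):
    """1-D toy: integrate noisy per-step motion; optionally re-anchor."""
    rng = random.Random(seed)
    true_pos = imu_pos = 0.0
    for step in range(1, steps + 1):
        true_delta = 0.01                                   # actual motion this frame
        bias_noise = 0.0005 + rng.uniform(-0.0005, 0.0005)  # sensor bias + noise
        true_pos += true_delta
        imu_pos += true_delta + bias_noise                  # IMU integrates the error too
        if anchor_every and step % anchor_every == 0:
            imu_pos = true_pos                              # snap to tracked anchor (e.g. HMD)
    return abs(imu_pos - true_pos)                          # residual drift

free_running = simulate_drift(10_000)               # no anchor: drift accumulates
anchored = simulate_drift(10_000, anchor_every=90)  # re-anchored every 90 frames
print(free_running, anchored)
```

    In the free-running case the bias accumulates over all 10,000 steps; with re-anchoring, only the error since the last snap survives.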

    They offer a headband with another tracking node in it for $100. This may be great for a non-real-time capture performance; however, I’m not quite sure how this would work with an HMD over the user’s head. The Enflux suit also does not provide support for articulated fingers, something the Perception Neuron does provide. Understandably, this was a design choice to keep the cost of the suit down. Will there be some integration with an articulated glove in the future? We’ll just have to wait and see.

    Enflux has a very reasonable entry into the full-body MOCAP market. Being cheaper than the Perception Neuron may give them the competitive edge they need to stay alive. However, being dependent on a pure IMU solution leaves the door open to the much better tracking technologies that will come with the second generation.

  • iMyth MOCAP Suit Test #2

    As promised, here is the second test of the iMyth MOCAP suit. As a full disclaimer, the system is still very primitive and has far to go yet, but forward progress is being made.

    The system is made with 5 Steam VR controllers mounted on the interactor: one on the waist, two on the wrists and two on the feet. iMyth member Chris Brown devised this first iteration. At this moment in time, only positional offsets are represented; there is no orientation information yet. That will be the next step. Similarly, there are no scale adjustments made for the differences in scale between the interactor and the avatar. Once those are calibrated, with the proper pole vector simulation, the animation will appear much smoother and more accurate. There is a certain amount of latency present in the system; we will need to look into that further. Quite possibly translating the blueprints to actual C++ classes will speed things up. However, for these early experimental stages, blueprints will work just fine. The system is implemented using Steam VR tracking and the UE4 game engine. More really good stuff to come!
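
    One possible calibration for that scale difference (a hypothetical sketch, not the actual iMyth blueprint logic) is to express each tracker position relative to the waist tracker and scale the offset by the avatar-to-interactor height ratio.

```python
def retarget(tracker_pos, waist_pos, interactor_height, avatar_height):
    """Scale a tracker's offset from the waist into the avatar's proportions."""
    scale = avatar_height / interactor_height
    return tuple(w + (t - w) * scale for t, w in zip(tracker_pos, waist_pos))

# A 1.8 m interactor driving a 2.7 m avatar: offsets from the waist grow 1.5x.
wrist = retarget((0.6, 1.2, 0.1), (0.0, 1.0, 0.0), 1.8, 2.7)
print(wrist)  # roughly (0.9, 1.3, 0.15)
```

    Orientation and proper pole vectors would still be needed on top of this, as the post notes; this only handles the positional scaling.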

  • iMyth MOCAP Suit

    Yesterday was a very special day as it marked the first successful baby test of the new iMyth MOCAP suit. It was a very simple test but worked well. Jon Albertson was the brave volunteer who donned the new MOCAP suit, which consisted simply of a Vive HMD, a MOCAP belt, two MOCAP hand trackers and two MOCAP foot trackers. The belt and trackers all worked very successfully, driving simple objects within the virtual framework.

    Although the trackers drove very simple objects in VR, the test was very promising as it set up the next phase: driving an articulated character. Hopefully we will have updates in the very near future demonstrating this exciting new phase!

  • Greenlight’s VR Industry Revenue Predictions

    According to Greenlight, more than 65% of all VR revenue will come from headset sales this year. The anticipated revenue from VR this year is $7.17B. This is expected to grow to $72.82B by the beginning of 2021.

    On a note more related to iMyth, Greenlight forecasts that location-based immersive experiences are going to grow into a significant part of the industry. In 2017, location-based VR will bring in $222M worldwide; by 2021, that amount will grow to almost $1.2B. This is a great place for iMyth to be.
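
    Taking Greenlight’s two endpoints at face value, those numbers imply a compound annual growth rate of just over 50% for location-based VR.

```python
# Implied growth from Greenlight's location-based VR forecast:
# $222M in 2017 growing to ~$1.2B in 2021 (four years).
start_billion, end_billion, years = 0.222, 1.2, 4
cagr = (end_billion / start_billion) ** (1 / years) - 1
print(f"implied compound annual growth rate: {cagr:.0%}")
```
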

  • HTC Makes Full Body Tracking Open Source

    Yesterday, HTC Vive Senior Engineer James Xiong posted a Vive IK Demo on GitHub, which includes reference source code and a Unity demo. Too bad it was not for UE4, but methinks this can be easily transferred. The demo employs 3 Steam VR trackers: one mounted to each ankle and one at the participant’s waist. From this, the lower-body animation is more or less “guessed.” This has to be an approximation, since without understanding the pole vector constraints imposed by the knee, it’s really impossible to know which direction the leg is pointing. However, certain basic cases can be considered and predicted with fairly decent accuracy.
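
    The “guessing” described above is essentially analytic two-bone IK. In this minimal 2D sketch (illustrative, not HTC’s actual code), the knee’s bend angle follows from the law of cosines; which way the knee points, the pole vector, still has to be assumed, which is exactly why the result is a guess.

```python
import math

def knee_angle(hip, ankle, thigh_len, shin_len):
    """Interior knee angle (degrees) from hip/ankle positions, law of cosines."""
    dx, dy = ankle[0] - hip[0], ankle[1] - hip[1]
    dist = math.hypot(dx, dy)
    dist = min(dist, thigh_len + shin_len)  # clamp: the leg cannot overstretch
    cos_knee = (thigh_len**2 + shin_len**2 - dist**2) / (2 * thigh_len * shin_len)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_knee))))

# Fully extended leg: ankle directly below the hip at full reach -> ~180 degrees.
print(knee_angle((0.0, 1.0), (0.0, 0.1), 0.45, 0.45))
# Ankle pulled up toward the hip -> the knee must bend, so the angle shrinks.
print(knee_angle((0.0, 1.0), (0.0, 0.4), 0.45, 0.45))
```

    The angle alone does not say whether the knee bends forward or backward; a full solver would pick the forward solution for a leg, which is the pole-vector assumption the demo makes.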

    iMyth was planning on inventing its own solution to this problem. However, having code to start from will definitely help!