There are two pieces of information out there hinting at the direction HTC and Valve will be taking with the Vive.
In fact, I know it may be too early to guess; maybe the next-generation HMD coming from this collaboration won't even be called the 'Vive' any longer. What are your thoughts?
First, the engineers at Valve reminded all future developers to start ordering the new SteamVR 2.0 base stations. The new base stations will not be compatible with the old HMDs; they will only work with the new TS4231 sensors. For backwards compatibility, the new sensors will still work with the old Lighthouse base stations. The new base stations will be cheaper, have no moving parts, and will not have sync issues. Valve is asking manufacturers to place orders now: they must buy in lots of 45 at $60 apiece, supplied with no packaging and no mounting equipment. The retail price will probably be higher than $60, but we'll just have to wait for the MSRP in 2018.
What is really exciting about these new base stations is that they will soon support up to four units working in conjunction with each other, covering play spaces of up to 10 m × 10 m. That is really big! In fact, it is big enough to implement redirected walking seamlessly without resetting. Of course, there would be caveats in the environment to compensate for the limited space. However, with a 10 m × 10 m space you should only have to worry about a reset every ~13 m, which is still quite a large distance! This is super exciting; more information as things continue to develop.
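As a rough sanity check on that figure (my own back-of-envelope math, not Valve's): the longest straight-line path in a square play space is its diagonal, so a 10 m × 10 m area caps how far a participant can walk before a redirected-walking reset. The wall margin below is an assumed value.

```python
import math

# Hypothetical estimate: the longest straight walk in a square play space
# is the diagonal, shortened by an assumed safety margin near each wall.
def max_straight_walk(side_m: float, wall_margin_m: float = 0.5) -> float:
    """Diagonal of a square space minus a margin at each end (assumed values)."""
    return math.sqrt(2) * side_m - 2 * wall_margin_m

print(round(max_straight_walk(10.0), 1))  # ~13.1 m, in line with the ~13 m above
```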
The next bit of information invites even more room for conjecture. HTC has just applied for a New Zealand trademark for a new HMD called the HTC Eclipse. The HTC Focus was thought to be the new wireless mobile headset compatible with the new Windows 10 VR suite. The HTC Eclipse filing carries these particular tags: "head mounted display for computer simulated reality, motion tracking sensors, handheld computer simulated reality controllers." Is this an indication of the next generation of VR? Only time will tell. However, the simultaneous arrival of the next generation of tracking and this new HMD may be more than coincidence.
Yesterday, HTC Vive Senior Engineer James Xiong posted a Vive IK Demo on GitHub, which includes reference source code and a Unity demo. Too bad it was not for UE4, but methinks it can be easily ported. The demo employs three SteamVR Trackers: one mounted to each ankle and one to the participant's waist. From these, the lower-body animation is more or less "guessed." It has to be an approximation: without knowing the pole-vector constraint imposed by the knee, it is impossible to know which direction the leg is pointing. However, certain basic cases can be predicted with fairly decent accuracy.
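The lower-leg solve the demo has to approximate can be sketched as a standard two-bone IK step. This is my own illustrative Python, not Xiong's code; the tracker positions, bone lengths, and pole vector are all assumed inputs standing in for the missing knee information.

```python
import numpy as np

# A minimal two-bone IK sketch: given hip and ankle positions (from waist and
# ankle trackers), bone lengths, and an assumed pole vector for the knee's
# bend direction, place the knee via the law of cosines.
def solve_knee(hip, ankle, thigh_len, shin_len, pole):
    hip, ankle, pole = map(np.asarray, (hip, ankle, pole))
    d_vec = ankle - hip
    d = min(np.linalg.norm(d_vec), thigh_len + shin_len - 1e-6)  # no overextension
    # Law of cosines: distance from the hip to the knee's projection
    # onto the hip-ankle axis.
    a = (thigh_len**2 - shin_len**2 + d**2) / (2 * d)
    h = np.sqrt(max(thigh_len**2 - a**2, 0.0))  # knee's offset off that axis
    axis = d_vec / np.linalg.norm(d_vec)
    # Project the pole vector onto the plane perpendicular to the axis
    # (assumes the pole is not parallel to the hip-ankle axis).
    bend_dir = pole - np.dot(pole, axis) * axis
    bend_dir = bend_dir / np.linalg.norm(bend_dir)
    return hip + a * axis + h * bend_dir

# Straight-down leg, knee biased forward (+x): the knee lands in front.
knee = solve_knee(hip=[0, 1.0, 0], ankle=[0, 0.1, 0],
                  thigh_len=0.5, shin_len=0.5, pole=[1, 0, 0])
```

Real solutions (including, presumably, the Vive IK Demo) layer heuristics on top of this to guess the pole vector from the waist tracker's orientation, which is exactly the part that makes full-body estimation approximate.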
iMyth was planning on inventing its own solution for this problem. However, having code to start from will definitely help!
A couple of months ago, iMyth created a rough prototype of its immersive experience. One of iMyth's key components is physical props and sets. We integrated very inexpensive props and sets into our experience. While not the most sturdy, these set pieces did an outstanding job demonstrating the physically immersive concept.
Bandai Namco has taken this concept one step further and created an experience based on the Doraemon Anywhere Door theme world. Using the HTC Vive, Leap Motion, and a few simple props tracked with attached Vive controllers, the team was able to create two very interesting interactive props.
https://gfycat.com/SpiffyDisgustingHowlermonkey
The first of these props is the door: a very simple prop door placed cleverly inside the Vive play space in order to avoid losing tracking. They use Leap Motion to track the participant's hands, which of course frees up the Vive controllers. One of the controllers is mounted on the edge of the door and is used to track the door opening and closing. It is a simple concept, but the physical component is extraordinarily impactful. What I really want to know is: where did they get the great door prop? Notice that the prop doesn't have a footprint larger than what it would have in real life. How did they anchor it? It looks solid.
https://gfycat.com/ImpishCautiousGermanspitz
The second object is a simple desk. Once again, all the creators did was attach a second controller to the drawer of the desk. The desk itself is stationary and never moves. Again, this is a very effective use of a simple concept.
I just placed my order with HTC for four tracking "pucks". We should get them by the end of the week. The pucks will be used to help track the interactor. However, I forgot that every Vive also comes with two controllers. That means iMyth will now have eight tracked objects to deal with. Where can we go from here?
IMR is at the forefront of wireless virtual-reality data streaming. Our proprietary algorithms and hardware architecture produce unparalleled results.
At IMR, our mission is to create the world's first compression standard for VR content. We are a technology company founded by leading aerospace, computer vision and robotics experts. We have developed an algorithm and hardware that enable wireless transmission and streaming of VR video over the 802.11ac Wi-Fi and 802.11ad WiGig standards, transforming the VR industry and enabling a completely immersed and untethered experience for multiple players.
Immersive Robotics (IMR) has developed a new compression standard for VR content. The proprietary algorithm and electronics hardware enable wireless transmission and streaming of Virtual Reality (VR) video over the 802.11ac Wi-Fi (5GHz) and the latest 802.11ad WiGig (60GHz) standards. The following describes its capabilities:
Rapid Data Transmission
With a 95% compression rate, IMR's technology allows for compression and decompression with a record-breaking introduced latency of less than 1 ms. This translates to zero perceived latency for the player, increasing user comfort and eliminating the motion sickness caused by latency within VR play.
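To put that 95% figure in context, here is a back-of-envelope calculation. These are my numbers, not IMR's, assuming a first-generation Vive panel of 2160×1200 at 90 Hz with 24-bit color:

```python
# Back-of-envelope check (assumed panel specs, not IMR's published figures):
# a 2160x1200, 90 Hz, 24-bit stream at 95% compression fits within 802.11ac.
width, height, fps, bits_per_pixel = 2160, 1200, 90, 24

raw_mbps = width * height * fps * bits_per_pixel / 1e6   # uncompressed bitrate
compressed_mbps = raw_mbps * (1 - 0.95)                  # after 95% compression

print(f"raw: {raw_mbps:.0f} Mbps, compressed: {compressed_mbps:.0f} Mbps")
# raw ≈ 5599 Mbps; compressed ≈ 280 Mbps, well under 802.11ac's gigabit-class peak
```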
Image Quality
The quality of the decompressed image is indiscernible from the original with no motion blur or introduced artifacts.
Eye Tracking
IMR's algorithm leverages a suite of highly "VR-optimized" techniques to reduce the required bandwidth and operate at extremely low latency. One optional feature is an input for eye-tracking data, which allows for further dynamic control and greater compression efficiency.
Versatile
IMR's technology utilizes both the 802.11ac and 802.11ad wireless standards. This enables current-generation HMDs to be supported via the AC standard, and future-proofs the technology by enabling it to handle up to 2x 4K VR video transmission over the AD standard.
Our technology is designed to operate across all VR and telepresence robotics applications, each of which has its own requirements for the wireless link. Our technology provides the necessary compression/decompression at ultra-low latencies for all of these applications, and we are working with and looking to partner with different wireless manufacturers and communication-link suppliers to push this technology into each area.
KwikVR's unique advantage over other wireless competitors is hard to state, because we have not been able to test our competitors' solutions. They are all claiming an impossible one or two milliseconds of latency overhead, so I would say our main advantage is honesty. Also, our solution does not place 60GHz Wi-Fi at the top of the user's head, which might be better for health reasons. Using 5GHz Wi-Fi is also less prone to obstruction of the Wi-Fi signal. We believe that our latency overhead is close to optimal, but only the customers will be the judges.
I think you can classify wireless VR by what type of radio it uses and what type of compression. Of course, all systems have to deliver under a frame of round-trip latency.
Various Radio Types:
WiFi 802.11ac 5GHz & 2.4GHz
WiFi 802.11ad 60GHz
5G LTE cellular for cloud VR (various frequencies)
Proprietary radio in unlicensed frequency (e.g. 5GHz)
Our solution uses WiFi 802.11ac and LTE. This has the benefit of not needing line-of-sight transmission. 60GHz transmission suffers large attenuation when propagating through physical barriers, including humans. 802.11ac can travel a much longer distance than 60GHz and provide multi-room coverage. 802.11ac is also much cheaper and requires much smaller antennas than 60GHz. Transmitter placement is not critical with 802.11ac, unlike 60GHz. 802.11ac also draws less power, giving the HMD longer battery life.
Various Compression Types:
JPEG (Intra frame) with 3:1 compression
JPEG 2000 (Intra frame) with 6:1 compression
MPEG H.264 (Intra and Inter frame) 100:1 compression
MPEG H.265 (Intra and Inter frame) 200:1 compression
Proprietary Compression
Our solution uses MPEG H.265/HEVC compression, which provides 200:1 compression. For example, a 1080p60 source requires about 3,000 Mbps to transmit uncompressed; we compress this to 15 Mbps, a compression ratio of 200:1. This leaves headroom for error correction, higher resolutions and frame rates, and data rates that can be delivered from the cloud over 5G LTE and fibre networks. Standards-based systems also allow off-the-shelf mobile chipsets to be built into mobile HMDs. We will adopt future H.265 profiles, which can provide even better compression using tools such as multi-view and screen-content coding.
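Those quoted figures do check out, assuming 24 bits per pixel; a quick sketch of the arithmetic:

```python
# Sanity-checking the quoted numbers for 1080p60 at an assumed 24 bits/pixel.
width, height, fps, bpp = 1920, 1080, 60, 24

raw_mbps = width * height * fps * bpp / 1e6   # ≈ 2,986 Mbps, i.e. roughly 3,000
ratio = raw_mbps / 15                         # ≈ 199:1, matching the 200:1 claim

print(f"{raw_mbps:.0f} Mbps raw, {ratio:.0f}:1 to reach 15 Mbps")
```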
While other vendors are focused on bringing wireless accessories to today’s HMDs, Nitero is the only company developing an integratable solution that will support the aggressive requirements of future VR HMDs.
The solution's novel microsecond-latency compression engine provides royalty-free, visually lossless encoding, adding end-to-end latency of one millisecond. At under one watt of power, it can be integrated into future headsets without the need for expensive heat sinks or vents. In fact, adding Nitero's wireless solution will be significantly less expensive than cables, resulting in an overall cost reduction, which is critical for VR adoption going forward.
Interoperable with WiGig, Nitero's solution is customized for the unique challenges of the VR/AR use cases, with advanced beam-forming that supports non-line-of-sight (NLOS) operation at room scale. Additionally, back-channel support for computer vision, eye tracking, 3D audio and other forthcoming technologies can be provided simultaneously with the VR display, without needing another chipset.
Some of the industry leaders that have supported Nitero via investment and collaboration include Valve Software, Super Ventures, and the Colopl VR Fund, along with others not publicly announced.
We use a combination of video compression and proprietary streaming protocol that allows us to stream high resolutions to multiple headsets. Our solution is designed primarily for Theme Parks and Arcades that want to put two or more people in the same tracked space.
Our thesis is that in the future you will always need some amount of compression, either as resolutions get higher (4K and above; we need 16K for retina resolution), or if you move the server outside the local network. Ideally, you could put a GPU farm in the cloud and have all the content available immediately, eliminating the need for a PC at home! I think that in five years the only computer you would need at home will be a small mobile chip, probably built into the headset itself.
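To see why compression stays necessary, compare a hypothetical 16K stream against even the fastest radio discussed in this post (WiGig's ~7 Gbps per channel). The panel numbers below are my own illustrative assumptions (16K as 15360×8640, 90 Hz, 24-bit color):

```python
# Hypothetical: raw bitrate of a 16K, 90 Hz, 24-bit "retina" VR stream
# versus WiGig's per-channel peak.
width, height, fps, bpp = 15360, 8640, 90, 24

raw_gbps = width * height * fps * bpp / 1e9   # ≈ 287 Gbps uncompressed
wigig_gbps = 7                                # 802.11ad per-channel maximum
min_ratio = raw_gbps / wigig_gbps             # ≈ 41:1 compression just to fit

print(f"{raw_gbps:.0f} Gbps raw -> at least {min_ratio:.0f}:1 compression")
```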
Of course, any sort of compression introduces latency. However, there has been a lot of development in the past two years to work around that. We'll be releasing a network-aware technology similar to the Spacewarp used by Oculus. Companies like Microsoft have also done a lot of research on reducing latency through predictive (also known as speculative) rendering: Project Irides, for example, is able to compensate for 120 ms of network latency in its demo. We've been talking to one of the lead researchers on Irides for a while, and we'll release similar technology in 2017. So I would say the future of wireless VR is very bright!
HTC/Intel Alliance
WiGig (Intel's chosen solution) is, as the name suggests, a wireless multi-gigabit networking standard which dramatically increases over-the-air bandwidth compared with standard WiFi at short range (within the same room). In actual fact, the name 'WiGig' is a shortening of the organisation (the Wireless Gigabit Alliance) which helped define the IEEE 802.11ad 60GHz standard. WiGig is aimed at very high-bandwidth uses, such as broadcasting multi-gigabit uncompressed video and audio streams. Although its uses are more limited (short range, doesn't work well through walls), it is ultimately a very high-speed general-purpose network standard in the same way as other WiFi standards. Bottom line: if you buy an 802.11ad-compatible router, it'll not only be backwards compatible with your older devices, you'll also be able to use that extra bandwidth for any sort of data transfer, not just video and audio. WiGig data rates max out at 7 gigabits per second per channel.
TPCast
60GHz wireless technology is being investigated by a number of companies for use in wireless VR applications, including one that Valve invested in separately from HTC. While the frequency provides lots of bandwidth, it isn't great at penetrating surfaces, meaning it is most effective when the transmitter has a direct line-of-sight to the receiver. The TPCAST wireless Vive kit has the transmitter mounted on the user's head to give it a direct view of the receiver, but there are certainly times during room-scale VR play when the user may be turned away from the receiver, with their head tilted at an angle that would break line of sight; the player's hands could also get in the way. It isn't clear yet how these situations might impact the TPCAST device's performance, mostly because we don't know the recommended setup for the system, which could possibly use multiple receivers or a special placement to prevent transmission issues.
I have been a bit lax on the blog updates for the last few months.
Never fear, the concepts for iMyth continue to move forward with new ideas, new partners and new innovations!
Keep in contact for information concerning iMyth’s next MVP, (I’ll bet you didn’t even know we had a first :)).
However, supporting the next MVP is this technology from HTC and Valve. Part of our first MVP involved using simple hand gestures within a VR experience, and we generated only mediocre results. Mind you, we did not allow ourselves much time to fine-tune the experience; given a week or so of tuning time, I'm sure we could generate something a bit more palatable. The HTC/Valve device doesn't look that much better, but who knows, this integrated solution may be all that is needed.
I got this from an UploadVR article. They don't seem very impressed either. However, after our short experience with the Leap Motion, I'm willing to give almost anything a try.