Studio Transcendent’s exciting virtual airshow, Rapid Fire: a brief history of flight, has been added to Inception VR’s streaming service in 360 video format. Witness the Wright Flyer’s first leap, dodge bullets from a WWI fighter, get buzzed by the leading-edge F-22 Raptor fighter jet, and be nearly engulfed by the world’s largest plane. You will find it under the “New” category in the Inception app. If you have a Rift, we highly recommend checking out the real-time rendered version, free on the Oculus Store.
Here in Los Angeles we’re riding out an unusually dangerous Santa Ana wind event. Today will bring the most hazardous winds, and they should die down tomorrow. Here is an amazing video showing a fire that sprang up along the 405 between UCLA and the Getty Center on Wednesday morning. Everyone stay safe out there. Our thoughts are with you if you are in an evacuation zone.
John, Aaron, Bowdy, and Elissa
at Studio Transcendent
Oculus Core 2.0 Beta Goes Live
Oculus has unleashed its new system software as a public beta. The desktop app has been revamped with a new dark theme and nicer animations. The VR interface is now persistent, appearing as an overlay on top of whatever application you are currently running. There is a new VR Home, which still launches as a standalone app. It offers you a choice of three pleasant environments and a private room in which you can modify the textures and place furniture. You can also “earn” cubes that contain additional decorations and appear to function like loot boxes; you get one for completing the tutorial.
To get the new beta, go to Settings -> Beta in the Oculus app and toggle Public Test Channel to On. You should see a notification that a new version is being downloaded, and sometime later it will prompt you to restart the Oculus app. You will probably also need to update your graphics driver.
One of the biggest new features is Oculus Desktop, which provides a virtual desktop inside VR so you can interact with the standard Windows shell without leaving your headset. Oculus promises that you will be able to tear off any window and pin it to stay visible while you are using an app. In the current beta, this is accomplished by pointing at the menu bar and “grabbing” with the palm trigger, but the feature is still pretty rough: it does not always recognize when you want to tear off a window, and the torn-off panel is often blank. Once a window is torn off, it remains mirrored on the desktop.
The mouse cursor is not visible in the beta, but an easy workaround is to open the Windows Mouse Control Panel and turn on Cursor Trails at the minimum length setting. Even then, the mouse is only visible on the main desktop, not on torn-off windows, and you have to use the Touch controller to make a torn-off window active. Clearly Oculus had a “Touch first” mindset when working on the feature, trying to make all the interactions work well entirely through the Touch controllers (for example, you press and hold to emulate a right-click).
This probably makes sense because, while the virtual display is surprisingly readable, it is still not pleasant to use. There is a lot of aliasing, which we found distracting enough to render a page of code unreadable. A higher-resolution headset is needed to make it viable for most people. However, it is still a boon for developers, as it enables a quick and easy way to switch between, for example, the Unity UI and the running Game window to tweak settings and look at the Console.
All in all, it is a teaser for the future we all know is coming, when we will spend all day in VR. The software is almost ready; the hardware is not.
Varjo Announces “Alpha Prototype” and More Funding
One company addressing the resolution issue is Varjo. They have designed a headset that pairs a tiny, ultra-high-pixels-per-inch micro OLED display (like you might see in a camera viewfinder) with a larger cell-phone-type screen behind it. The image from the smaller display can be moved to wherever the eyes are pointing. This yields a form of foveated rendering: most of the information in the image is displayed in the region where the eye can make out the most detail, while the surrounding area provides “context”. Your brain should then fill in that low-detail area with high-resolution imagery as your eyes flit around the field of view in small movements called saccades, taking in a small portion of the overall image each split second.
The end result, if the technology can be perfected, would be a display in which the pixels are not visible and the so-called “screendoor effect” is eliminated, as we now enjoy with modern smartphones and 4K laptops in the real world.
Varjo originally demonstrated the technology in a modified Oculus Rift. They have now announced a devkit (and $6.7 million in new funding). The devkit, called the “Alpha Prototype”, features position tracking based on Valve’s Lighthouse technology. Varjo refers to the small OLED as a “Bionic display”. The larger “Context display” is capable of rendering content at 90 Hz, and the eye tracker can track saccades at 100 Hz. You can apply to get one here.
Google Research Blog Post on Foveated Rendering
If you would like to know even more about foveated rendering, the Google Research blog has an interesting post detailing a new rendering pipeline, with several images illustrating the concept. They discuss solutions to two problems caused by foveated rendering. The first is that the low-resolution peripheral image may display obvious aliasing, because the reduced resolution makes the pixels bigger. The second is the inefficiency of pushing an image to a very high-resolution display: even if you render it cheaply with foveation, in a conventional setup you would still need to upscale that image to the full display resolution. The solution could be to have the display module combine the context and foveal images, so that only the relatively low-resolution source images have to be sent across the pipe from the GPU to the display module.
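To make the compositing idea concrete, here is a minimal Python/NumPy sketch: a low-resolution context image is upscaled to display resolution, and a high-resolution foveal inset is pasted at the gaze point. The function name, the 4x scale factor, and the nearest-neighbour upscale are our own illustrative choices, not Google’s or Varjo’s actual pipeline, which would blend the seam and do this work on the display module rather than in application code.

```python
import numpy as np

def composite_foveated(context, foveal, gaze_xy):
    """Combine a low-res context image with a high-res foveal inset.

    context:  (Hc, Wc, 3) low-resolution full-field image
    foveal:   (Hf, Wf, 3) high-resolution inset
    gaze_xy:  (x, y) gaze point in output pixel coordinates
    Returns the composited image at full display resolution.
    """
    scale = 4  # assumed context-to-display upscale factor
    # Nearest-neighbour upscale of the context image to display resolution.
    out = context.repeat(scale, axis=0).repeat(scale, axis=1)
    out_h, out_w = out.shape[:2]
    # Paste the foveal inset centred on the gaze point, clipped to bounds.
    fh, fw = foveal.shape[:2]
    x0 = int(np.clip(gaze_xy[0] - fw // 2, 0, out_w - fw))
    y0 = int(np.clip(gaze_xy[1] - fh // 2, 0, out_h - fh))
    out[y0:y0 + fh, x0:x0 + fw] = foveal
    return out
```

The bandwidth win is that only the small `context` and `foveal` buffers ever cross the link from the GPU; the full-resolution image exists only on the display side.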
High Fidelity Taps Into The Power of the Blockchain
Decentralization has long been at the core of the design of High Fidelity, Philip Rosedale’s follow-up to Second Life, so it should be no surprise that when it came to adding a digital currency to monetize its user-generated economy, Rosedale turned to the blockchain. (High Fidelity is not to be confused with Sansar VR, which was created by Linden Lab, which still controls Second Life but no longer employs Rosedale.)
High Fidelity is introducing its own cryptocurrency for the time being because of the high transaction fees on the Ethereum and Bitcoin networks, driven by rampant speculation that pushed Bitcoin up to $14,000 per coin this week, at least temporarily turning the Winklevoss twins into billionaires.
However, the design should be portable beyond the confines of the High Fidelity metaverse, even allowing for real-world transactions. It also enforces the permanence of metadata related to digital items, which has some cool use cases; for example, an item could be restricted from entering part of the virtual world if it did not have the right metadata, allowing content owners to enforce thematic coherence while allowing avatars to bring in paraphernalia purchased elsewhere.
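A metadata-gated restriction like the one described could look something like the Python sketch below. Everything here is hypothetical: the `may_enter` function, the rule and metadata field names, and the sample items are our own illustration, not High Fidelity’s actual API or data model.

```python
# Hypothetical sketch of metadata-gated zone entry.
# Field names ("theme", "receipt_tx") are illustrative assumptions.

def may_enter(item_metadata: dict, zone_rules: dict) -> bool:
    """Return True if an item's permanent metadata satisfies a zone's rules."""
    required_theme = zone_rules.get("theme")
    if required_theme and item_metadata.get("theme") != required_theme:
        return False
    # A zone could also demand a proof-of-purchase recorded on the ledger.
    if zone_rules.get("require_receipt") and "receipt_tx" not in item_metadata:
        return False
    return True

sword = {"theme": "medieval", "receipt_tx": "0xabc123"}
laser = {"theme": "scifi"}
castle_rules = {"theme": "medieval", "require_receipt": True}
may_enter(sword, castle_rules)  # True: theme matches and a receipt exists
may_enter(laser, castle_rules)  # False: wrong theme for this zone
```

Because the metadata is permanent and lives on the ledger rather than in any one server, a check like this could be enforced consistently no matter where the item was originally purchased.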
The new system is debuting on “Avatar Island”, a virtual outdoor mall featuring an oddly old-school shopping format that includes storefronts with checkout counters and, at least in the video, a salesperson. Users will have to contend with some cryptocurrency terminology, as they will need to manage their “private keys” and wait for transactions to clear on the “blockchain ledger”, so perhaps keeping the rest of the process aggressively familiar is the right strategy.
Doom VFR finally brings the Doom universe officially to VR. (You may remember that Doom 3 was used for the very first demo of the Oculus Rift; John Carmack created that port, which is partly responsible for the long-running lawsuit between Oculus and id’s parent company, ZeniMax.)
Home from the BBC lets you take a spacewalk from the International Space Station for free. It is similar to the spacewalk in Mission: ISS, but the controls are a bit janky. It wants to guide you on rails through the spacewalk, yet only goes halfway toward the handhold-driven locomotion system used in Mission: ISS, Lone Echo, and Echo Arena. In theory, that is a terrific idea, but in practice it is not comfortable. It made us sick within seconds, whereas we find the other three games comfortable enough for extended sessions.