This last week saw the Google I/O and Microsoft Build developer conferences. Neither company made much mention of virtual reality, but both had some news about their augmented reality efforts.
Microsoft Layout & Remote Assist
Microsoft showed off an app called Layout, which is meant to help design floor plans in a real-world context and then share the results with others. It runs on both Hololens and Windows Mixed Reality headsets. They also demoed Remote Assist, the latest version of the remote-assistance tech demos they have been showing since the Hololens debuted. Engadget has a good hands-on. Microsoft created a promo video for each app, embedded below, but as usual the marketing department conveniently forgot about the Hololens’ tiny field of view.
Remote Assist is interesting because I already use FaceTime for this purpose fairly regularly. Hololens is a nice way to keep your hands free, but in theory the application would work just as well as an ARCore/ARKit implementation.
Google Maps, Lens, and ARCore Updates
Google showed an experimental Maps feature that overlays directions and points of interest on the camera feed (as well as an animated fox). The feature relies on Google’s Visual Positioning Service, which can figure out where the camera is located by recognizing landmarks. Someday soon, I have a feeling, we’ll finally get a decent guided-tour app out of all this.
Google Lens gained better object-recognition features. It’s still probably not reliable enough that people will default to using it when they’re curious about an object, but the features will be more easily accessible: from the camera app, from Google Assistant, or via a double tap on the camera button on certain phones. iOS users can use Lens via the Google Photos app.
You can watch the relevant portion of the keynote here.
ARCore has several cool new features: you can use ARCore directly on iOS now, “cloud anchors” let two phones look at the same AR objects in the same space, vertical planes are now detected, and your app can recognize up to 1,000 “augmented images”, which are like the old Vuforia markers except that they can be any arbitrary image, as long as it scores at least 75 points on a 0–100 scale of recognizability.
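For a sense of what the augmented-images feature looks like in practice, here is a minimal sketch based on the ARCore 1.2 Android API (`AugmentedImageDatabase`, `Config`). The `session`, `context`, and `"poster.png"` asset are placeholder assumptions, not from the original post:

```java
// Sketch, assuming the ARCore 1.2 Android API.
// Load a reference image from the app's assets; "poster.png" is a placeholder.
// Google's arcoreimg SDK tool can pre-score the image on the same 0-100 scale.
Bitmap posterBitmap = BitmapFactory.decodeStream(
        context.getAssets().open("poster.png"));

// Build a database of reference images (up to 1,000 per database).
AugmentedImageDatabase imageDatabase = new AugmentedImageDatabase(session);
imageDatabase.addImage("poster", posterBitmap);

// Attach the database to the session config; ARCore then reports matches
// each frame via frame.getUpdatedTrackables(AugmentedImage.class).
Config config = new Config(session);
config.setAugmentedImageDatabase(imageDatabase);
session.configure(config);
```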
Google announced a new API called Sceneform that is essentially a bare-bones game engine for native Java code to interact with. It will allow developers to add 3D features to their apps without incurring the overhead of Unity or the complexity of writing OpenGL code. In that sense it is similar in scope and purpose to Apple’s SceneKit API.
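To give a sense of scale, placing a 3D model with Sceneform reduces to a few lines. This is a sketch assuming the Sceneform builder APIs Google documented at launch (`ModelRenderable`, `AnchorNode`, `ArFragment`); the `context`, `anchor`, and `arFragment` variables and the `model.sfb` asset are placeholders:

```java
// Sketch, assuming Sceneform 1.x APIs. "model.sfb" is a placeholder asset
// produced by Sceneform's model-import tooling.
ModelRenderable.builder()
    .setSource(context, Uri.parse("model.sfb"))
    .build()
    .thenAccept(renderable -> {
        // Attach the model to an ARCore anchor; Sceneform handles the
        // render loop and lighting without any hand-written OpenGL.
        AnchorNode node = new AnchorNode(anchor);
        node.setRenderable(renderable);
        arFragment.getArSceneView().getScene().addChild(node);
    });
```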
One development from last week that we didn’t cover was that Google open-sourced Seurat, its “light field” rendering optimization to bring dense scenes to mobile.
Interested in experimenting with Google's Seurat? It's a bit of a pain to compile from scratch so I've provided a precompiled binary for Windows on a fork here: https://t.co/VJOAI4TnyQ
— Dimitri Diakopoulos (@ddiakopoulos) May 6, 2018
The technology was used for Blade Runner: Revelations, which launched on Daydream alongside the Lenovo Mirage Solo. Looking closely at the scenes on that device, it’s apparent that Seurat’s implementation is not a magic bullet. It takes a somewhat low-resolution point-cloud approach, so if you closely inspect an object that has been processed through it, you’ll see it quickly break down into an impressionistic set of floating points.
Viewed from a distance without close attention, though, it works. The library’s namesake, French painter Georges Seurat, invented pointillism as a form of impressionism; his artwork resolves as you step back, so the naming of the library is on point. The biggest problem I noticed was that occasionally, as you move your viewpoint around, parts of objects disappear unexpectedly, as if they are suffering from some sort of depth-sorting issue. That is a more noticeable flaw than the impressionism of the objects themselves.
Beat Saber Triumphs
VR rhythm game Beat Saber debuted a week or so ago and has already racked up 50,000 sales at $20 a pop, making it an incredibly fast seller by Vive and Rift standards. It deserves the success: it’s a thoroughly awesome game and a great workout, although it isn’t as comfortable over a long session as BoxVR. It is, however, instantly appealing and understandable to just about everybody, and it is extremely polished, much more so than any competing game at the moment. The tracks are intuitive enough that I’ve been able to get through at least one on expert difficulty, despite the overwhelming number of cuts (400+) you need to make during a three-minute track.
A custom track editor is on its way, hopefully by next week. In the meantime some users have already started hacking their own custom beatmaps without it. Others have found more creative ways to increase the challenge. The most spectacular of these is the “Darth Maul mod” where the two motion controllers are connected to create a double-sided lightsaber.
There’s also “trapeze mode”:
I’m finally getting the hang of @BeatSaber – Trapeze Mode on the #OculusRift #VR pic.twitter.com/6EtZYiam5v
— Matthew Harris (@HatthewMarris) May 12, 2018
Mirage Solo Hidden Menu Unlocks Movement
My biggest complaint about the Lenovo Mirage Solo headset I tried last week was that it traps you in a very small bubble of movement. Get too close to the edge, and you end up enveloped in a gray void. Apparently there is a hidden developer menu that allows this restriction to be deactivated. Ian Hamilton tried it out in his rural backyard, wandering the wilds of the Daydream home environment, and came away extremely impressed. He also found that the Mirage Solo will happily run standard Android apps in a floating window, enabling any media app to be converted into a virtual theater of sorts.
The Oculus Go is extremely comfortable, but YouTuber Sebastian Ang wondered if it could be made even better. So he replaced the strap with a Vive Deluxe Audio Strap.