How do you create realistic augmented reality experiences on non-specialized devices, like an advanced smartphone? Say you wanted to overlay data on top of an operating machine so it appeared as part of a 3D AR experience. We experimented with something like this using a headset: you could look at a part of a machine and see data, warnings, blueprints, predictions... But what if you wanted to do the same thing using a modern smartphone? Just point the phone at parts of the machine and get a realistically rendered 3D data display you could navigate?
Below are videos showing how these approaches can be used, along with technical background.
Blending Realities with the ARCore Depth API (Google Developers Blog)
Monday, December 9, 2019
Posted by Shahram Izadi, Google Director of Research and Engineering
ARCore, our developer platform for building augmented reality (AR) experiences, allows your devices to display content immersively in the context of the world around us, making them instantly accessible and useful.
Earlier this year, we introduced Environmental HDR, which brings real world lighting to AR objects and scenes, enhancing immersion with more realistic reflections, shadows, and lighting. Today, we're opening a call for collaborators to try another tool that helps improve immersion with the new Depth API in ARCore, enabling experiences that are vastly more natural, interactive, and helpful.
The ARCore Depth API allows developers to use our depth-from-motion algorithms to create a depth map using a single RGB camera. The depth map is created by taking multiple images from different angles and comparing them as you move your phone to estimate the distance to every pixel..."
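The announcement doesn't publish ARCore's actual depth-from-motion pipeline, but the geometry it describes is standard two-view triangulation: a pixel that shifts more between frames as the phone moves is closer to the camera. A minimal sketch of that relation (the function name, focal length, baseline, and disparity values below are all illustrative assumptions, not ARCore's API):

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Estimate per-pixel depth via two-view triangulation.

    Uses the standard relation depth = focal_length * baseline / disparity,
    where disparity is how far (in pixels) a point shifted between the two
    camera positions. Zero disparity (no parallax) maps to infinite depth.
    """
    disparity = np.asarray(disparity_px, dtype=float)
    with np.errstate(divide="ignore"):
        return focal_length_px * baseline_m / disparity

# Illustrative numbers: a 500 px focal length, 5 cm of phone motion,
# and a feature that shifted 10 px between the two frames:
z = depth_from_disparity(10.0, 500.0, 0.05)
print(z)  # 2.5 -> the point is about 2.5 m away
```

In a real depth-from-motion system the disparities come from matching many pixels across frames and the camera motion comes from the phone's tracking, but the per-pixel distance estimate reduces to this triangulation step.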
Blending AR Realities for 3D Interaction
Monday, December 16, 2019