February 28, 2018
The Mobile World Congress this year had a number of cool announcements: Samsung introduced the new S9 and S9+, and Huawei unveiled a 5G chip promising even faster mobile internet. But the most exciting reveal, or launch in fact, was Google's release of ARCore 1.0.
What is it?
ARCore is a brand-new augmented reality SDK for Android from Google. It gives developers the tools to create augmented reality apps and publish them to the Google Play Store.
ARCore currently works on only a select few Android smartphones running Android 7.0 or later. These include the Google Pixel 1 and 2 and the Samsung Galaxy S7 and S8, although more will be supported as development of the SDK continues.
What does it do?
ARCore utilises three main technologies to allow digital content to interact with the real world: motion tracking, environmental understanding and light estimation. We have already covered how ARCore uses six degrees of freedom for VR applications; you can find that here.
ARCore's motion tracking uses the camera to detect visually distinct parts of an image, turning these into what are called feature points. It combines these with measurements of the device's motion to work out the position and orientation of the camera with respect to the world.
This allows apps to create the augmented reality experience by overlaying digital objects on images captured by the camera, making them appear as if they exist in our environment.
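As a rough illustration of the idea (this is not the ARCore API; the class, method names and the simplified yaw-only rotation are invented here), knowing the camera's pose lets an app re-express a world-fixed point in camera coordinates every frame, so a virtual object stays put in the environment while the camera moves:

```java
// Minimal sketch of pose-based placement (hypothetical names, not ARCore).
// A virtual object's world position is re-projected into camera space each
// frame, so it appears fixed in the environment as the camera moves.
public class PoseSketch {
    // Camera pose: position (tx, ty, tz) and a yaw rotation about the
    // vertical axis (a full pose would use a 3D rotation, e.g. a quaternion).
    static double[] worldToCamera(double[] p, double tx, double ty,
                                  double tz, double yaw) {
        // Translate into the camera's frame, then undo the camera's rotation.
        double x = p[0] - tx, y = p[1] - ty, z = p[2] - tz;
        double c = Math.cos(-yaw), s = Math.sin(-yaw);
        return new double[] { c * x + s * z, y, -s * x + c * z };
    }

    public static void main(String[] args) {
        // A virtual cup placed 2 m in front of the camera's starting position.
        double[] cup = {0.0, 0.0, -2.0};
        // The camera has stepped 1 m to the right, so the cup should now
        // appear 1 m to the camera's left, still 2 m ahead.
        double[] inCam = worldToCamera(cup, 1.0, 0.0, 0.0, 0.0);
        System.out.printf("cup in camera space: (%.1f, %.1f, %.1f)%n",
                inCam[0], inCam[1], inCam[2]);
    }
}
```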
This could be used in apps that let users create and manipulate 3D objects, viewing them from multiple angles while editing them in real time.
ARCore uses the feature points mentioned earlier to relay information about detected surfaces to your app, which can then use these 'planes' as places to overlay its virtual assets.
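To make the surface-detection idea concrete, here is a deliberately simplified sketch (not ARCore's actual algorithm, and the names are invented): feature points whose heights agree within a tolerance are grouped together, and each sufficiently large group is treated as a detected horizontal plane an app could place content on.

```java
import java.util.ArrayList;
import java.util.List;

// Toy illustration of horizontal plane detection from feature points.
// Not ARCore internals: real plane fitting is far more sophisticated.
public class PlaneSketch {
    // Cluster feature-point heights (y coordinates, in metres); any cluster
    // with at least minPoints members is reported as a plane at its mean height.
    static List<Double> detectHorizontalPlanes(double[] ys, double tol,
                                               int minPoints) {
        List<Double> planes = new ArrayList<>();
        boolean[] used = new boolean[ys.length];
        for (int i = 0; i < ys.length; i++) {
            if (used[i]) continue;
            double sum = 0; int count = 0;
            for (int j = 0; j < ys.length; j++) {
                if (!used[j] && Math.abs(ys[j] - ys[i]) <= tol) {
                    sum += ys[j]; count++; used[j] = true;
                }
            }
            if (count >= minPoints) planes.add(sum / count);
        }
        return planes;
    }

    public static void main(String[] args) {
        // Feature-point heights: one cluster near the floor (~0 m)
        // and one near a desk surface (~0.75 m).
        double[] ys = {0.01, -0.02, 0.00, 0.76, 0.74, 0.75, 0.02};
        System.out.println(detectHorizontalPlanes(ys, 0.05, 3));
    }
}
```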
Imagine a meeting where you don't need to sit through a sleep-inducing PowerPoint because you can see and interact with visual representations of the information on the desk in front of you. That's what ARCore developers can do for you.
ARCore estimates the average light intensity of any given image, which allows your apps to light the digital objects they place in the scene to match.
Light estimation lets developers create apps that further incorporate the user's environment into the augmented reality experience. Model viewers that are dynamically lit, or even a game whose day and night cycles are driven by real light, are just a few examples of how this could be used in development.
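The core idea can be sketched in a few lines (again a hypothetical illustration, not ARCore's implementation): estimate the scene's brightness by averaging the camera image's pixel intensities, then scale virtual objects' colours by that estimate so they blend into the real lighting.

```java
// Sketch of the idea behind light estimation (hypothetical, not ARCore).
public class LightSketch {
    // Average intensity of a grayscale camera image, values in [0, 1].
    static double averageIntensity(double[][] pixels) {
        double sum = 0;
        int n = 0;
        for (double[] row : pixels) {
            for (double p : row) { sum += p; n++; }
        }
        return sum / n;
    }

    // Scale a virtual object's base colour by the estimated scene brightness,
    // so the object looks dimmer in a dark room and brighter in daylight.
    static double litColour(double baseColour, double sceneIntensity) {
        return baseColour * sceneIntensity;
    }

    public static void main(String[] args) {
        // A dimly lit room: pixel intensities averaging 0.25.
        double[][] dimRoom = {{0.2, 0.3}, {0.25, 0.25}};
        double light = averageIntensity(dimRoom);
        // A white virtual object is rendered at 25% brightness to match.
        System.out.println(litColour(1.0, light));
    }
}
```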
Anchor Points and Trackables
New in ARCore 1.0, anchor points allow ARCore to track the position of an object over time. Anchors can be attached to planes, which are trackable, meaning they remain stable when the position of the camera changes.
Used in conjunction, anchors and trackables stop your 3D models and game avatars from falling off their AR perches when you have to move.
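A minimal sketch of why this works, with invented classes rather than the ARCore API: an anchor stores its position relative to a trackable plane, so when the plane's estimated pose is refined as tracking improves, the anchored object follows it instead of drifting.

```java
// Hypothetical illustration of anchors and trackables (not the ARCore API).
public class AnchorSketch {
    // A detected horizontal surface whose estimated pose can be refined.
    static class TrackablePlane {
        double cx, cz, height; // centre (x, z) and height (y) in world space
    }

    // An anchor stores an offset relative to its plane, not a fixed
    // world position, so it moves with the plane's updated estimate.
    static class Anchor {
        final TrackablePlane plane;
        final double dx, dz;
        Anchor(TrackablePlane p, double worldX, double worldZ) {
            plane = p;
            dx = worldX - p.cx;
            dz = worldZ - p.cz;
        }
        // World position is always derived from the plane's current estimate.
        double[] worldPosition() {
            return new double[]{ plane.cx + dx, plane.height, plane.cz + dz };
        }
    }

    public static void main(String[] args) {
        TrackablePlane desk = new TrackablePlane();
        desk.cx = 0; desk.cz = 0; desk.height = 0.7;
        // Place a 3D model on the desk, half a metre from its centre.
        Anchor model = new Anchor(desk, 0.5, -0.3);
        // Tracking refines the estimate of where the desk really is...
        desk.cx = 0.1; desk.height = 0.72;
        // ...and the anchored model follows, staying on the desk.
        double[] p = model.worldPosition();
        System.out.printf("(%.2f, %.2f, %.2f)%n", p[0], p[1], p[2]);
    }
}
```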
Google has gone to great lengths to put AR into the hands of everyone. ARCore’s release could be the start of something big in AR. At Pocket Sized Hands we are very excited to see what this means for the future of Mixed Reality technologies.
If you have an idea you want created, get in touch with us and we can bring it to life. Whether it's a quick prototype, a full product or even just a brainstorm, our team is here to help, supporting you with professional advice and delivery from the beginnings of your prototype to the final product.