You could call it a logical progression: after my first steps into VR with Cardboard, it was time to try Apple's ARKit. Like the Cardboard experience, it's a low-cost XR experience on mobile.
The basic requirement for any AR experience is the ability to create and track a correspondence between the real-world space and a virtual space where you can model visual content. When your app displays that content together with a live camera image, the user experiences augmented reality: the illusion that your virtual content is part of the real world.
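As a minimal sketch of how that correspondence is set up in code (assuming an `ARSCNView` outlet named `sceneView`, a hypothetical name for illustration), starting a world-tracking session looks roughly like this:

```swift
import UIKit
import ARKit

class ViewController: UIViewController {
    // Hypothetical outlet, assumed to be wired up in the storyboard.
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking maps device motion onto a virtual coordinate space,
        // so virtual content stays anchored to real-world positions.
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session when the view is no longer visible.
        sceneView.session.pause()
    }
}
```

Once the session is running, ARKit continuously updates the camera transform, and anything added to the SceneKit scene appears fixed in the real world.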
Augmentation is not only visual. What about sound?
Time to try this out. First, I wanted to cover the basics, like placing an object with real-world dimensions (video 1). Second, I explored how sound can be used in AR (video 2). Third, I combined it all: visual, audio and location (ARKit + GPS data) (images). The last video is an experiment using ARKit without the visual camera feedback, controlling sound by moving the device.
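The first two steps can be sketched together: SceneKit units are meters, so an object gets real-world dimensions simply by its geometry size, and sound can be attached to a node as a positional audio source. This is a sketch under assumptions (a `sceneView` variable and a bundled "ping.wav" file, both hypothetical names), not the exact code behind the videos:

```swift
import ARKit
import SceneKit

// Place a 10 cm cube where a tap hits a detected plane, and attach a
// positional audio source so the sound appears to come from the cube.
// Assumes `sceneView: ARSCNView` and a mono audio file "ping.wav" in the bundle.
func placeCube(at tapLocation: CGPoint, in sceneView: ARSCNView) {
    // Hit-test the tap against planes ARKit has already detected.
    guard let hit = sceneView.hitTest(tapLocation, types: .existingPlaneUsingExtent).first else { return }

    // SceneKit units are meters, so this cube is 0.1 m on each side.
    let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
    let node = SCNNode(geometry: box)
    node.simdTransform = hit.worldTransform
    sceneView.scene.rootNode.addChildNode(node)

    // Positional audio: volume and panning follow the node's position
    // relative to the camera, which requires a mono source.
    if let source = SCNAudioSource(fileNamed: "ping.wav") {
        source.isPositional = true
        source.loops = true
        source.load()
        node.addAudioPlayer(SCNAudioPlayer(source: source))
    }
}
```

Because the audio player lives on the node, walking around the placed object changes what you hear, which is what makes the audio experiments interesting.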
First impressions:
– ARKit develops a better understanding of the scene if the device is moving, even if it moves only subtly.
– Allow time for plane detection to produce clear results.
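Both impressions relate to plane detection, which must be enabled explicitly and delivers results incrementally via delegate callbacks. A minimal sketch (class and method names are my own, for illustration):

```swift
import ARKit

// Enable horizontal plane detection and observe how ARKit refines its
// understanding of the scene over time.
class PlaneWatcher: NSObject, ARSCNViewDelegate {
    func start(_ sceneView: ARSCNView) {
        sceneView.delegate = self
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when a new plane is first detected; this needs device motion
    // and a little time before it fires.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("Detected plane, extent: \(plane.extent)")
    }

    // Called repeatedly as the plane's extent and center are refined,
    // which is why allowing extra time produces clearer results.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        print("Updated plane, extent: \(plane.extent)")
    }
}
```

Watching the `didUpdate` callbacks fire makes both observations visible: the extent values only grow and stabilize while the device is moving.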