Apple ARKit and GPS data
The basic requirement for any AR experience is the ability to create and track a correspondence between the real-world space and a virtual space where you can model visual content. When your app displays that content together with a live camera image, the user experiences augmented reality: the illusion that your virtual content is part of the real world.
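In ARKit terms, that correspondence is maintained by a world-tracking session. A minimal sketch (the `ViewController` class name is a placeholder, not from this post):

```swift
import UIKit
import ARKit

// Minimal sketch: start world tracking in an ARSCNView.
class ViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking maintains the correspondence between
        // real-world space and the SceneKit coordinate space.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }
}
```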
Augmentation is not only visual: what about sound?
Time to try this out. First, I wanted to find out the basics, like placing an object with real-world dimensions (video 1). Second, find out how we can use sound in AR (video 2). Third, combine it all: visual, audio and location (ARKit + GPS data) (images). The last video is an experiment using ARKit without the visual camera feedback, controlling sound by moving the device.
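The first basic, placing an object with real-world dimensions, relies on ARKit and SceneKit using meters as their unit. A rough sketch of what that looks like:

```swift
import ARKit
import SceneKit

// Sketch: ARKit/SceneKit units are meters, so a box with width 0.3
// appears roughly 30 cm wide in the real world.
func placeBox(in sceneView: ARSCNView) {
    let box = SCNBox(width: 0.3, height: 0.3, length: 0.3, chamferRadius: 0)
    let node = SCNNode(geometry: box)
    // Half a meter in front of the session's origin (the initial camera position).
    node.position = SCNVector3(0, 0, -0.5)
    sceneView.scene.rootNode.addChildNode(node)
}
```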
– ARKit develops a better understanding of the scene if the device is moving, even if the movement is only subtle.
– Allow time for plane detection to produce clear results.
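Plane detection results arrive through a delegate callback, which also makes it easy to see when detection has settled. A sketch, assuming a hypothetical `ViewController` that owns the `ARSCNView`:

```swift
import UIKit
import ARKit

// Sketch: visualise detected horizontal planes via the
// ARSCNViewDelegate callback; give detection a few seconds to settle.
extension ViewController: ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents = UIColor.blue.withAlphaComponent(0.3)
        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi / 2   // SCNPlane is vertical by default
        node.addChildNode(planeNode)
    }
}
```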
During the test process I came up with the idea to combine the local ARKit tracking with GPS wayfinding. This resulted in a rough working version of a “Pokémon Go” type of experience.
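The core of combining the two is converting the offset between the user's GPS location and a target location into meters, which map directly onto ARKit's coordinate space. A sketch of that idea (assuming a session aligned with `.gravityAndHeading`, where -z points north and +x points east):

```swift
import CoreLocation
import simd

// Sketch: turn a GPS target into a local ARKit offset in meters,
// using distance plus the initial bearing between the two points.
func localOffset(from user: CLLocation, to target: CLLocation) -> simd_float3 {
    let distance = Float(user.distance(from: target))   // meters
    let lat1 = user.coordinate.latitude * .pi / 180
    let lat2 = target.coordinate.latitude * .pi / 180
    let dLon = (target.coordinate.longitude - user.coordinate.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    let bearing = Float(atan2(y, x))                    // radians from north
    // North is -z, east is +x in a gravityAndHeading-aligned session.
    return simd_float3(distance * sin(bearing), 0, -distance * cos(bearing))
}
```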
So I came up with a test-case app with the following objectives:
– play specific pieces of music at specific locations in the real world
– visualise “the presence of sound”
– show it on a map
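The first objective can be sketched with CoreLocation alone: watch the user's position and start a track once they come within a radius of a target. The coordinates and file name below are placeholders:

```swift
import AVFoundation
import CoreLocation

// Sketch: play a specific track once the user is within `radius`
// meters of a real-world location.
class SoundSpot: NSObject, CLLocationManagerDelegate {
    let target = CLLocation(latitude: 51.2194, longitude: 4.4025) // placeholder coordinates
    let radius: CLLocationDistance = 25 // meters
    let manager = CLLocationManager()
    var player: AVAudioPlayer?

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let current = locations.last else { return }
        if current.distance(from: target) < radius, player?.isPlaying != true {
            if let url = Bundle.main.url(forResource: "track", withExtension: "mp3") {
                player = try? AVAudioPlayer(contentsOf: url)
                player?.play()
            }
        }
    }
}
```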
To create the maps I used the developer tools from Mapzen. Unfortunately, at the time of writing this post, Mapzen is shutting down its services.
You can see the outcome of these tests below.
Digital content & experiences can become scarce or exclusive again, unable to be infinitely copied and possibly more valuable as a result, as they can only be experienced in a physical location (and maybe at a single time).
Similar to the way live concerts are more valuable than recordings.
This motivates me to keep the creation process going, pushing the boundaries of music culture.
First tests with ARKit:
Drum Machine on desk
Tests with ARKit and GPS data on iPhone 6s:
Experiment at Middelheim park and a random location in Antwerp.
Specific sound at a specific location, and how do we visualise it?
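One way to do both at once is to attach a positional audio source to a visible node, so the sound appears to come from a point in space while the geometry visualises its presence. A sketch (the file name is a placeholder):

```swift
import ARKit
import SceneKit

// Sketch: a sphere marks "the presence of sound" while SceneKit's
// positional audio makes the loop appear to come from that point.
func addSoundMarker(to sceneView: ARSCNView, at position: SCNVector3) {
    guard let source = SCNAudioSource(fileNamed: "loop.wav") else { return }
    source.isPositional = true     // volume falls off with distance
    source.loops = true
    source.load()

    let sphere = SCNSphere(radius: 0.1)
    let node = SCNNode(geometry: sphere)
    node.position = position
    node.addAudioPlayer(SCNAudioPlayer(source: source))
    sceneView.scene.rootNode.addChildNode(node)
}
```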
Out of the box: using ARKit without camera feedback. Conclusion: the Google SDK does a better job for this.
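For the camera-less experiment, the session can run without rendering the feed at all: each tracked frame still reports the device's position, which can drive a sound parameter. A sketch of the idea (`mapVolume` is a hypothetical helper, not from the original experiment):

```swift
import ARKit

// Sketch: run an ARSession without showing the camera feed and use
// the device's tracked position to control a sound parameter.
class MotionSoundController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Camera translation in meters from the session origin.
        let t = frame.camera.transform.columns.3
        let distanceFromOrigin = sqrt(t.x * t.x + t.y * t.y + t.z * t.z)
        mapVolume(distanceFromOrigin)
    }

    func mapVolume(_ distance: Float) {
        // Placeholder: e.g. scale an AVAudioPlayer's volume by distance.
    }
}
```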