It’s hard to know for certain if the hype surrounding augmented reality is warranted yet. But I’m banking on it picking up some real steam this year!
The Current State of Mobile AR
Augmented reality has been hanging out on the Gartner Hype Cycle for quite some time.
With LAYAR first making a splash way back in 2009, I remember being blown away by it on my iPhone 3GS in 2010. But that excitement wore off as the experience fell short: the overlays were wildly inaccurate, and you had to use your imagination to fill in the gaps.
The only augmented reality products that are spoken of seriously are steeped in mystery and start-up hype (think Magic Leap).
But after downloading and installing the latest version of Apple’s iOS operating system last week, I’ve renewed my hope and enthusiasm for what AR is about to be able to offer consumers and marketers alike in the coming year.
The History of AR on Mobile Phones
Ten years ago, the idea of mobile AR was sound: augmenting reality would allow a user to hold a phone up to the world and see a layer of useful data on top of it. The phone's positional awareness and data connection meant you could overlay whatever data set you chose to enhance your understanding of the world around you. Its promise was a dynamic situational awareness that tuned itself to the user's needs at any time and place.
But it didn’t exactly happen that way.
The problem wasn't the data set. It was the hardware. The iPhone 3GS's 3.2-megapixel camera wasn't very good, and the GPS sensor and compass weren't that accurate (phones didn't even give turn-by-turn driving directions back then). Filtering the data added another cumbersome complexity: having access to every piece of information is actually crippling. We needed smart curation of the right content, at the right moment.
Fast forward a bit from that old not-so-augmented reality to 2013, when Apple acquired PrimeSense, the company that built the technology powering the Xbox Kinect gaming peripheral. For me, that's where things started to get interesting. Apple followed in 2015 with the acquisition of an Israeli startup called LinX. These and other strategic acquisitions pointed toward Apple's vision of a future where phones were much more aware of their surroundings.
The Kinect was marketed as a device for controlling video games with your body, and it largely failed at that. In fact, the Kinect was so widely dismissed by the public as an awkward gimmick that it was removed from Microsoft's flagship Xbox One console shortly after its initial release. But what it did do was give computers and developers a remarkably data-rich and detailed picture of the human world. Computers were suddenly able to see and hear the world in a way they hadn't been able to before, and developers created all sorts of novel experimental art installations and interactive marketing projects. The Klick Labs team even used a Kinect unit to power an interactive model of a 3D beating heart linked to a live heart monitor.
So now, in 2017, we're seeing a collision between the explosion in mobile hardware power and the maturation of environment-sensing computer vision software.
That's where Apple's ARKit comes into play. While the hardware that went into the Kinect has been remarkably shrunk down into the front-facing notch of the iPhone X, it's the software improvements made to the rear-facing camera on all iOS devices that make it so exciting.
Anyone who’s installed iOS 11 onto their 2015 or newer iDevice this past week may have had the joy of experimenting with an ARKit enabled app, such as IKEA’s Place app. Gone are the shaky faux AR overlays of the past (fans of Pokémon GO know what I’m talking about!). Enter the augmented reality revolution!
When I saw ARKit teased at WWDC this past June, my eyes and heart lit up with excitement. The demo shown on stage was nothing short of mind-blowing. To the average viewer it wasn't much different from Pokémon GO (some animated virtual objects appeared overlaid on top of the real world), but to me it was pure magic.
See, when you move your phone in Pokémon GO, the animated character moves WITH YOU. It's not actually bound to any real-world object, surface, or geometry. It's simply a positionally static overlay, locked to the phone by an invisible digital tether.

With ARKit, Apple has managed to program a positional and spatial awareness (they call it scene processing) into their phones like we've never seen before. This kind of true digital overlay onto real-world environments had previously been the purview of cutting-edge, expensive, and not very accessible technologies like the Microsoft HoloLens, or the aforementioned Magic Leap: large headgear, steeped in mystery, with multi-million-dollar seed funding rounds and almost no installed user base.

To me, ARKit feels like a revolutionary wave because of one word: access. 90% of iOS devices are going to learn this new magic trick in the coming weeks as the iOS 11 install base grows and grows.
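To give a sense of how little code that scene processing asks of developers, here's a minimal sketch of ARKit's horizontal plane detection using SceneKit. The class and node names are my own illustration, not Apple's demo code; it simply runs a world-tracking session and anchors a small virtual box to whatever floor or tabletop ARKit finds.

```swift
import ARKit
import SceneKit

class ARDemoViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Ask ARKit for full world tracking plus horizontal plane detection.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // ARKit calls this when its scene processing detects a real-world surface.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        // Geometry attached here stays locked to the detected floor or
        // tabletop as the user walks around it -- no printed marker required.
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        box.position = SCNVector3(planeAnchor.center.x, 0.05, planeAnchor.center.z)
        node.addChildNode(box)
    }
}
```

Unlike the tethered overlays of old, the box above is pinned to a world-space anchor, so it holds its position as the camera moves — which is exactly the difference the Pokémon GO comparison illustrates.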
Augmented Reality Isn't Foreign to Klick
Klick Labs has already deployed some pretty cutting-edge augmented reality experiences on iPhone and Android devices: rich, immersive AR experiences that link to specific optical glyphs. In the case of this demo, we use a magazine cover as the trackable object, and the effect is pretty breathtaking! But it is admittedly short-lived, and limited in scale entirely by the presence of the magazine cover (our Labs team developed this long before ARKit was even a breath in the wind!).
Most AR worked this way before: the physical world had to have an object placed in it before the AR effect could be overlaid. The Nintendo 3DS and its dual outward-facing cameras used special printed AR Cards to pull off its AR tricks.
The Klick Labs experiment and the Nintendo 3DS experiences were both true augmented reality, but the reality they lived in was limited.
ARKit, by contrast, promises to augment every reality, and does it all from within your phone. ARKit is the name Apple has given to its system-wide API and tools for developing augmented reality content. Some folks have already gotten their hands on the beta and created some pretty awe-inspiring content. From virtual pets, to incredible 3D data visualizations, to portals to different worlds, ARKit has inspired a new wave of interaction designers to create immersive and engaging content.
And it gives me tremendous hope for turning this technology into something extremely useful for people and patients alike. Imagine a highly precise turn-by-turn walking directions app for navigating a complicated hospital, or accessibility options for people with disabilities. When your phone can see the world with such awareness and precision, your ability to interact with the world in new and novel ways expands.
Much like having cameras in our pockets 24/7 shook up the photography industry, having a digital data layer over top of our world will change how we interact with it. Take this early ARKit-enabled example from IKEA, which uses advanced room tracking to place virtual instances of IKEA furniture into a room effortlessly. Objects appear to really be sitting on the floor, and you can walk around them while they stay magically in place, all without the need for any identifying markers on the floor for the phone to track.
Now that AR is really here, what’s next?
Over the past decade, augmented reality has struggled to gain traction for the same reason QR codes have: a lack of system-level integration made the friction of adopting them too high. But now, with AR built directly into both the iOS and Android operating systems (Google announced ARCore as its me-too solution shortly after Apple's announcement), developers will be able to easily integrate this enhanced data layer into their projects.
Think: a hearing aid app that syncs with your hearing aid just by pointing your phone's camera at it. Or an instructional guide for a medical device that layers highly detailed, realistic animations of how the device is to be used over the device itself. Or a how-to experience for administering an injection that plays out over a view of your real skin. The possibilities for patient instruction are endless! And for marketing: imagine making information come to life over top of your product so it leaps off the pharmacy shelf. Or virtually changing the language on your packaging for localization and accessibility. Or a hospital giving people step-by-step, turn-by-turn directions through a complicated building to get them to their scheduled appointments on time. Or customized signage tailored directly to a patient as they walk through a clinic.
Multi-camera, world-sensing super pocket computers are here now, and they're more than just a trend. Look for augmented reality to be sitting much higher on that Gartner Hype Cycle curve this time next year.
The talented developers, behavioural scientists, experience designers and medical visualization specialists at Klick have already mastered these technologies, and are ready to demystify them and put them to work for you.
Now if you'll excuse me, I'm off to convince my wife that we need to update the furniture in our living room by floating magic virtual versions over top of our old ones.