An Oldie but Goodie, Now with Augmented Reality
Eight years ago I prepared a multi-channel cross-media demo and presentation that still has a lot of relevance today. It turns out that Augmented Reality (AR) on mobile devices is a new and enticing way to present and experience that same content!
Way back in 2010 (geez, time flies), while an original partner at Trekk, I was invited to present at a preeminent XML conference. This coincided with our exploration of eBooks as a distribution channel for cross-media communications.
The presentation I prepared was Using XML to Create Once — Distribute Everywhere, delivered at the IDEAlliance XML 2010 eMedia Revolutions conference in Philadelphia. Why is this relevant? Several reasons…
- The subtitle, Structured content creation and the future of cross-media publishing, extols many of the architecture principles I still promote today
- The technology demonstration I presented is an eBook about cross-media content workflows, including eBooks
- These same workflow principles also apply to newer channels such as Augmented Reality
In this preso I showed an eBook produced to exemplify AND embody the process. The process separates the structure, style, and behavior of an application into distinct HTML / CSS / JS files.
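A minimal sketch of that separation (the file names here are illustrative, not from the original demo): the HTML carries only structure, while channel-appropriate style and behavior are linked in from separate files that can be swapped per channel:

```html
<!DOCTYPE html>
<html>
<head>
  <title>Cross-Media Demo</title>
  <!-- Style lives in its own file; swap it per channel (screen, print, eReader) -->
  <link rel="stylesheet" href="ebook.css">
  <!-- Behavior lives in its own file; omit it for channels without scripting -->
  <script src="ebook.js" defer></script>
</head>
<body>
  <article>
    <h1>Using XML to Create Once — Distribute Everywhere</h1>
    <p>Structured content only; no inline styles or event handlers.</p>
  </article>
</body>
</html>
```

Because nothing channel-specific is baked into the markup, the same HTML can be restyled for each output target without touching the content.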
Asset file contents or URL references are the payload inside RSS containers. The RSS/XML container obscures, or abstracts, the content payload. This is analogous to the standardized shipping containers that revolutionized international trade.
Shipping companies ‘don’t care’ what is inside; their only concern is efficient transport. The shipping customers care deeply about what is in the container, but once it leaves the loading dock, they ‘don’t care’ about the details of transportation until it arrives at the destination loading dock.
Similarly, a tech stack playing the part of an international shipping provider has little concern about the application data in the payload during network transport. The User Experience app receiving the cargo, on the other hand, cares deeply about the HTML / CSS / JS inside.
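As a concrete sketch of the container idea (URLs and payload content here are hypothetical), an RSS 2.0 item can carry an asset either by URL reference via an enclosure or inline as an opaque CDATA block; the transport layer never needs to inspect either:

```xml
<rss version="2.0">
  <channel>
    <title>Cross-Media Asset Feed</title>
    <link>https://example.com/feed</link>
    <description>Assets packaged for any downstream channel</description>
    <item>
      <title>Chapter 1: Structured Content</title>
      <!-- URL reference to the payload; the carrier only moves the container -->
      <enclosure url="https://example.com/assets/chapter1.html"
                 type="text/html" length="12345"/>
      <!-- Or the payload inline, opaque to the carrier -->
      <description><![CDATA[<article><h1>Structured Content</h1></article>]]></description>
    </item>
  </channel>
</rss>
```

The receiving app unpacks the ‘container’ and renders the HTML / CSS / JS cargo however its channel requires.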
I am not the only one who has used this type of abstraction and standardized transport. There have been several workflow orchestration platforms that use RSS as a standardized transport. In fact, Google Apps used RSS, and a variation called Atom, for this purpose.
The payload and transport of content via standards is the topic of the eBook called Trekk Cross-Media Series: Using XML to Create Once — Distribute Everywhere.
My framework allows content to be defined as structured HTML. The UI style and behavior are defined by separate CSS / JS files appropriate for different channels such as web browsers, mobile devices, PDF documents, and eBook formats (MOBI for Kindle and ePub for iBooks).
To download and view these versions use the following links. You may need some additional configuration for these to open in specific eBook apps depending on your mobile device and OS version.
A web search will also turn up links to the Google Books version: https://books.google.com/books?id=iAXRKFS6O2oC
The RSS version is a definition for an iTunes podcast that can be used to access the audio recording of, guess who? Me!
Each of the eBook pages includes a link to the audio file assets. For instance, the Title page includes the link Audio
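A hedged sketch of what such a podcast feed item might look like (feed URLs and file details are illustrative); the `itunes:` tags come from Apple's podcast RSS extension namespace, and the enclosure is what points a podcast app at each page's audio asset:

```xml
<rss version="2.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
  <channel>
    <title>Trekk Cross-Media Series (Audio)</title>
    <link>https://example.com/podcast</link>
    <description>Audio narration of the eBook pages</description>
    <item>
      <title>Title Page</title>
      <!-- RSS 2.0 enclosures require url, type, and length (bytes) -->
      <enclosure url="https://example.com/audio/title.mp3"
                 type="audio/mpeg" length="2048000"/>
      <itunes:duration>1:30</itunes:duration>
    </item>
  </channel>
</rss>
```

The same container principle applies: the podcast feed is just another standardized wrapper around the audio payload.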
The interactive content produced in 2010 can now be repurposed for a new experience that is rapidly becoming a major interactive channel. Designers are just beginning to envision how VR and AR can deliver not only interactive, but deeply immersive user experiences with multimedia content assets.
To experience a simple AR version of my eBook content, install the new, incredibly cool, and innovative RealityBLU platform app from the Apple App Store or Google Play for your mobile device.
Using the platform I have configured images and audio content from pages of the eBook as experiences that will play when a specific marker is scanned. A marker is any unique image with sufficient contrast that can be recognized by the platform and associated with a unique experience/scene.
Once you install and run the app, you will see the splash screen, where you can select SCAN:
You can print out the following marker image to scan with your device camera, or scan the image on this screen directly:
When the marker is scanned and recognized, the AR experience plays. In this case it is simply the cover page image floating in your viewer while the title audio file plays automatically.
The experience will continue as an augmentation of the marker image as long as your camera keeps the marker in the viewing area. You can also tap the lock icon at the bottom of the app to keep the AR scene on your device even if you move it away from the marker.
Now how cool is that?! I know that just one page with static image content is underwhelming. I am in the process of experimenting to find ways to allow the entire book to be easily consumed. This simple demo does show how a new and easier eBook experience can be developed and deployed.
Until now, AR experiences have normally required custom development of content AND players. There is certainly still a need for custom app development, but the unique approach of RealityBLU enables AR to easily and affordably become part of the multichannel campaigns familiar to most of you.
The platform currently supports not only images and audio files but also video files, interactive buttons, browser hyperlinks, as well as static and animated 3D models. Think of web content but not constrained to browsers.
Imagine: anything you can scan can trigger an AR experience. What can you imagine? Let me know, let’s talk!