At SMK, we’d really like to give guests easy, flexible access to information on the collection. Just as much, we’d like to avoid the distractions of QR codes and dubious Bluetooth connections.
With this in mind, we started collaborating with a group of Danish developers working on an app they call Vizgu (a blend of ‘visitor’ and ‘guide’; we’re not entirely sure about the z). Vizgu lets visitors point their smartphone camera at an SMK painting and thus call up metadata, descriptive text, and audio where available.
This image recognition is made possible by storing all our photos of art in the Vizgu database. When the guest takes a picture of a piece, the app compares this photo to everything in the database and (almost always) finds the correct match in a matter of seconds.
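The core idea — reduce each image to a compact fingerprint and find the closest one in a database — can be sketched with a simple average hash. This is only an illustrative toy, not Vizgu’s actual method (production systems use far more robust local-feature techniques), and the thumbnail data below is entirely made up; it assumes images have already been shrunk to small grayscale grids.

```python
# Toy image matching via average hash: each bit records whether a pixel
# is brighter than the image's mean, and the closest hash wins.

def average_hash(pixels):
    """Turn a grayscale grid (rows of 0-255 values) into a bit string."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return ''.join('1' if p > mean else '0' for p in flat)

def hamming(a, b):
    """Count differing bits between two equal-length hashes."""
    return sum(x != y for x, y in zip(a, b))

def best_match(query, database):
    """Return the database key whose hash is closest to the query's."""
    q = average_hash(query)
    return min(database, key=lambda name: hamming(q, average_hash(database[name])))

# Hypothetical 4x4 thumbnails standing in for the museum's photos.
db = {
    'painting_a': [[10, 200, 10, 200]] * 4,
    'painting_b': [[200, 200, 10, 10]] * 4,
}
snapshot = [[12, 190, 15, 205]] * 4   # a slightly noisy photo of painting_a
print(best_match(snapshot, db))       # → painting_a
```

The hash is deliberately insensitive to lighting and small amounts of noise, which is roughly why a hand-held guest photo can still find its match.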
There are a few reasons why we — in all modesty — think this is a Highly Appropriate Solution(tm).
Puts all processing where it belongs
Giving guests easy access to information is a complex task, i.e. one whose complexity can be delegated to different stages in the process.
A code of some kind next to the artwork puts the complexity on the guest, who has to (for instance) activate a QR reader (difficulty level: 8 out of 10, particularly for iPhone users) or type a URL into the phone’s browser (difficulty level: 5, doable but not pleasant).
Bluetooth transmitters like iBeacons are a way of relieving the guest of the hard work. They let the room do the processing, if you will. Even assuming that the calibration works well, this solution involves telling the guest about Bluetooth in some form (you, clever reader, surely know all about Bluetooth, but think about the rest of your family). It also involves a somewhat complicated hardware setup that needs to be maintained by museum staff (also on weekends, especially on weekends) and upgraded from time to time = $$$.
The Vizgu solution, on the other hand, leaves the heavy lifting to a high-powered computer (i.e. your phone) and you don’t have to know anything about Bluetooth, browsers, or databases to use it.
Uses well-known interaction paradigm
… which is just to say that telling guests to simply use their camera is UX heaven. Almost everybody knows how to point and shoot. You have to find and install the app (a barrier in itself, to be sure), but after that things are as simple as they can probably be.
Uses a non-rival form of interaction
Using Vizgu does not compromise any other type of art appreciation. No mounted screens, no (potentially) ugly codes or beacons — in short, nothing to detract from traditional ways of using the museum. Best of both worlds.
Rhymes well with our strategy
We are museum people with very limited developer resources. Building, maintaining and constantly updating even one app (let alone one for several platforms) would instantly drain our resources. It would not be a good use of our time.
But more importantly, our strategy (most obvious in the SMK Open project) is to spend our time making SMK work as a platform. This involves making all of our data/content as flexibly available as possible — for others to use, share, and build upon. We do not want to be app developers, we want to be providers of data, content, and knowledge. We want to enable anyone with a good idea to use any portion of our data to enrich their website, app, or service.
In this way, we arguably outsource the development and maintenance of SMK-related services, freeing us to work on the quality of the data itself. Sure, this also means outsourcing the user experience and a loss of control over context and presentation. But we still have full control over what to recommend to our guests, and we happily observe that external developers tend to welcome our suggestions.
This has certainly been the case with Vizgu, whom we’ve been more than thrilled to work with. We look forward to formally launching the app in September 2017, and you, dear reader, are more than welcome to join us (link to Facebook).