An Agency perspective on Learning Technology
Am I a Learning Technology insider, or an outsider? I feel like I’m probably somewhere in between, and actually I like it that way.
I run a small but well established digital design agency called Tui Media. Although we do all sorts of things outside the circles of learning technology, it’s been quite prominent in our work for something like eight years now. Because of that, I feel like I can offer an outsider’s view on the field of learning technology, but still have enough knowledge and experience to contribute something worthwhile.
To me it’s a fascinating part of our business. At the heart of it is a fiendishly complex conundrum: What is the most effective way of communicating knowledge and culture to a range of people? Should it be “pulled” by users, or “pushed” to them? Is it best done in groups or individually? Should it be through affective experiences, or does it matter at all whether information is remembered if it’s always at hand for reference? Is it really learning at all?
I’m incredibly lucky to have worked with, and to be working with, some significant pioneers in this field. I don’t claim to hold any expertise in the science of learning, but years of exposure to, and thinking about, learning in the modern workplace have left their mark. In many respects, “learning tech” is really just “tech”, and actually I think it’d be beneficial for us to think of it in just that way. Here are some other things I’ve learned myself, along the way.
“Formal” vs “Informal” learning means nothing to end users
I’ve seen a lot of this strange splitting of formal and informal learning into silos within organisations. They don’t speak to each other. They don’t work together. They consider the two disciplines to be entirely separate.
What we see in user testing, however, is that end-users — learners — see no difference between the two at all, and can’t even adequately explain what the difference might be. They want (or have been told) to learn or know a thing. How that’s done is of no consequence to them, and they’re at best bemused, and at worst furious about the disconnect in their learning experiences.
Is it consumer-grade?
It’s my experience that users have very low expectations for corporate tech. Expectations that have formed from years, or decades of struggling with slow, old, ugly and Byzantine technology, dropped on them from on high, with smug recognition that, well, these people have to use this system. It’s their job.
There’s usually a vast chasm of difference between the experience of tech used in the workplace, and the experience that people have of the technology that they use at home, or that sits in their pocket. That technology has been forged in the fires of a competitive consumer marketplace, tested and improved, ruthlessly optimised and simplified, so that it’s chosen by more people with any choice in the matter.
And, although there’s some safety in the knowledge that there’s really no choice, this safety is, I think, diminishing. People really care about their professional development, and we see organisations trying to encourage their staff to be more proactive in their own learning. Learning technology with low-quality experiences sends entirely the wrong message to colleagues, and turns positive sentiment into frustration.
Naturally, the experiences you get from Instagram, Uber and Netflix, or even a consumer learning product like Duolingo, are no accident, and the technology is built by large teams of ultra-bright, well-motivated designers and developers, with healthy budgets, and user-tested over many iterations. It’s obvious that this luxury is not generally available to corporates for their internal learning tools, but great strides can still be made to close that yawning gap in experience. I’d recommend at least doing these things:
Focus on performance: Your product needs to be fast. More than a second for search results to appear just seems antediluvian to anyone who has used Google Search (which — let’s face it — is quite a lot of people).
Reduce barriers: No-one wants to have to jump through hoops to get to the information they need. Reduce the number of clicks/taps; remember what the user has done before, and act on it; fight the good fight against the need for a login at the start of every session. Simplify, simplify, simplify.
Integrate: Make sure your technology plays nicely with other technology in the domain. Users hate being bounced around from place to place, having to re-authenticate, and answer the same questions. Spend time and effort making integrations with other learning products/providers as seamless as possible.
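As one concrete check for the performance point above, here’s a minimal sketch that times a search call and flags it against the one-second budget. The `search_fn` passed in is a stand-in for whatever search your product actually uses; the budget value is an assumption based on the guideline above, not a standard.

```python
import time

ONE_SECOND_BUDGET = 1.0  # seconds; the "feels instant" threshold discussed above

def timed_search(search_fn, query):
    """Run a search and report whether it met the latency budget.

    Returns (results, elapsed_seconds, within_budget).
    """
    start = time.perf_counter()
    results = search_fn(query)
    elapsed = time.perf_counter() - start
    return results, elapsed, elapsed <= ONE_SECOND_BUDGET

# Stand-in search function, purely for illustration.
results, elapsed, ok = timed_search(lambda q: [q + " result"], "onboarding")
print(ok)  # → True
```

Wiring something like this into your monitoring, rather than checking latency once at launch, is what keeps the product fast as content grows.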
It’s the content, stupid
Without laying down too much of a challenge to Marshall McLuhan, it’s not always the case that the medium is the message. Many times, and especially in L&D, the message is the message. One of the common missteps we’ve seen is the idea that by adding a whizzy digital resource centre, backed by an intuitive Content Management System, all the cool content will just be poured in. Of course, the technology is only one part of the system, and the content is too often overlooked, or under-cooked.
Working with the amazing Regan Shercliffe and Shelley Easton at the World Food Programme, we’ve been making an app to help with staff wellbeing and resilience, a subject matter that’s fraught with personal and cultural tension. We’ve spent a long time working closely with them to shape, hone and test what’s been written, how it’s been written, and the way it’s being delivered. One of the wonderful effects of this is that the app we’re making feels so much more polished and self-contained: Building the technology alongside and in support of the content plan makes for a better product.
It’s really important to test that the content is what users are looking for, too. You can do this by observing users, and a nice trick we use is to look at those search phrases that are entered with no results being returned.
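As an illustration of that trick, here’s a minimal sketch of mining a search log for dead-end queries. The log format (pairs of query and result count) and the field names are hypothetical; in practice you’d parse whatever your analytics export gives you.

```python
from collections import Counter

def zero_result_queries(search_log):
    """Count search phrases that returned no results.

    `search_log` is assumed to be an iterable of (query, result_count)
    pairs. Queries are normalised so "Expenses Policy" and
    "expenses policy" count as the same content gap.
    """
    counts = Counter(
        query.strip().lower()
        for query, result_count in search_log
        if result_count == 0
    )
    # Most frequent dead-end searches first: these are your content gaps.
    return counts.most_common()

log = [
    ("onboarding checklist", 12),
    ("expenses policy", 0),
    ("Expenses Policy ", 0),
    ("gdpr training", 3),
]
print(zero_result_queries(log))  # → [('expenses policy', 2)]
```

The top of that list is effectively a prioritised backlog of content that users want but can’t find.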
Don’t ignore the content, or expect that by having a CMS, the content will just get done OK. And develop a proper content plan.
Build it and they won’t come
It’s a naïve, forlorn hope that by making a cool, useful thing, people will flock to it and start using it. Believe me, in my time, I’ve built it, and they didn’t come!
Outside of checking with real users, early and often, about whether the thing you’re making is a good idea at all, I think there are three main ways to mitigate this.
Firstly, it’s better to go to where the users are. Visit them in their environment rather than making them come to you. This idea of learning in the flow of work is a hot topic in learning circles at the moment, and we’re currently working with a number of organisations building chat-bot interfaces to allow people to talk about, search for, schedule and rate their learning with a bot that sits in their enterprise collaboration software.
Secondly, design a service, not a product. Making the thing — whatever it is — is only part of the puzzle. We need to think about how that thing is communicated; how it’s discovered; its tone-of-voice; what it’s like on the first run; how it treats different user types; how it remembers returning users; how it interacts with other technologies; how it can be shared; how users can feed back to the makers about it, and so on. Decent service design will make an enormous difference to the take-up and ultimate success of a product.
And finally, give the product room to improve. Products don’t appear fully formed, and they need to be given room to be tested by users, and the results of the testing turned into changes that are then made. Work in an agile, iterative way. It’s entirely false that this is more expensive than spending a long time meticulously defining a product, and building it to those specifications, so that it’s perfect from the get-go. With an iterative approach you can start small, using only some of your budget for the first release, and then you can listen to your users to make informed decisions about where to improve and what to do next.
Are you building for learning or performance support?
There’s been plenty of debate about the extent to which we should focus on teaching people how to do things, or on providing them with tools and resources to Get Stuff Done™. Much of the time, I lean toward the latter. This is perhaps due to my working with Nick Shackleton-Jones for several years, though it seems inherently right to me, in most situations. Of course there are plenty of reasons why “proper” training is important. You wouldn’t want a pilot thumbing through the manual of a passenger aeroplane looking for “just in time” support, and there are sometimes compliance and regulatory reasons for a different approach.
Whichever route you take, the technology design will likely need to flex to bear this in mind. For performance support, an excellent search and a good, clear taxonomy are fundamental. The writing can be crisper and shorter too. For tech that focuses on the ‘learning’ side of things, where you’re looking to change behaviour long-term, the emphasis is different. Giving some structure and guidance, providing for a test of learning (a quiz, for example), and repetition all become significantly more important.
It’s important here to remember that how you measure the success of your system or tool also changes based on the fundamental requirements. Something that offers performance support should not be measured by the traditional web metrics of “stickiness” or even return visits. The goal of a resource-hub should be to provide the user with the information they need in as short a time and with as little hassle as possible. The metrics for the learning or behaviour-change side of things are quite different of course.
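To make that concrete, here’s a sketch of one possible performance-support metric: the median time from session start to the first resource opened. The event names and session shape here are assumptions for illustration, not a real analytics API.

```python
import statistics

def median_time_to_result(sessions):
    """Median seconds from session start until the user first opens
    a resource. Each session is assumed to be a time-ordered list of
    (timestamp_seconds, event_name) tuples.

    Lower is better: a resource hub succeeds by getting out of the way.
    """
    durations = []
    for events in sessions:
        start = events[0][0]
        for ts, event in events:
            if event == "resource_opened":
                durations.append(ts - start)
                break
    return statistics.median(durations) if durations else None

sessions = [
    [(0, "session_start"), (4, "search"), (9, "resource_opened")],
    [(0, "session_start"), (3, "resource_opened")],
    [(0, "session_start"), (20, "search"), (31, "resource_opened")],
]
print(median_time_to_result(sessions))  # → 9
```

Note how this inverts the usual web-analytics instinct: here a falling number is a win, whereas for behaviour-change tools you’d likely want repeat engagement to rise.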
Attitude is everything
If you want to use a creative digital agency to help design and build your learning and development capabilities, my hard-won, battle-hardened advice is that you’ll get the best results if you go into it with the right attitude. To my mind, this means:
Be prepared to experiment. In fact, make a firm decision to try some experiments, and get senior buy-in for this approach. The angle is that it’s highly user-focused, relatively low-cost (in comparison to a large SaaS off-the-shelf LMS) and highly targeted. It can also produce demonstrable, measurable results really, really quickly. Working with PA Consulting and the incomparable Gemma Critchley at Aviva, we put together a resource hub for Aviva in around 5 weeks at the end of 2017, which is still going strong now.
Collaborate deeply with your agency. Don’t provide a statement of requirements, then sit back and wait for the results. Ideally, get your agency in early. The earlier the better. The most collaborative of our clients — people like Michael Kibblewhite from BP, or Grant Schmidlechner from GSK — come to us with a problem to solve, or a question to answer, and work with us to develop the right tool(s) for the job. From there, it’s important to be invested in the product, and to guide its build. Gather a working group of willing end-user testers. Be involved in the decision-making throughout. Be willing to shift course if there’s good reason to. The creative process is honestly much more fun that way!
Make sure you think about the entire product lifecycle. The building of a product is only the start of its life. It will really start living once it’s in use: content gets added, changed and updated, and people start feeding back their comments about it.
When planning the commissioning of a tech product, make sure you factor in its overall lifecycle. You don’t necessarily need to know precisely what that lifecycle is, but you do at least need to know that there will be one. Easy things to do here are:
- Anticipate and plan for user-testing and the need to make changes based on the results of the testing.
- Consider a limited roll-out initially, then think about how you’ll roll out to a wider community.
- Keep a backlog, and develop a wider product roadmap.
Looking to the near future
Predicting the future of anything is a fool’s game, so I’m going to keep this brief and fairly close to the present day! I have two thoughts about how learning technology will work in the near future.
Firstly, I think the dilemma of “buy vs build” is misplaced. I strongly agree with my friend Myles Runham that the one-stop-shop is a fallacy (I would though, wouldn’t I?), and I think that the right approach is to make informed choices about when to buy, and when to build. End users are very well used to using multiple tools and techniques to help with their learning, and offering a set of tools, preferably with some nice, standardised interoperability, seems to me like the best way to avoid painting yourself into a corner.
More interestingly, it’s my job to keep an eye on how emerging technologies will impact our clients, and I think the biggest shift we’ll see in learning is the development of anticipatory systems that use information about the learner’s context to provide information. Google Now is doing this for people in general already, and the big tech firms are busy building affordable services to allow developers to use some of the very complex machine learning, natural language support and sentiment analysis that’s happening at the vanguard of digital technology. The next step in learning is for systems to be able to anticipate what people need, for example by being able to understand and contextualise a calendar, and to offer these things before the learner even knows they’ll be relevant. This’ll happen through machine-to-machine APIs, with the hard work being done by the systems, all to reduce the load on the human.
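As a toy sketch of what that anticipation could look like: matching keywords from a hypothetical calendar entry against a resource catalogue. Everything here (the event text, the catalogue, the scoring) is illustrative; a real system would lean on the machine learning and natural language services mentioned above rather than bag-of-words overlap.

```python
def suggest_resources(calendar_event, catalogue):
    """Rank catalogue entries by keyword overlap with an upcoming
    calendar event, so relevant resources can be offered before the
    learner asks. Purely illustrative scoring.

    `catalogue` is assumed to be a list of (title, keyword_set) pairs.
    """
    event_words = set(calendar_event.lower().split())
    scored = []
    for title, keywords in catalogue:
        score = len(event_words & keywords)
        if score:
            scored.append((score, title))
    # Highest-scoring suggestions first.
    return [title for score, title in sorted(scored, reverse=True)]

catalogue = [
    ("Running effective performance reviews", {"performance", "review", "feedback"}),
    ("Negotiation basics", {"negotiation", "supplier", "contract"}),
]
print(suggest_resources("Quarterly performance review with team", catalogue))
```

The interesting part isn’t the matching itself but the plumbing around it: machine-to-machine access to the calendar and the catalogue, so the suggestion arrives unprompted.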
I intend to be busy working on designing and building this next generation of learning and development tools as part of the work we do here at Tui, and I’m really looking forward to seeing what we can do for our current and future clients.