Is agile overrated?
Technology and design trends change over time, and so do the ways in which we collaborate to create them. With a lot of social media hype, “Agile” has become synonymous with “the right way”. But are other workflows really inferior? Let’s take a closer look…
“Agile methodologies divide design processes into rapid iterative cycles that make changes based on live user feedback from the previous rounds of iteration to speed up delivery and reduce costs.”
In the early days of software, development was extremely difficult because of the limited technology and libraries available at the time. On top of the high cost of development, companies also had to spend a lot of money getting their software into stores, because there were no app stores. Customers had to buy physical floppy disks or CDs (insert audible gasp from today’s youth). With no crash reports or reviews from an online marketplace, companies had to conduct expensive studies just to become aware of any bugs. And if they found some, they had to repeat the whole process and mail an update CD to everyone who had bought the product. As you can see, it was crucial to ensure that products were near perfect at launch, which meant it took a long time to produce anything.
Thankfully, this is no longer the case. Easy access to the internet has resulted in significantly lower costs of development, distribution, and evaluation. These developments have paved the way for a faster — and thus cheaper — approach to product development known as Agile. Thanks to online marketplaces and new tools, developers can now release products much earlier and get live user feedback. This means that developers only need to fix what’s broken and add what’s truly needed.
We’ve even reached the point where it’s normal to expect a major patch for a video game on its first day of launch!
At first glance, it might seem like a messy process, but in fact, it is very similar to the one we employ in User-Centred Design (UCD). While there are a few differences between them, both emphasise foundational research, user feedback, and team collaboration.
In their useful 2006 paper “Towards a framework for integrating agile development and user-centred design”, Chamberlain, Sharp, and Maiden outline five principles of a user-centred Agile approach:
- User Involvement: The user should be involved and represented in the design and development process.
- Collaboration and Culture: The designers and developers must be willing to communicate and work together extremely closely, on a day-to-day basis.
- Prototyping: The designers must be willing to “feed the developers” with prototypes and user feedback on a cycle that works for everyone involved.
- Project Lifecycle: UCD practitioners must be given ample time in order to discover the basic needs of their users before any code gets released into the shared coding environment.
- Project Management: Finally, the Agile/UCD integration must exist within a cohesive project management framework without being overly bureaucratic.
The paper makes clear that Agile methodologies don’t have to ignore foundational research and the other practices we rely on in UCD. If the team is structured correctly and is comfortable with Agile methodologies, this can be a great way to maintain a user-centred design perspective while launching faster. So, how could using Agile methodologies be a bad idea?
“Agile’s biggest threat to system quality stems from the fact that it’s a method proposed by programmers and mainly addresses the implementation side of system development.” — Jakob Nielsen
While Agile doesn’t change the design process itself, it does require different design and development strategies in order to speed things up, and those strategies can negatively impact the user experience.
One fundamental practice in the Agile approach is live prototyping: launching soon and making changes live. Live prototyping aims to help companies learn from the real failures of real users in the real world. This gambit usually pays off in low-criticality industries like video games and e-commerce, where the benefits of live user testing far outweigh the risks. But there are many areas of application that can’t tolerate taking major risks. Let’s look at a few examples.
- Finance and healthcare industries often have critical systems with high stakes. Failure in these areas of application could mean that someone loses their savings or even their life. In these situations, the risks far outweigh the initial time and resources required to launch a more stable and user-friendly product.
- Agile practices like live prototyping also rely on technology that’s easy to update. With internet-connected devices this process is often quite simple, but many situations are more challenging. Many types of interactive technology, such as kiosks and modern cars, need to be manually updated by technicians, and a lot of people don’t have access to cheap high-speed internet. Finding out that you need to download a 25 GB update before you can play your expensive new video game doesn’t make for a great user experience.
- The last point I’d like to address is designing for emerging technologies. Although technologies such as AR/VR and smart devices are becoming more accessible, we still don’t have a large collection of libraries and tools for rapid and cheap development. Many of these technologies also call for expensive physical devices that cannot be updated easily to fix bugs. This means it may be wiser to spend extra time on the development phase in these areas of application.
The user’s learning curve can also become an obstacle. Users can draw on plenty of previous experience with conventional websites and apps, but most people have never used emerging technologies like AR and VR. In other words, it may take a great deal of research and evaluation before it is even possible to launch a “basic” interface.
There are more things to consider before deciding on an Agile approach, including which of the different Agile methodologies, such as Scrum or XP, to choose. The point is that adopting an Agile methodology should be a calculated decision. If Agile is truly going to add value to a project, we need to evaluate the risks and special needs of each case individually.
There are certainly many benefits to using an Agile approach when it’s appropriate, but other workflows may well become popular again as we transition into new ways of interacting with technology.
Follow me on LinkedIn for more design and tech-related content, or learn more about me through the links in my bio.