Anushikha’s Design Manifesto

Anushikha Sharma
10 min read · Dec 13, 2017


My biggest takeaways from our Human-Computer Interaction course

Introduction

In many of our computer science courses, we learn about CS concepts through toy problems. The requirements for these problems tend to be highly specific, and very few projects involve creating something that actual people could use in the future. It's always about the technology and rarely about the people.

This Human-Computer Interaction course, with its focus on user-centered design, changes the way we view technology-driven solutions by placing humans at the core of the design and development. Here, it’s a little about the technology, but it’s all about the people.

In this course, over the span of a semester, we explored many different projects with many different teams, and each team had a different process for tackling the core challenge of its sprint. Based on my experiences, I've crafted a thesis for my manifesto: "A good design must involve an iterative approach and storytelling, and should lead to a product that fits the needs and capabilities of the target audience."

Keeping this thesis in mind, I've identified five aspects that define my design process and that would be the driving factors if I were to build a project from scratch:

1. Identification: Who are we solving this problem for?

At the start of the semester, we read Chapter 6 of Don Norman's book, 'Design Thinking', where he talks about solving the correct problem. As the semester has gone by, I've noticed that this idea of solving the correct problem has resurfaced frequently in my thought process during our design sprints.

In 'How to Understand Problems', Andy Ko states that "problems are inherently tied to specific groups of people that wish their situation was different. Therefore, you can't define a problem without defining who it's a problem for." Problem identification is often the first step of any design process; the way we frame problems embodies a set of assumptions about their cause and effect, and those assumptions eventually translate into our solutions. Thus, if we incorrectly identify our target users, we capture the wrong story for our design process and we build the wrong product.

Figure 1: Identifying the target user

In ‘Why You Should be Prototyping’, Rachel Binx suggests that a good way to avoid this is by “moving away from letting requirements define a project, and instead turning to user research.” According to her, instead of simply asking users for feedback on how to improve their experience, we must observe existing processes, identify points of frustration, and then brainstorm how to make the process better.

Though most of our projects already had target user groups, we used Rachel Binx's philosophy to decide how we would determine their needs. For our first team sprint, 'Design for Others', we knew we were designing for religious groups, but after our user research, we determined that we would specifically be designing for Bucknell's religious communities and organizations. In our sprint on 'Design for Well-being', our user research helped us determine that we wanted to enhance the experience of users listening to music stations where they couldn't create their own playlists.

2. Brainstorm freely: Crazy ideas lead to good solutions!

For this step, I refer back to Andy Ko and his piece ‘How to Be Creative’, where he talks about the importance of generating lots of ideas, and externalizing those ideas into words, images and prototypes, because that makes it easier to translate even the craziest of thoughts into viable solutions.

Figure 2: Brainstorming sheets from our ‘Design for Understanding’ sprint

Several times I've seen students resist this form of brainstorming because of the 'build and go' mindset that we've developed in courses throughout our Bucknell careers. However, the 'Design for Well-being' sprint, where my team built a Pandora prototype that used Affectiva, was one of my favorite experiences because our team began by putting the most ridiculous ideas on the board. Once we had this pool of absurd or extravagant ideas, we tried to visualize plausible ways in which we could make them happen.

For example, Stefano said “Play ‘The Boys are Back in Town’ till the person smiles”, and that eventually led to the idea of cheering up a stressed-out user by repeatedly playing their favorite songs.

Figure 3: Putting all the crazy ideas down for ‘Design for Well-being’

Another favorite aspect of this project was the user research. When asked, people would often say that they don't show many obvious visual signs of whether they like a playlist or not. However, when we observed students who were studying while listening to music, we realized that if a song was enjoyable, they tended to move their heads around and sing along to the music. This eventually translated into us using Affectiva to detect whether a person was singing along, which we treated as a signal of the user's enjoyment.
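To make that idea concrete, here is a minimal sketch of how expression scores could be turned into an enjoyment signal. It is not our actual implementation: detect_expressions is a hypothetical stand-in for the Affectiva SDK call, and the threshold is an illustrative value.

```python
def detect_expressions(frame):
    """Hypothetical stand-in for an Affectiva expression-detection call.
    Assumed to return expression scores in the range [0, 100]."""
    return {"mouthOpen": 0.0}  # placeholder value for illustration

def enjoyment_score(frames, mouth_open_threshold=30.0):
    """Fraction of sampled webcam frames in which the listener appears to be singing along."""
    if not frames:
        return 0.0
    singing = sum(
        1 for frame in frames
        if detect_expressions(frame).get("mouthOpen", 0.0) > mouth_open_threshold
    )
    return singing / len(frames)
```

If a score like this stayed low for long enough, the player could fall back to a song the user had previously enjoyed, which is roughly the behavior Stefano's 'play it until they smile' idea evolved into.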

3. Choosing a narrative: What’s the story behind the user-experience?

We've learned about writing 'user stories' through CSCI 205, Senior Design, and other software engineering experiences we might've had. User stories describe 'the type of user, what they want and why'. There are many different kinds of user experiences: some are based on how users interact with specific features of a product, and some on how they interact with the product as a whole. I believe that as CS students we are used to capturing experiences in user stories for small features, but I've rarely had the chance to think about the larger narrative of our products.

Our project on data visualizations, for the sprint on ‘Design for Understanding’, demonstrated the importance of building a narrative for the entire user experience. Our team decided to compare the consumption of coal versus the consumption of geothermal energy in the United States. I was in the pair that was designing the persuasive visualization and we had to decide what aspect of the dataset we wanted to represent that would help us persuade the users of a specific narrative. We went with a narrative that demonstrated the percentage change in the relative consumption of commercial geothermal energy vs. the percentage change in the relative consumption of commercial coal. By looking at the dataset from this angle, we could hope to give our users the illusion that geothermal was taking over coal in the energy industry.

Figure 4: Our narrative could persuade the user that geothermal was being consumed a lot more than coal
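As a quick illustration of why that framing works (with made-up numbers, not the real dataset), a percentage change on a small base can dwarf a percentage change on a large one:

```python
def pct_change(old, new):
    """Relative change from old to new, as a percentage."""
    return 100.0 * (new - old) / old

# Hypothetical consumption figures, for illustration only.
geothermal_old, geothermal_new = 2.0, 3.0
coal_old, coal_new = 500.0, 480.0

print(f"Geothermal: {pct_change(geothermal_old, geothermal_new):+.0f}%")  # +50%
print(f"Coal:       {pct_change(coal_old, coal_new):+.0f}%")              # -4%
```

Plotted side by side, +50% against -4% suggests that geothermal is overtaking coal, even though absolute coal consumption remains far larger.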

While our visualization effectively fulfilled the requirements of the task, we couldn't help but reflect on how misleading our narrative had been. This design experience really spoke to the responsibility that computer scientists have when deciding what stories and messages shape their products.

A design experience where we slightly missed the mark on capturing the whole narrative was the 'Design for Fun' project, where my team remapped the controls of the classic game Asteroids using a Leap Motion sensor. One of my teammates rebuilt Asteroids in Python, and that version of the game quickly became very fast and difficult. Though we didn't see a problem with it initially, project feedback helped us realize that the speed and complexity of the game are also part of the user experience, and not factoring that into our narrative resulted in us undermining our goals.
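For context, the remapping itself amounted to translating hand position into game actions. The sketch below is illustrative, not our actual code: read_palm is a hypothetical stand-in for a Leap Motion SDK read, and the thresholds are the kind of values we only settled on after user testing.

```python
def read_palm():
    """Hypothetical stand-in for a Leap Motion read; assumed to return
    (x, y, z) palm coordinates in millimetres, with x increasing to the right."""
    return (0.0, 200.0, 0.0)  # placeholder reading for illustration

def palm_to_controls(tilt_threshold=40.0, thrust_threshold=-60.0):
    """Translate a palm reading into game actions (rotation and thrust shown here)."""
    x, _, z = read_palm()
    return {
        "rotate_left": x < -tilt_threshold,
        "rotate_right": x > tilt_threshold,
        "thrust": z < thrust_threshold,  # hand pushed forward, toward the screen
    }
```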

4. Analyzing your limitations: Hardware, time, etc.

In “Designing Chatbots”, Yogesh Moorjani’s first tip focuses on the scope of a chatbot. Though he is specifically talking about chatbots, I believe this tip applies to many design processes. One must analyze the scope in terms of time, expertise, and limitations of resources. While we would all like to have perfect tools, an abundance of resources, and unlimited time to be our most creative selves, projects in the real world rarely have even one of these, let alone all of them.

There were several experiences where we underestimated the limitations of our tools and the learning curves that came with them. In 'Design for Others', my team created our mobile screens in Microsoft PowerPoint because none of us knew how to use Photoshop. However, designing in PowerPoint was harrowing because making a change on one screen required us to make the same change, individually, on all of them. PowerPoint was also not particularly compatible with InVision. From PowerPoint we were only able to take screenshots of the finished screens, and in our case the uploaded pictures were not the right dimensions, leaving empty white space at the bottom of each screen. Other groups that used PowerPoint enlarged their images, but were then left with grainy pictures and illegible text. While we couldn't have learned Photoshop well enough to use it effectively during our two-week sprint, we could have spent a little more time researching other tools, such as Proto.io or GIMP, that would have made the editing process simpler.

Figure 5: Uploading images from PowerPoint resulted in white space in our prototype on InVision

Having learned from some of our past experiences, in the 'Design for Another World' sprint my team and I spent a significant amount of time playing around with the available technology. We loaded several random backgrounds, models, and pieces of text into A-Frame, and the process helped us realize that some of our ideas were too complicated given the limitations of time and technology. Thus, we had to narrow down our vision before we began building, and this turned out to be a good decision, because we ran into several other unpredictable technological challenges.

Figure 6: Testing out the limitations for our technology in the ‘Design for Another World’ sprint

Another limitation here was the Google Cardboard and its need for a mobile demonstration. The museum scene we had built was too heavy for mobile; it simply needed too many assets. Therefore, we had to selectively remove objects from the scene that wouldn't affect the user experience too much. After we removed those assets, our demo's loading time and responsiveness improved significantly on mobile, which further demonstrated how hardware can limit the vision for a project.

5. User-testing and Feedback: Comparing the Before and After is important!

From Professor Peck's slides, we learned about 'formative testing', a type of iterative testing that helps identify and fix usability problems. For every design sprint we did some version of formative testing, but the execution and results differed from project to project.

Figure 7: User-testing during one of our initial sprints

User testing during the 'Design for Tension' sprint gave us some really valuable feedback. We had several users test our chatbot during and after development, and we got a variety of inputs. Some of our users noted that the bot tended to provide a lot of information that the user didn't necessarily want, and that its responses seemed 'too preachy'. Thus, we had to go through and add more questions to make the chatbot's responses more natural and personalized to a specific conversation. We also attempted to reduce the wordiness of our bot by mostly offering questions that would allow the user to speak freely and reflect deeply. Through this project we also learned that we didn't need to incorporate all of the user feedback into our implementation, because there are as many opinions as there are people. Thus, we had to focus on the trends within the feedback and make changes accordingly.

Figure 8: Adding more questions to the chatbot to allow for deeper reflection
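In that spirit, here is a tiny sketch of the kind of change we made: dole out at most one piece of information per turn and hand the conversation back with an open-ended question. The question text and structure are illustrative, not taken from our actual bot.

```python
import random

FOLLOW_UP_QUESTIONS = [
    "What about this issue matters most to you?",
    "Can you tell me about an experience that shaped that view?",
    "What, if anything, might change your mind?",
]

def respond(user_message, remaining_facts):
    """Reply with at most one short fact, then hand the turn back with a question."""
    parts = []
    if remaining_facts:
        parts.append(remaining_facts.pop(0))  # one fact at a time, not a wall of text
    parts.append(random.choice(FOLLOW_UP_QUESTIONS))
    return " ".join(parts)
```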

An experience where we didn't use user testing as much as we should have was the Pandora project. Our team observed users' reactions while they were listening to music and incorporated those observations into our final product. However, we didn't test the final product with users outside of our team, and thus, during the actual demo, there were several challenges and features that users didn't necessarily like or understand that could have been ironed out beforehand. This project lacked a before-and-after comparison of our prototype, and thus we missed out on valuable feedback that could have significantly improved the experience.

Figure 9: Wizard of Oz technique being used to determine what controls users wanted

One of the most interesting experiences with user testing and user feedback was the 'Asteroids' project. Before development, we had used the 'Wizard of Oz' technique to ask users what their preferred hand-motion controls would be for the game. After we had mapped our Leap Motion controls onto the game, we asked users to try it again for us. As mentioned in Professor Peck's slides, we took on different roles in the testing process: one of us explained the rules, one asked the user direct questions about the experience, and a third made silent observations. Astonishingly, the results were wildly different from our 'Wizard of Oz' suggestions. The users unanimously marked several motions as unintuitive and uncomfortable, and we had to change the controls significantly. Through this project, we learned to value the iterative approach, because we realized that what users think they want is often not what they actually want.

Conclusion

Every sprint was challenging in its own way: data visualizations and chatbots tested our creativity, while the Leap Motion and Google Cardboard posed hardware-related hiccups. All in all, through these various projects, I was able to form the values that define my design process. I identified the five aspects above as the core of my design manifesto, but I also had to understand that, depending on the task and its limitations, a design process may have to be tweaked to produce more effective and useful results.

This Human-Computer Interaction course has bridged a gap in my computer science education at Bucknell because I've finally learned how to put users at the center of technology-driven solutions. It has been my dream to be a technologist and social entrepreneur who can create products that serve a diverse population, and through these projects I've gained skills that will set me on the right path to pursue my goals.


Anushikha Sharma

Software engineer, travel enthusiast, intersectional feminist, and lover of cake