Case Study And Reflection: The Clouds And Twinkly Lights Project

Jasmin Li
Published in FIxD
Feb 7, 2018

Case study

The Clouds and Twinkly Lights project aims to display daily and weekly counts of how many people take the stairs and the elevator, through an ambient setting in an office building at the Open University in the UK. A more specific goal is to gently nudge people, at the point where the two routes diverge, to choose the stairs for a healthier life.

Figure 1: The Twinkly Lights design

The Twinkly Lights (Figure 1) were designed to attract people to take the stairs: footsteps trigger the lights, and the moving lights draw people towards the staircase. The Clouds (Figure 2) consist of grey and orange balls. The grey balls represent the number of people who took the lift that day, the orange balls the number who took the stairs, and both move according to the counts collected.

Figure 2: The Clouds hanging in the atrium

The History display (Figure 3) was designed to complement the Clouds display by visualising the percentages of lift takers and stair takers over time as a set of pie charts.

Figure 3: The History display shown at the entrance to the building

From the very start, a specific aim was to evaluate whether these three displays could effectively nudge people to change their behaviour. For an in situ study, the three displays were installed in the building so that evaluation data could be gathered from building users. To complement people's feedback with quantitative data, a set of sensors was placed under the carpet to count passers-by and build a representative profile for different periods of time.
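The article does not describe how the carpet-sensor readings were turned into those daily and weekly profiles; as a rough sketch, with made-up timestamps and route labels, the aggregation might look something like this:

```python
from collections import Counter
from datetime import datetime

# Hypothetical sensor log: (timestamp, route) pairs recorded whenever
# someone steps on the carpet sensors near the stairs or the lift.
events = [
    ("2018-02-05 09:12:31", "stairs"),
    ("2018-02-05 09:13:02", "lift"),
    ("2018-02-06 12:45:10", "stairs"),
]

daily = Counter()   # counts per (date, route)
weekly = Counter()  # counts per (ISO week number, route)

for stamp, route in events:
    t = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
    daily[(t.date().isoformat(), route)] += 1
    weekly[(t.isocalendar()[1], route)] += 1

print(daily)   # how many took the stairs or the lift on each day
print(weekly)  # the same counts grouped by week
```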

Besides the sensor data, a mix of standard data collection methods was considered and carried out. The first was observation, the most basic measure for evaluation. Two kinds of people were noted and recorded: those who noticed the installations, and those who still took the lift after the ambient displays were set up. Their actions speak directly to the questions raised earlier: how noticeable the installation is, and whether people's behaviour was influenced in the intended way. The second step was interviews, an efficient and helpful method for obtaining deeper feedback, which sometimes surfaces unexpected opinions as a bonus.

Qualitative methods were paired with quantitative ones. An online survey may not produce very deep or considered feedback, but it scales well enough to cover the whole target audience, especially for the question of how well the installation fits into the building, and it makes the results less dependent on chance and easier to analyse objectively. The last step was checking the logged data of stairs and elevator usage, which is the most accurate and direct way to check whether the designs work as assumed and meet expectations.
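As an illustration of that last step, the following sketch (with invented counts, since the real logs are not published) computes the stairs-versus-lift percentages that a History-style pie chart would show:

```python
# Hypothetical logged daily counts of lift and stairs usage,
# in the spirit of the History display's pie charts (made-up numbers).
log = {
    "2018-02-05": {"stairs": 64, "lift": 136},
    "2018-02-06": {"stairs": 81, "lift": 119},
}

for day, counts in sorted(log.items()):
    total = counts["stairs"] + counts["lift"]
    stairs_pct = 100 * counts["stairs"] / total
    print(f"{day}: {stairs_pct:.0f}% stairs, {100 - stairs_pct:.0f}% lift")
```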

Reflection

Figure 4: Ezi-worm installation set in the temporary community garden

Looking back at our own project, Ezi-worm, the aim is to encourage people to contribute to a community garden and to feel involved in sustainability. To evaluate whether our design truly supports this human value and meets expectations, some information needs to be observed and recorded.

1. How do people react to the installation before we introduce it to them? This question matters because it tells us whether the installation is attractive enough to be noticed once it is actually set up in the community garden. Nobody will be there to explain it, so people will mostly have to figure it out by themselves.

2. What do people think about the concept, and in particular, would they put food scraps in when they have leftovers? Attitudes towards the facility determine whether people will actually take action and contribute. If they do not see it as a convenient and helpful facility, they may not keep a long-term interest in it.

Figure 5: Display box showing information about waste capacity and container temperature

3. How many questions did they ask during the process, and what kind of questions: about the technology or about the concept? In general, the more questions people ask, the harder it is for them to understand how to use it. We also need to consider what the questions are about. If they are about the technology, they may simply show curiosity about how the installation works. But if they are about the meaning of the flags, how to use the container, or even what the display box about container capacity and temperature (Figure 5) means, then the display is not as understandable as expected, which would reveal serious problems with the facility and prompt us to reflect on it and improve it.

4. Can they understand how to use it after we have explained everything to them? People easily make the wrong decision if they receive the wrong message, so we need to explain our display to our interviewees; otherwise they may misunderstand it and give inconsistent and unreliable feedback.

Never skip evaluation during the design process for lack of time! A project evaluated by potential users and relevant design experts yields valuable feedback that helps us improve the design and prepares it for the test of the market and the public. What's more, the evaluation methods should be diverse, to make sure the information we obtain is reliable and objective.
