Work Culture Survey Redesign

Gabriel Thomsen
3 min read · Dec 18, 2019

As a final project for the Ironhack UX/UI Design bootcamp, I worked in a team of 3 people to redesign a work culture survey for the Berlin-based consultancy firm Young Digitals.

Young Digitals offers business consultancy combined with a strong offering of digitization and design services. The firm agreed to collaborate with my team on redesigning a work culture and satisfaction survey that they will use during their consultancy work at different companies.

The original prototype

The product was very much an early prototype: it lacked visual design and had not yet been tested.

Part I of the survey

Looking at the prototype we were given, and after running usability tests, we identified several points we could improve:

- The UI was not yet fleshed out, as the product was an early prototype. It lacked brand attributes and a convincing introduction that would commit respondents to completing the survey.
- The wording of the questions confused some users during testing.
- Two consecutive parts of the survey asked the same questions, one about personal values and one about organizational values.
- The third part used a 100-point distribution system, and respondents needed several attempts to distribute the points as they intended.
- The survey had no progress bar, which made some respondents anxious about how long they had to keep answering questions.
- The results page combined 6 parameters in a single pie chart, which proved difficult for users to understand during the usability tests.

All in all, the product we were handed was a very useful tool, but at an early stage of development.

The original prototype’s results page

The redesign

I looked into the source of the questions used in the survey and discovered they come from the SVS (Schwartz Value Survey), an internationally recognized instrument for assessing values. In our usability tests, however, the wording proved confusing, so we added a pop-up message offering an alternative wording that respondents can open whenever a question is unclear. We also added an introductory message on the login screen explaining the purpose of the survey.

To address the duplicated questions in two consecutive parts, as well as the anxiety about how many questions were left, we combined the first two parts into one, placing the two parameters (personal and organizational values) in parallel columns, and added a progress bar at the top of the screen.

Part I of the redesigned survey

To replace the point distribution system in Part III (now Part II, since we had merged the first two parts), I researched alternative formats that preserved the element of prioritization and found that a drag-and-drop ranking system would be ideal. Usability tests confirmed the choice: users found it intuitive and easy to use.

Part II of the redesigned survey, with the drag and drop ranking
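
There is no code behind the prototype screen above, but for context, a drag-and-drop ranking is straightforward to build on the standard HTML5 drag events. The sketch below is a minimal, purely illustrative TypeScript version, assuming hypothetical markup in which an ordered list with the id "ranking" holds the items to prioritize.

```typescript
// Minimal drag-and-drop ranking sketch (illustrative only, not the actual prototype).
// Assumes markup like <ol id="ranking"><li>Item A</li><li>Item B</li>...</ol>.
const list = document.getElementById('ranking') as HTMLOListElement;
let dragged: HTMLLIElement | null = null;

for (const item of Array.from(list.children) as HTMLLIElement[]) {
  item.draggable = true;

  // Remember which item is being dragged.
  item.addEventListener('dragstart', () => {
    dragged = item;
  });

  // Preventing the default on dragover is required for a drop to be allowed.
  item.addEventListener('dragover', (event) => event.preventDefault());

  // Reorder by moving the dragged item in front of the drop target.
  item.addEventListener('drop', () => {
    if (dragged && dragged !== item) {
      list.insertBefore(dragged, item);
    }
  });
}

// On submit, the respondent's ranking is simply the final order of the list items.
function readRanking(): string[] {
  return Array.from(list.children).map((li) => li.textContent?.trim() ?? '');
}
```

Compared with the 100-point distribution, a ranking keeps the prioritization aspect but cannot be gotten wrong: there is no total the respondent has to hit, which is exactly what tripped people up in the original.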

For the results page, we wireframed several alternatives: splitting the parameters into two smaller pie charts, parallel bar charts, a stacked bar containing all the parameters, and a donut chart. After these iterations, we arrived at the final solution, agreed upon with the stakeholder: a radar graph displaying a simplified version of the results, with only two parameters.

The new results page
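
As a side note, the radar format we settled on is available out of the box in most charting libraries, so it is easy to reproduce outside the prototype. The snippet below is only an illustrative sketch using Chart.js (my assumption; it is not what powers the prototype), with placeholder axis categories and scores, and two overlaid series standing in for the two parameters being compared.

```typescript
// Illustrative sketch only: rendering the results as a radar chart with Chart.js.
// The library choice, axis labels, and scores are placeholders, not the real data.
import Chart from 'chart.js/auto';

const canvas = document.getElementById('results-chart') as HTMLCanvasElement;

new Chart(canvas, {
  type: 'radar',
  data: {
    // Placeholder value categories; the real survey derives its scales from the SVS.
    labels: ['Category A', 'Category B', 'Category C', 'Category D', 'Category E', 'Category F'],
    datasets: [
      { label: 'Personal values', data: [4, 2, 3, 5, 4, 2] },
      { label: 'Organizational values', data: [3, 4, 2, 3, 3, 4] },
    ],
  },
  options: {
    // A fixed radial scale keeps the two series directly comparable.
    scales: { r: { min: 0, max: 5 } },
  },
});
```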

You can check out our interactive prototype in the video below.

Gabriel Thomsen

Business Intelligence Analyst interested in data visualization, databases, and statistics. My UX background leads me to a user-centric approach.