Don’t Be a Jerk

As creators of digital products, we are partly responsible for our users’ privacy. We have to use our common sense when we collect and use personal data.

Tim von Oldenburg
4 min read · Dec 10, 2013


Collecting Data is Necessary

Large and small businesses alike have a reasonable interest in their customers’ data. There is no way to deny that, in a market-driven economy based on supply and demand, we need to collect demographical and behavioral information; to create new products, enhance existing ones, and—effectively—for a business to stay alive. If businesses were not able to collect customer data, there would be a high chance that your most beloved products would not exist.

Everyone involved in creating a digital product is also involved in data collection, whether you are a designer, developer, project manager, or businessperson. And all of us are responsible: for what happens with that data, and for whether it is collected and stored in the first place.

The problem is that there is no ultimate code of conduct for collecting data; there are no rules other than the privacy laws and regulations of your respective country. I don’t want to dive deeply into the topic of ethics, and I won’t talk about the NSA. I just have one simple request for all of us: Don’t Be a Jerk.

Consent is not a Blank Check

In most countries we are required by law to ask the user for their consent, or at least inform them about our data-collecting activities.

So, if the users have given me their consent to collect and use their data, why would I need to care more? — Evil Product Stakeholder

That is a valid question, indeed. Here’s the thing: Users don’t share all information consciously and willingly. Let me illustrate this with three examples.

The tl;dr effect

This situation is well known to all of us. Have you ever read an End User License Agreement in its entirety? Me neither. Users become blind to long texts with lots of details, especially legal texts, but also to repetitive permission requests.

Android’s permission management tends to get lots of praise, although it is far from perfect. Displaying a dialog that lists all the permissions before installation makes the user blind to it in the long run: users begin to tap ‘accept’, no matter what the permissions are. The same applies to Facebook and Twitter apps, as well as to the plugin ecosystems of applications like Firefox and Chrome.

People don’t give you permission to store cookies because they think it will be beneficial to them and you; they do it because they want to get rid of your annoying popup.

The gist is: people often give their permission to collect and use their data without really thinking about it.

Complexity kills

Facebook is a prime example of opacity, especially when it comes to privacy settings. In his 2013 talk How Designers Destroyed the World, Mike Monteiro gives a shocking example of the horrible consequences our (design) decisions can have.

Facebook’s privacy settings are so complex and confusing that most people give up and stop caring about them. And those who persist and successfully manage their private data still cannot feel safe. This causes people to lose their trust in an organization.

The gist is: just because there are means for the user to control their data, it doesn’t mean they can actually use them.

“I have no idea what I’m doing” (but you do)

Nowadays there are ways for users to control the flow of their personal data, but they often require some knowledge of the technology being used. You don’t expect everyone who drives a car to also be able to repair it, do you? Similarly, we cannot expect everyone on the web to know what cookies are and how to block and delete them. We cannot expect users to just use Incognito Mode when they want privacy, and neither can we expect them to know how OAuth2 works.

Users also have to put their trust in your product. If the permissions dialog of a Facebook app asks only for the email address, they expect Facebook to share that and only that, and not also their Likes and Friends. Even when users know what they are doing, they still don’t know what the company is doing. They have to trust your organization.

The gist is: be as transparent as possible, educate your users accordingly and don’t abuse their trust.

Use Your Common Sense and Talk to Your Client

In a project I worked on in 2013, we were using Google Analytics to track click streams as well as general demographic information of our users, provided by Facebook login. The application was designed using a shopping cart metaphor.

Each item put into a shopping cart was stored alongside the information we got from Google Analytics and Facebook. That way, we were able to track down every single click a Facebook user made, and every item they put into their shopping cart.

This was when I first developed doubts about the users’ privacy. It just didn’t feel right, because our tracking was so detailed. We clarified with the client how the data were really meant to be used, and it turned out that they wanted to find out about trends for different demographic groups. With this information, we were able to anonymize the stored data sets; it was still possible to see trends, but it was not possible anymore to track down one specific user. Problem solved? Not yet. We also stopped saving data for people younger than 18 years.
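The anonymization step described above can be sketched in a few lines. This is a hypothetical illustration, not the project’s actual code: the field names (`user_id`, `age`, `gender`, `item`) and the decade-sized age buckets are my assumptions. The idea is simply to drop the identifier and coarsen exact values so that demographic trends remain visible while no event can be traced back to one person.

```python
def anonymize(events, min_age=18):
    """Strip identifiers and bucket demographics; skip users under min_age.

    Hypothetical sketch: field names and bucket sizes are assumptions,
    not the original project's schema.
    """
    anonymized = []
    for event in events:
        if event["age"] < min_age:
            continue  # we stopped saving data for people under 18
        anonymized.append({
            # no user_id: events can no longer be traced to a person
            "age_group": f"{(event['age'] // 10) * 10}s",  # e.g. 26 -> "20s"
            "gender": event["gender"],
            "item": event["item"],
        })
    return anonymized

events = [
    {"user_id": "fb:1001", "age": 26, "gender": "f", "item": "camera"},
    {"user_id": "fb:1002", "age": 17, "gender": "m", "item": "phone"},
]
print(anonymize(events))
# -> [{'age_group': '20s', 'gender': 'f', 'item': 'camera'}]
```

Aggregating into coarse buckets like this is the simplest form of de-identification; for small user groups you would additionally need to suppress buckets with too few members, or a single event could still identify someone.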

Moral of the Tale

Always remember that the user may have given you their data without their full awareness. Always think of the user as your teenage sister—what would you want people to do with her data?

I don’t want to cite Google when it comes to privacy, so I changed their motto a bit. Take responsibility in your job, and Don’t Be a Jerk.


Tim von Oldenburg

Interaction Designer & Frontend Developer, passionate about Music, UX and Coffee. Loves Traveling.