You’re Not That Elusive.

Christina Noonan
12 min read · Sep 10, 2018


Companies know more about you than you think —
The privacy war is here, and you’re losing.

You may have heard about the recent lawsuit against Google. You may have also read about the massive data breach of 340 million individual records from marketing and data aggregation firm Exactis. As the general population becomes more literate in privacy matters, many are starting to realize how much data they give up in order to use the products and services we’ve become accustomed to, from shopping for groceries to connecting with a loved one far away. This article outlines several of the channels collecting user data today, how you can improve your privacy in each, and my commentary along the way.

Maps

Imagine a scenario in which you download a map app, hit the consent button on all the legal gibberish upon first use, and then begin using it to navigate around. At some point you decide you don’t want it to collect your location data any longer, so you go in and turn off location tracking, which explicitly says something along the lines of “we won’t know your location if this is turned off.” Then you find out that, after months or years of using that app, it was still tracking your location through other means without telling you (by recording the Wi-Fi hotspots you connected to). How would you feel? Betrayed? Confused? In disbelief? This is the struggle many are facing today, as more discoveries are made about what information companies collect. The good news is you can turn tracking off. In addition to turning off location history (here), users can turn off Web and App Activity as well, which saves information like location, language, IP address, referrer, and more.

My Perspective:

It’s incredibly frustrating that users can do everything they’re “supposed” to do in order to protect their privacy, only for those protections to be undone by a new update. Sadly, this isn’t a problem unique to tech. In an analogous experience, the food industry changes the ingredients in its recipes all the time without announcing the change to consumers, to the dismay of individuals trying to avoid certain foods. In both contexts, consumers who are attempting to retain control over their own bodies and data must stay hyper-vigilant in order to keep protecting themselves.

Email

This public awakening has its upsides as well. Take email, for example: Google ended ad targeting based on email scanning within Gmail last year after facing criticism from the public. Microsoft has said it doesn’t scan emails for advertising purposes (even though its inbox was caught sending identifiable information to the company). Apple also reports that it never uses email content for ad targeting, with the caveat that personal data can be accessed for law enforcement purposes or “issues of public importance.” While public pressure has (probably) helped change policies at these companies, others like Yahoo still scan your emails for ads. Good news, though: you can turn off Yahoo’s email scanning. Take a look at their Ad Interest Manager; under the “Your Advertising Choices” section, you’ll be able to opt out of Yahoo’s tracking. You’ll have to hit two opt-out buttons: one under the “across the web” section and one under “on Yahoo.” It’s worth noting, however, that if Oath’s advertising plans go into effect, users may need to change a new setting to remain opted out.

Commentary:

Email scanning has been around since Gmail first launched, but we’re glad to see users becoming more informed and advocating for their own interests. As services become more customer-centric, transparency and clear options around sharing need to be at the forefront instead of hidden in legalese. As human-centered designers, we advocate for options that empower the user to make informed decisions. Where choice isn’t an option, clearly spelling out what the user is supplying in return for using the product is incredibly important to inform their experience and avoid upsets in the future.

Loyalty Programs

Email and maps aren’t the only areas in which your activity is for sale, unfortunately. Despite the constant claims that your phone is always listening, many sources (more) have made the case that data brokers don’t need to listen to your every word in order to get a sense of who you are, because they collect it from everywhere else you’re not thinking about. Facebook can find you on any device you’ve ever checked Facebook on. Data brokers can collect everything that retailers and grocers know about you, and can even track your in-store, cash-only purchases with the use of a loyalty discount card. Despite the treasure trove of information loyalty cards can provide to advertisers, I would wager that a majority of Americans don’t realize they’re giving that information away in exchange for 10 cents off their produce purchase each Sunday.
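To make the mechanism concrete, here is a minimal sketch of how a cash-only receipt could be joined back to a named profile once a loyalty card number is attached to each sale. All of the data, field names, and card numbers below are hypothetical, invented purely for illustration.

```python
# Hypothetical sketch: a loyalty-card ID lets a broker link anonymous,
# cash-only register receipts back to a named profile.
# All data and field names here are made up for illustration.

loyalty_members = {
    "4417-0021": {"name": "J. Doe", "email": "jdoe@example.com", "zip": "98101"},
}

register_receipts = [
    {"loyalty_id": "4417-0021", "paid_with": "cash", "items": ["prenatal vitamins"]},
    {"loyalty_id": "4417-0021", "paid_with": "cash", "items": ["diapers", "formula"]},
]

# The "join" is nothing more than a lookup on the card number.
for receipt in register_receipts:
    profile = loyalty_members.get(receipt["loyalty_id"])
    if profile:
        print(f"{profile['name']} ({profile['zip']}) bought {receipt['items']} in cash")
```

The cash itself never touches a bank, but the card number makes the purchase just as attributable as a credit-card swipe.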

Further, grocery stores aren’t the only ones benefiting from your data. Individual brands want more information about you too. Fortune just published an article that dives into Nestlé’s new thirst for your DNA. How many companies could find creative ways to convince you that you need their product now? “Hey, we can tell you’re likely to lose your hair in 5 years, so take this supplement now to save a few strands!” With personalized haircare services like Prose and Function of Beauty, we may already be extremely close to this possible future.

https://www.overglowedit.com/home/2018/7/20/review-prose-hair-personalized-hair-care

Commentary:

Out of all tracking tools, loyalty programs may be one of the most easily overlooked collection methods because they don’t require a digital platform or a traceable payment method to collect information; even paying in cash can be recorded. It’s no wonder users are increasingly incentivized to sign up for company credit cards, keyring tags, and phone-number lookups: US companies spend $50 billion a year on loyalty programs alone because they can generate as much as 20 percent of a company’s profits. Awareness, clarity about privacy tradeoffs, and even periodic reminders can help customers better understand what they’re exchanging their personal data for.

http://fortune.com/2017/04/19/google-verily-health-study/

Health Services

The popularity of services like 23andMe is beginning to raise interesting questions about how the collected data could be used by entities beyond the consumer. Health insurance companies could use knowledge from these tests when deciding what to cover, or whether to insure you in the first place. While the Genetic Information Nondiscrimination Act protects Americans against discrimination by their employers or health insurance companies based on genetic information in some situations, it doesn’t protect against genetic discrimination by life insurance, long-term care, or disability insurance providers. It is also worth noting that this law has never been tested in court, making it difficult to know how far its protections will actually extend in practice.

Applications like Google Health are also collecting more information about your activity than ever before. It’s worth noting that even though Google doesn’t sell your information to third parties, it doesn’t actually need to in order to make money: it sells ad targeting based on its data holdings. So, when you start receiving oddly specific ads after a recent diagnosis, you’ll know who to blame. Further, more and more programs are coming out to understand and map human behavior at a larger scale. From opting in to applications like Achievement to large studies like Project Baseline, your data could be going all over the place… but hey, you could get a cool-looking watch.

Commentary:

In a world where more Americans are comfortable sharing personal information with companies that literally make money by exploiting their data than with their own insurance companies, industry incentives very likely need to shift to put patients first. This divide also speaks to our struggling healthcare system in the US — technology is enabling consumers to take wellbeing into their own hands and away from the traditional practice of medicine. Perhaps the solution lies somewhere between traditional healthcare models and one-off services, providing integrated health monitoring alongside preventative care, as Forward Health does… Or perhaps their reliance on digital records and connected devices further complicates healthcare privacy, exposing data not just to anonymous advertising companies, but to your insurer, employer, government, and beyond.

https://boingboing.net/2018/05/24/alexa-listened-to-a-couples.html

Physical Products

The old logic taught us that if something is “free,” we’re paying for it with something else of value (our data). Now, even if you pay for something, it may still very well be collecting information on you. We’re discovering that other products (even paid products and services) are also sending information back to their makers at alarming rates, from connected speakers, to your car, to even smart sex toys. Watch the following for one person’s experience diving into the deep end of the smart-home ecosystem.

Commentary:

Tracking software is everywhere… even beyond the instances described here. The Intercept just broke news that the free NYC help kiosks may even be tracking and storing information from users across the city. Short of reading every word of every consent form, we need better tools to help users understand the exchange they’re making when using products and services, especially because even paid services are extracting data we don’t always know we’re giving. The variety of privacy notices (in form, length, and terms), the lack of emphasis on or requirement to read these documents, and the absence of language clear to an average reader all work against the user’s best interest.

We have to do better: right now, these forms are purposefully designed not to be user-centered. Even the options to view these documents later are hidden within profile settings, company footers, or buried elsewhere on a company’s website, implying that other information is more important.
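To put a rough number on the readability problem, a standard Flesch-Kincaid grade-level estimate over any policy text shows how far these documents sit above an average reading level. The sketch below is intentionally crude (the syllable counter just counts vowel groups), and the sample sentence is my own, not taken from any real policy.

```python
import re

def syllables(word: str) -> int:
    """Very rough syllable estimate: count groups of vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    """Standard Flesch-Kincaid grade-level formula applied to plain text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syl / len(words)) - 15.59

# Paste any privacy policy in here to see roughly what grade level it demands.
sample = "We may share your information with affiliates, subsidiaries and partners."
print(round(flesch_kincaid_grade(sample), 1))
```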

Miscellaneous Apps

Did you know that while Facebook, Apple, and other behemoths have adamantly denied recording you, other apps are recording you? Take the notorious company Alphonso, which collects TV-viewing data for advertisers by identifying audio signals in TV ads and shows, sometimes even matching that information with the places people visit and the movies they see. Alphonso isn’t an app; it’s software, a component built into many other popular apps, which makes it harder to detect and opt out of. Incredibly, more than 250 games that use Alphonso software are available in the Google Play store, and some are also available in Apple’s app store. According to the New York Times, Alphonso can detect sounds even when the phone is in someone’s pocket or the app is running in the background. How are they able to get away with this? Ashish Chordia, Alphonso’s chief executive, said, “The consumer is opting in knowingly and can opt out any time.” In other words, if you ever figure out that they’re tracking you and selling this data to advertisers, you can opt out of microphone use one app at a time. That is, assuming you know which apps are using this software. In any case, they don’t make it easy to find opt-out instructions on their own website.
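Alphonso hasn’t published its matching algorithm, but audio fingerprinting in general works by reducing short snippets of sound to compact hashes of their strongest spectral peaks and comparing those hashes against a reference catalog of ads and shows. The sketch below is a bare-bones illustration of that general idea, not Alphonso’s actual code, and the function names and catalog structure are my own assumptions.

```python
# Bare-bones illustration of audio fingerprinting (not Alphonso's actual code):
# hash the strongest frequency in each short window, then match against a catalog.
import numpy as np
from scipy.signal import spectrogram

def fingerprint(samples: np.ndarray, rate: int = 16000) -> set[tuple[int, int]]:
    """Return (window_index, peak_frequency_bin) pairs for the loudest bins."""
    _freqs, _times, spec = spectrogram(samples, fs=rate, nperseg=1024)
    peaks = spec.argmax(axis=0)  # strongest frequency bin in each time window
    return {(t, int(f)) for t, f in enumerate(peaks)}

def best_match(snippet_fp: set, catalog: dict[str, set]) -> str:
    """Pick the catalog entry whose fingerprint overlaps the snippet the most.

    Real systems hash pairs of peaks and check time-offset consistency;
    raw overlap is a simplification for illustration only.
    """
    return max(catalog, key=lambda title: len(snippet_fp & catalog[title]))
```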

Keep in mind, popular apps like Facebook are also treasure troves of information, even if you change your account to be more secure. Facebook holds on to everything: unfriended friends, old relationships, all of your past employers, previous pseudonyms, addresses, etc. Facebook, which employs its own facial recognition tech to help identify users’ faces in photos across the platform, has stated through a spokesperson that it doesn’t allow accessing or collecting information through automated methods, such as harvesting bots or scrapers.

However, over the last five years, a secretive surveillance company founded by a former Israeli intelligence officer has been quietly building a massive facial recognition database consisting of faces acquired from Facebook, YouTube, and countless other websites. This product, Face-Int, created by Terrogence and owned by Verint, has even provided information to the NSA, the U.S. Navy, and countless other intelligence and security agencies. Earlier this year, Verint launched another facial recognition product called FaceDetect, which promises to identify individuals “regardless of facial obstructions, aging, disguises and ethnicity” and “allows operators to instantaneously add suspects to watch lists.” Don’t think they’re using all of this information for good, either; Terrogence “conducted public perception management operations on behalf of foreign and domestic governmental clients,” using “open source intelligence practices and social media engineering methods to investigate political and social groups.” Who knows, perhaps this data will be used to help the government predict who will commit crimes… oh wait, that’s already being done.

While it’s heartening to learn that Facebook said Terrogence’s activity appeared to violate its policies, the company has still taken advantage of Facebook’s openness to build its foundation. Terrogence isn’t the first, either — Cambridge Analytica acquired information on as many as 87 million users in 2014 from U.K.-based researcher Aleksandr Kogan to help target individuals during its work for the Donald Trump and Ted Cruz presidential campaigns. In summary, regardless of how private you think your data is, Facebook remembers everything and isn’t immune from data breaches, whether intentional or not.

Commentary:

Considering the speed with which technology is able to capture new forms of information about us, even our appearance has become useful data for surveillance. You can download a copy of your own data to see what the behemoth knows about you, but ultimately it’s your choice whether to keep feeding the beast, regardless of the consequences. It’s a bit ironic that the information these companies are using is all supplied freely by users to “complete” their profiles, from profile pictures to job history. Paired with publicly available information on other social media sites like Instagram and LinkedIn, that data lets anyone with a little knowledge about you learn a lot more. What is to be done about this, though? With all of the information about you spread across hundreds of sites, the personal-data privacy war is likely over, and you, who had little control over its dissemination, can’t re-collect it. The conversation we should be having is: what do we do now?

Last words:

Advancements like hacks to help users regain control of their data, alternative privacy-focused tools, and (in the US) FTC reporting opportunities offer solutions that chip away at the larger privacy problem by empowering users. Additionally, Europe’s GDPR, tools to help companies create more compliant websites, and published UX best practices tackle the issue from the other side of the spectrum by equipping businesses to build better products.

Screenshot from secureprivacy.ai

Legislation has gotten us closer to a user-centered solution that informs and empowers users to control their own information, but companies still have many ways to improve. As designers, it’s our responsibility to initiate these conversations within our organizations and alongside our clients and partners. As individuals, we can foster discussions with friends and family about privacy, topics that impact our safety and our negotiated cultural contracts.

If personal data is the new currency of the internet, we should treat it as such: a new commodity. Today, we don’t have the regulations to identify or legislate the asset class it belongs to. We need data banks, auditing, and legal specializations. We can also promote other media of communication that facilitate cultural understanding; art negotiates values between us and the world. The digital artist in the video below builds tools to highlight surveillance and incite big questions about privacy. These conversations are only the beginning; it’s our responsibility to continue discussing privacy in order to promote the development of more human-centered services that respect an individual’s ownership of their own data.
