Privacy in the Digital Realm
What if someone told you that your Smart TV has eyes, or that your iPhone has ears? You’d probably laugh it off, right? If you’re an Apple user, you have probably used Siri, the digital assistant in iOS, in one way or another. By simply saying “Hey Siri” to your smartphone, the voice-activated feature is set in motion. That means Siri’s microphone is constantly listening, awaiting that command. What is to prevent Apple from making use of what it hears while waiting, or an identity thief from breaking into that mobile device?
Before delving into the privacy risks that smartphones pose, it is essential to define what privacy means, or at least to attempt to. The concept of ‘privacy’, as in ‘the right to privacy’, can be understood in a number of ways. This multitude of potential meanings and uses is partly why the concept is controversial, confusing, and perhaps even contradictory (Solove, 2011, p. 42). In ancient times, privacy meant isolating oneself from the public sphere, which was considered a form of punishment or deprivation. Later, specifically in the early 19th century, privacy was associated with maintaining relationships between people. In 1890, however, Warren and Brandeis argued that privacy is more concerned with autonomy, that is, with allowing people to control their own self-identity rather than having it exploited (Solove, 2011, pp. 41–44). Autonomy is essential for sustaining democracies, as it fosters individuality, independent thought, diversity of views, and non-conformity. Autonomy is also part of the broader issue of human dignity, that is, the obligation to treat people not merely as means but as valuable and worthy of respect. It secures the protection of an individual’s personal and private information, which can only be used with that individual’s consent. Against this backdrop, a basic level of trust is established, which is a key element for conducting human affairs and facilitating social interchange, be it personal or functional (Solove, 2011, pp. 44–47). This conception of privacy operates on a purely human scale. But as smartphones have become ubiquitous around the globe, the term ‘privacy’ has gained a new dimension.
You might think your life is not worth tracking as you browse websites, store content in your iCloud, and post updates to social networking sites. However, the data you generate is a rich trove of information that says more about you than you realize, and it’s a tempting treasure for marketers and law enforcement officials alike.
The Cloud is not Transparent:
A study by the mobile security company Lookout found that ads from advertising networks embedded in some applications may change smartphone settings and collect contact information without the user’s permission. Once an app is installed, its owners have easy access to certain data on that phone. Even though the free versions of applications are the most downloaded, their owners make a large amount of money by selling user data to advertisers (Kim, 2012). This data includes the individual’s username, password, age, gender, contacts, location, phone ID, and phone number: all details the user has previously and willingly given up to the “cloud”.
Say Goodbye to Easy Alibis:
The smartphone is the government’s ultimate tattletale. As long as your phone is turned on, it registers its location with cell phone networks and downloaded applications, updating it several times per minute. In 2015, police officials used a social media monitoring program, Geofeedia, to track protesters after the death of Freddie Gray in Baltimore (Brandom, 2016). Geofeedia allows law enforcement agencies and private companies to aggregate and search event- or location-related posts across mobile services, including Facebook, Twitter, Instagram, and YouTube. When the story broke in October 2016, it inspired outrage, with the tool’s use by law enforcement proving the most controversial: police went so far as to pair it with facial recognition to identify protesters with outstanding warrants. In a statement, Facebook claimed that it wasn’t aware of Geofeedia’s law enforcement contracts, and refused to comment on how the program was used on its platform (Brandom, 2016). This raises real questions about how closely the company is guarding its users’ data.
Apple Fights in the Name of User Privacy:
After a lengthy standoff between Apple and the FBI over cracking an encrypted iPhone used by one of the San Bernardino shooters, investigators eventually gained the access they needed without Apple’s help (Kharpal, 2016). But that doesn’t mean the privacy debate is over. Apple vowed never to put customers’ data security at risk. In a statement, CEO Tim Cook said, “The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.” Creating a ‘backdoor’ for an iPhone, or “a version of iOS that bypasses security”, would carry not only dangerous consequences but also an internal conflict. First, it would be difficult to prevent such cracking tools from escaping into the wild or being replicated and misused (Cook, 2016). Second, the same company would have to work both to secure its products and to undermine that security; the better it does at the first job, the larger the headaches it creates for itself in doing the second.
A Father’s Wish:
Leonardo Fabbretti, a grieving father in Italy, wrote to Apple’s chief executive Tim Cook begging him to unlock his deceased son’s iPhone so that he could retrieve the photographs stored on it. He also mentioned that, should Apple refuse to cooperate, he would turn to the elite of the hacking world: the company that allegedly helped the FBI in the San Bernardino case. Apple sought to help Fabbretti using methods that would not infringe on its privacy policies, but to no avail (Goldman, 2016).
Americans and Europeans have different starting points and different understandings of what counts as a just society. American privacy law is caught in the gravitational orbit of democratic liberal values (Solove, 2011, p. 44), while European law is caught in the orbit of dignity (European Commission, 2010). The two legal orders are constantly pulled in different directions because they do meaningfully differ: continental Europeans are consistently more drawn to problems touching on human dignity, while Americans are consistently more drawn to problems touching on the depredations of the state (as seen in the San Bernardino case). Despite this contrast, however, the two traditions arrived at the same conclusion: Apple’s headquarters in the U.S. and its offices in Italy reached the same decision, one that weighs national security and human empathy on the one hand against legitimate privacy concerns on the other.
Apple’s Stand: Legitimate Concern or Moral Panic?
One can argue that in the age of the smartphone, privacy is almost non-existent. Information is monitored, shared, sold, and exploited. But users willingly offer all this information simply because they are unaware of the consequences. Therefore, the first step toward addressing privacy concerns is educating users on the drawbacks of sharing this information in the first place. Doing so would strengthen the ‘ecosystem of trust’ on which the global market for both traditional computing and the new breed of digital appliances critically depends. This ecosystem requires developers to push out security updates and to write security code that cannot be breached, even by government officials. Consumers would grow even more distrustful if tech companies like Apple submitted to the government’s demands to develop ‘spyware’, disguised as a means of preserving national security, when it is a clear infringement of the right to privacy.
Therefore, it appears that the stakes are higher than allowing the FBI to decrypt a dead terrorism suspect’s phone, or fulfilling a grieving father’s wish to virtually reunite with his late son. The real concern is whether tech companies would eventually undermine global trust in their digital devices; then, there will be a hefty price to pay.
Solove, D. J. (2011). Perspectives on privacy. In Understanding privacy (pp. 40–77). Cambridge, MA: Harvard University Press.
Kim, E. (2012, July 18). Mobile ads can hijack your phone and steal your contacts. CNN Money. Retrieved from http://money.cnn.com/2012/07/10/technology/mobile-ad-networks/
Brandom, R. (2016, Oct. 11). Facebook, Twitter, and Instagram surveillance tool was used to arrest Baltimore protesters. The Verge. Retrieved from http://www.theverge.com/2016/10/11/13243890/facebook-twitter-instagram-police-surveillance-geofeedia-api
Cook, T. (2016, Feb. 16). A message to our customers. Apple. Retrieved from http://www.apple.com/customer-letter/
Kharpal, A. (2016, March 29). Apple vs FBI: All you need to know. CNBC. Retrieved from http://www.cnbc.com/2016/03/29/apple-vs-fbi-all-you-need-to-know.html
Goldman, D. (2016, April 12). Grieving father optimistic he can crack his dead son’s iPhone. CNN Tech. Retrieved from http://money.cnn.com/2016/04/08/technology/leonardo-fabbretti-iphone/
European Commission. (2010, Nov. 4). A comprehensive approach on personal data protection in the European Union. Brussels. Retrieved from http://ec.europa.eu/health/data_collection/docs/com_2010_0609_en.pdf