Is GPS data in your photo good or bad?

I love data and I love the Internet. Because I love people. So much good stuff happens as we have access to more knowledge and more people to interact with around the globe. When we use data to enhance our artefacts we also make them easier to find, easier to use in data analysis and more relevant in visual applications. And then… there’s always the dark side.

To be seen or not to be seen, that is the question.

Location-based information in photos helps us place photos on a map, showing us exactly where the photo was taken. This is great for handling large sets of photos and easily identifying time and place. With a voice activated device in my home I may speak out-loud “Hey Artie, show me photos from my trip to London in November last year on the living room TV” and within seconds a slideshow of the correct images would begin playing.
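To make this concrete: a JPEG stores location in the EXIF `GPSInfo` block as degree/minute/second values plus a hemisphere reference. Here is a minimal sketch of the conversion a photo app performs before dropping a pin on the map (the tag names are real EXIF tags; the helper function and sample coordinates are my own, for illustration):

```python
# EXIF stores latitude/longitude as (degrees, minutes, seconds) rationals,
# with separate GPSLatitudeRef ('N'/'S') and GPSLongitudeRef ('E'/'W') tags.
def dms_to_decimal(dms, ref):
    """Convert an EXIF-style (deg, min, sec) tuple to signed decimal degrees."""
    degrees, minutes, seconds = dms
    value = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative in decimal notation.
    return -value if ref in ("S", "W") else value

# Example: GPSLatitude = (51, 30, 26) with GPSLatitudeRef = 'N' is central London.
lat = dms_to_decimal((51, 30, 26), "N")
lon = dms_to_decimal((0, 7, 39), "W")
print(lat, lon)  # roughly 51.5072, -0.1275
```

That pair of numbers is all a map interface needs, and it rides along inside the photo file itself.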

Just navigating around your photos using a map as your interface is great fun and can spark wonderful memories.

In this context, GPS data is easily deemed as good.

The dark side becomes apparent when someone with ill intent gets hold of your GPS location and uses it to travel uninvited to your house, send other people uninvited to your house (even SWAT teams) or combine it with other data to steal your identity or fool your friends into sending them money. Another variant is doxing: the practice of researching and publishing private or identifying information about a person on the Internet.

The service I built to illustrate this point, DickPicLocator, brings attention to the paradox of good and bad in reading the geolocation of specific photos. Sending unsolicited dick pics is unequivocally bad behavior. In this case, finding out where the photo was taken can assist in identifying the perpetrator. Hence the recipient of the photo is using detective work to track down someone committing an offense, and the intent is readily perceived as good.

Thus, it’s good because we can expose perpetrators; it’s bad because we can put innocents at risk. This means every single individual on the planet has the potential to use the same tool for good or bad. As is, of course, the case with most tools, even the knife next to your dinner plate.

The growth of open-source intelligence

Military defence analyst H I Sutton recently published an article outlining how public webcams can be used to track submarine operations. He refers to open-source intelligence (OSINT) as “democratizing the intelligence arena”. Pretty much anyone “can gain some useful intelligence without the investment or risks associated with traditional intelligence”.

Each and every individual with a smartphone is amassing their own vast personal data sets of digital information. Much of this data can then be recovered by other parties, organisations and individuals for mining, processing and predictive analysis.

From a military and national security perspective this is obviously worrying. Anyone with a decent computer can begin mapping out buildings, bridges, power plants and cell towers just by scraping the Internet for pictures uploaded by the country’s own residents.

In a world where mere individuals can do this to gain leverage over other individuals, it is extremely difficult to pinpoint who is acting on whose behalf. It’s as if the Cold War has zoomed in to affect each of us on an interpersonal level. If one person can harbor as much open digital information as a whole country did only 30 years ago, what steps are being taken to gain access to that information? Who is targeting whom?

With all that Facebook knows about me, how willing would I be to go to court with a lawsuit against them?

The real problem

To me the problem is not whether the GPS data is there or not. The problem is the lack of transparency and the lack of general understanding of what information is being “leaked” from a smartphone.

  • There is a lack of transparency on the part of the mobile phone manufacturers as to what information is stored in photos.
  • There is a lack of transparency on the part of software and Internet companies as to what information they save, and strip away, when I share photos.
  • There is a lack of concern on a government level for the public awareness of volatile, digital data. Something that in all fairness should accompany a national transition into a digital society.
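Part of that transparency gap is that the metadata is not hidden very deep. In a JPEG file the EXIF block, GPS coordinates included, lives in an APP1 segment near the start of the file, and a few lines of code can tell you whether it is present. A rough sketch, assuming a well-formed JPEG (the function name and synthetic test bytes are mine):

```python
import struct

def has_exif(jpeg_bytes):
    """Walk the JPEG segment markers and report whether an EXIF APP1 block exists."""
    if jpeg_bytes[:2] != b"\xff\xd8":         # missing SOI marker: not a JPEG
        return False
    pos = 2
    while pos + 4 <= len(jpeg_bytes):
        marker, length = struct.unpack(">HH", jpeg_bytes[pos:pos + 4])
        if marker == 0xFFE1:                  # APP1 segment
            payload = jpeg_bytes[pos + 4:pos + 2 + length]
            if payload.startswith(b"Exif\x00\x00"):
                return True
        if marker == 0xFFDA:                  # start of scan: metadata is behind us
            break
        pos += 2 + length                     # length covers itself, not the marker
    return False

# Two synthetic files: one with an EXIF APP1 segment, one without.
exif_payload = b"Exif\x00\x00" + b"\x00" * 8
with_exif = (b"\xff\xd8" + b"\xff\xe1" + struct.pack(">H", len(exif_payload) + 2)
             + exif_payload + b"\xff\xda\x00\x02")
without = b"\xff\xd8\xff\xda\x00\x02"
print(has_exif(with_exif), has_exif(without))  # True False
```

Nothing about this requires special tooling or vendor cooperation, which is exactly why the silence from manufacturers and platforms about what they keep and what they strip is so striking.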

Just as many people today hardly know how their car works, most people also don’t know the first thing about how data travels across the Internet, what information can be seen, how it is stored or how it is protected. Just as people trust their car manufacturer and mechanic to ensure that their car is safe for road travel, they tend to trust that nobody will sell them a phone that isn’t safe for Internet travel.

To insist that people should know, should read up or shouldn’t use something they do not understand will simply not hold up as an argument. It’s becoming increasingly hard to participate in society without a phone today. By locking many public services into digital channels governments are forcing people into an online space whether they like it or not.

Nobody, however, is assuming responsibility or accountability for protecting the data that most people in all likelihood are not aware that they are sharing.

Responsibility and trust

When we talk about privacy, the question should not be “Do you have anything to hide?” The question should be “Do you have anything to protect?”

We can then go on to discuss whether or not I trust individuals and organisations with the knowledge of what I wish to protect. And even if we assume the trust is there, in a digital world I still have no way of knowing where my data ends up. Copy upon copy of my personal information can be made without my knowledge; it exists in many places at the same time. Copies can be made by someone listening in on a public Wi-Fi router, or perhaps by a disgruntled employee at one of my service providers who exposes my data online before quitting their job, triggering thousands more people to make copies of that data. Any chain of data transmission is only as secure as its weakest link. And online there are a great many such links.

Sure, we could tell people to turn off geolocation completely on their phones. But that addresses only part of the problem: the GPS location in your photos is just one of many ways that people can be tracked online. And, as I have explained, there are many potentially good uses of GPS data: for learning, for finding justice, and for plain ease-of-use when you have millions of photos.

But we could do with a little more friction, please.

False sense of safety

Many will consider my call for awareness about photo data and assume that I recommend more people turn this functionality off. I wish it were that simple.

Considering Moore’s law and how computer power has been growing at an astounding rate, I am not equipped to predict what networked computers will be capable of finding out about you five years from now, GPS location or not. I would not wish to impose upon anyone a false sense of safety.

Data, you see, is sometimes stored where we least expect it.

A fired bullet has a distinct set of scratches and ridges that can identify the gun it was fired from.

Remember the last time you saw a CSI show where a bullet was recovered from the crime scene. The forensic experts retrieve guns from suspects and fire them in the lab in order to compare the resulting bullets with the original one. By matching the unique scratches and ridges caused by the inside of the barrel as the bullet is fired, they can identify the gun that was used at the scene of the crime.

This technique can also be used with digital cameras.

You see, no two digital cameras are the same. The blog 33 bits of entropy explains:

Microscopic physical irregularities due to natural structure and/or manufacturing defects cause observable, albeit tiny, behavioral differences.

What this means is that there are tiny irregularities in all camera sensors, rarely visible to the human eye but distinguishable by computers. In the same way that the gun barrel has unique distortions, the unique sensor noise means that the exact same irregularities will appear in every photo taken with that specific camera.

Let’s apply this to the example of unsolicited dick pics. If someone takes a photo of their penis and sends it to an unknowing recipient, the metadata containing the GPS coordinates may well be stripped by the software, or chat program, used to send the photo. However, the unique irregularities in that photo have not been lost. If the same camera has been used to take pictures of sunsets and smiles, then a computer program could well scan those photos and identify them as having been taken with the same camera. The forensic examiner thus does not even need access to the original camera, only to a large database of public photos, otherwise known as the Internet. And, as you will have guessed, the forensic examiner will probably not be a forensic examiner at all. Just some random concerned citizen.
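The matching step can be sketched in a toy simulation. Below I fake two “cameras” as fixed random noise patterns added to smooth scenes, estimate each photo’s noise residual by subtracting a blurred copy (a crude stand-in for a proper denoising filter), and compare residuals with normalized correlation. Real sensor-fingerprint pipelines use wavelet denoising and many photos per camera, so treat this only as an illustration of the idea:

```python
import numpy as np

def mean_filter(img):
    """Crude 3x3 mean blur, standing in for a real denoising filter."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img)
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / 9.0

def noise_residual(img):
    """High-frequency residual; in smooth regions it is dominated by sensor noise."""
    return img - mean_filter(img)

def ncc(a, b):
    """Normalized correlation between two residuals."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

rng = np.random.default_rng(0)
size = (64, 64)
camera_a = rng.normal(0, 1, size)          # fixed per-camera sensor pattern
camera_b = rng.normal(0, 1, size)
scene1 = np.linspace(0, 100, 64)[None, :].repeat(64, axis=0)   # smooth gradients
scene2 = np.linspace(100, 0, 64)[:, None].repeat(64, axis=1)

photo1 = scene1 + camera_a                 # two different photos, same camera
photo2 = scene2 + camera_a
photo3 = scene2 + camera_b                 # same scene, different camera

same = ncc(noise_residual(photo1), noise_residual(photo2))
cross = ncc(noise_residual(photo1), noise_residual(photo3))
print(same, cross)  # same-camera correlation comes out far higher than cross-camera
```

The point of the sketch is only this: the “fingerprint” survives in the pixels themselves, so stripping the metadata does not remove it.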

Conclusion: No GPS coordinates? Still busted. A reality that looks to be true within a few years, if not yesterday.

The case for friction

In my years working with design it has become evident to me that as humans we are wired to choose the path of least resistance. I tend to use this to my advantage when designing, also making the case that less friction will cause less pain for the user.

A simple example would be that the less friction a person encounters in a shopping experience, the more likely it is that they will complete the purchase. Organisations spend lots of money on multivariate testing to deduce which of several similar designs will lead to a completed purchase.

When it comes to sharing artefacts online, the race to reduce friction is there as well. We are constantly prompted to share by large buttons and calls to action asking what we did today. Not one of these seems to be asking, “Is there anything you’d rather not share?”

As frustrating as it may sometimes feel, friction has one winning quality: it forces humans to think. And if there’s something we need more of today, it is considered choices — not more choices made on a whim.

I don’t believe I can expect companies to willingly become more transparent about their data management, which would mean more friction and less closed deals. But I would hope that governments have more of a keen interest in educating and protecting the interests of the people they serve.

For the time being, it’s full speed ahead. A bonanza of data is being shared, free for the taking by anyone who happens to be listening, and perpetually brawnier computers are welcoming the challenge.

Right now I’m not even sure this train has an emergency brake. Or where my trust should lie.

Could someone please close the window? It’s getting cold in here.