Algorithms and Ethics

Alistair Croll · Published in Pandemonio · Dec 9, 2016

Susan Etlinger is an analyst’s analyst. At the forefront of social media marketing since before that was a term, she’s recently shifted her focus to machine learning, computer vision, data privacy, and other critically important, often-overlooked parts of marketing in our connected world. Her reports on digital trust and computers that can see are must-read documents for anyone in business today.

Susan’s speaking at Pandemonio in February, and we wanted to get a sense of what she’s thinking about in anticipation of that talk.

Pandemonio: What does “big data” mean to you?

Susan: I like the definition that the folks at Gartner came up with almost 20 years ago, that what we call “Big Data” is characterized by the “Three Vs”: volume, velocity and variety. The stock market is a good example of high-volume, high-velocity data, while social media hits on all three, with variety (video, text, audio, GIFs, emoji, and now data from virtual and augmented reality) being the most complex to deal with.

Gartner and others have evolved their thinking to account for new market and technology developments, and certainly to add a lot of nuance, but I like the simplicity of the 3Vs definition. It’s easy to understand and easy to make relevant for pretty much any audience.

P: How is a data- and algorithm-driven world trampling on civil rights?

S: Not a leading question in the slightest! The answer is that it could. Or it could actually be used to protect civil rights. The cool thing is that it’s our choice.

For example, if we look at a lot of the data in Google today, data that describes the way we humans search for information on the internet, we see a ton of bias related to race, religion, political beliefs, nationality, gender identity, sexual preference — you name it. Is this surprising? No, because it simply reflects the belief systems of the people who use it. But if you start encoding those biases into algorithms that make decisions on things like criminal justice or who gets a home loan or what news people get to see, you have a recipe for injustice.
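
To make that concrete, here is a minimal sketch in Python. It is a toy illustration under invented assumptions (the synthetic “loan” data, feature names, and thresholds are all made up, not any real system): a classifier trained on biased historical decisions reproduces the bias even when the protected attribute is excluded, because correlated proxy features smuggle it back in.

```python
# Toy sketch: bias in historical decisions leaks into a model through proxies.
# All data here is synthetic and hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)                  # protected attribute: 0 or 1
income = rng.normal(50 + 10 * group, 15, n)    # proxy: correlates with group
zipcode = group + rng.normal(0, 0.3, n)        # second proxy feature

# Biased history: at the same income, group 1 was approved more often.
approved = (income + 15 * group + rng.normal(0, 10, n)) > 55

# Train only on the "neutral-looking" features; group itself is excluded.
X = np.column_stack([income, zipcode])
model = LogisticRegression().fit(X, approved)

preds = model.predict(X)
for g in (0, 1):
    print(f"group {g}: predicted approval rate {preds[group == g].mean():.2f}")
# The predicted approval rates still differ by group: the historical bias
# has been encoded into the algorithm via the proxy features.
```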

One of the challenges with AI is that it can make it very hard to tease out the reasons for that injustice. So we need to be on notice NOW that as AI becomes more prevalent, we have a responsibility to build methodologies and data models that make technology a tool for justice rather than an enabler of injustice.

P: What things would the average person be surprised to learn about what companies can tell about them?

S: Oh, the usual. Political affiliation, sexual preference, their location, what foods they like. If you look at Christian Rudder’s book Dataclysm (he’s a co-founder of dating site OkCupid), you may have a meltdown as you realize what information dating sites alone have. And it’s also important to realize that not everyone knows everything (except Facebook and arguably Google).

It’s also important to remember that a lot of this “data” consists of inferences about the likelihood that you are one thing or another, based on your activities (posts, searches, likes, etc.). So that data can be wrong as easily as it can be right. Which is more of a threat? You decide.
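
As a rough illustration of how such inferences are made, here is a minimal naive-Bayes-style sketch in Python. The signals and likelihood ratios are invented for the example, not any platform’s actual model; the point is that stacking a few weak signals produces a confident score that can still be flatly wrong about any individual.

```python
# Toy sketch: inferring a trait probabilistically from activity signals.
# The signals and likelihood ratios below are made up for illustration.
LIKELIHOOD_RATIOS = {
    "liked_page_A": 3.0,      # how much more likely this activity is
    "searched_term_B": 2.0,   # if the user actually has the trait
    "follows_account_C": 4.0,
}

def infer_trait_probability(observed_signals, prior=0.10):
    """Update a prior belief with each observed signal's likelihood ratio."""
    odds = prior / (1 - prior)
    for signal in observed_signals:
        odds *= LIKELIHOOD_RATIOS.get(signal, 1.0)
    return odds / (1 + odds)

# Three weak signals push a 10% prior to roughly 73% -- yet none of them
# proves anything about this particular person.
print(infer_trait_probability(["liked_page_A", "searched_term_B", "follows_account_C"]))
```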

P: It seems like every social platform starts great, then loses effectiveness. Are marketers constantly going to be stuck looking for the next big thing?

S: Yup. That’s their job.

P: As machines learn to see and understand video content, what industries will be changed the most?

S: Oh, everything. The ability to see and understand images (whether still or moving) will change medicine, travel, retail, transportation, banking… any industry that relies on understanding and planning around human behavior.

That’s all of us.

Alistair Croll

Writer, speaker, accelerant. Intersection of tech & society. Strata, Startupfest, Bitnorth, FWD50. Lean Analytics, Tilt the Windmill, HBS, Just Evil Enough.