“Apologies” for our humanity

Sorry not sorry: Abzu’s cookie policy

Elyse Sims
Abzu
4 min read · Jul 2, 2020


When cookies first appeared as the progeny of “magic cookies”, they were seemingly innocuous packets of e-commerce data that stored a user’s partial transaction state on their own computer. Nobody disclosed that you were playing a beneficial part in a much larger system, and you didn’t have a choice about accepting your role.

Imagine a stranger slipping a packet into your pocket. Then that stranger regularly slips their hand back into your pocket to check on it, because unbeknownst to you, that packet has the power to gather and store all sorts of information about you. For over two years, some (ingenious) weirdos were doing this virtually unnoticed, until the Financial Times reported on the phenomenon in 1997. Unnerving? Sure. Yummy? Nope.
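For the curious, here’s roughly what that packet-in-your-pocket trick looks like on the wire. A minimal Python sketch (the cart_id value is made up for illustration, not from the FT report): the server parks a sliver of transaction state on your machine with a Set-Cookie header, and your browser hands it back on every subsequent request.

```python
from http.cookies import SimpleCookie

# The server "slips the packet into your pocket":
# a Set-Cookie header storing partial transaction state client-side.
response = SimpleCookie()
response["cart_id"] = "basket-8472"  # hypothetical shopping-cart identifier
response["cart_id"]["path"] = "/"
print(response.output())  # Set-Cookie: cart_id=basket-8472; Path=/

# On every later request, the browser hands the packet back unprompted.
request = SimpleCookie("cart_id=basket-8472")
print(request["cart_id"].value)  # basket-8472
```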

Nothing new: Bumping into things on the internet that make you uncomfortable.

Today those weirdos are a lot more obvious: I bump into them at least 10 times a day. What was supposed to transform the internet into a safer, more privacy-preserving place — making this activity conspicuous and consensual via cookie consent banners — is just exposing what an unpleasant place the internet can be.

There are no cookie consent banners on Abzu.ai

There are no cookie consent banners on Abzu.ai because we only use strictly necessary cookies and session cookies. Check out our privacy policy — it’s pretty straightforward.
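To make “strictly necessary cookies and session cookies” concrete: a session cookie carries no Expires or Max-Age attribute, so it evaporates when your browser session ends. Here’s a minimal sketch (the session_id token is hypothetical), contrasting it with the persistent trackers that do warrant consent banners:

```python
from http.cookies import SimpleCookie

cookie = SimpleCookie()

# A session cookie: no Expires or Max-Age, so it dies when the browser
# session ends. HttpOnly keeps it away from page scripts; Secure keeps
# it off plain HTTP connections.
cookie["session_id"] = "a1b2c3"  # hypothetical session token
cookie["session_id"]["httponly"] = True
cookie["session_id"]["secure"] = True
print(cookie.output())  # Set-Cookie: session_id=a1b2c3; HttpOnly; Secure

# A persistent tracking cookie, by contrast, sets a long lifetime so it
# can follow you across visits. That is the kind that triggers banners:
# cookie["tracker"] = "uid-42"  # hypothetical
# cookie["tracker"]["max-age"] = 60 * 60 * 24 * 365  # one year
```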

Our cookie policy is simple: Start with a question, not with data.

Data is universally available. What is scarce is the foresight to synthesize information from your data. You’re constantly being mined and monitored across innumerable domains for countless kinds of data, because starting with data is unscientifically easy. Asking the question first and building a model to test the hypothesis: now that’s the hard part.

When data drives operations and decisions.

In a vacuum of critical thinking by controllers (and, until recently, a vacuum of enforced policies), the equation reversed. The direction of computation should have been: “What data do we need to answer our question?” But with an abundance of raw material (your personal data), controllers floated along a lazy river fed by tributaries of data collection and allowed the flow of data to define what to do next. Data drove operations and decisions. And when controllers let data tell them what to do next, data simply said it wanted more data.

“You wanna talk about stress? You wanna talk about stress?! Okay! I’ve stumbled onto a major company conspiracy, Mac, how ’bout that for stress?” — Charlie

But controllers forgot to consider that having data doesn’t imply correlation. It’s easy to assume or overestimate a link between variables. Keep adding personal information to the equation, and now it’s conclusion-by-conspiracy-wall, because dependencies and critical thinking didn’t originate the function. If it sounds like you’re talking crazy or jumping to conclusions, you probably just need more data.

I’ve been that controller: the marketing professional who rejected Occam’s razor and thoughtful planning. Instead I embraced unnecessary complexity and an overabundance of data simply because I could, and because users let me. Friendly interfaces and, frankly, conveniences in consumption journeys obscured the computational misdirection (data first, questions later) and the magnitude of what was going on.

Granted, some really interesting interrelated behaviors surfaced, because for all our impulsive and irrational behavior, we are mostly predictable beings. But without a hypothesis to test, or any forethought, there is no promise that you’ll generalize correctly. And there are consequences to granting companies such intimate access to so many aspects of our lives: consequences to playing god with big systems.

Human-centric technology means giving up God mode

We’ve been creating new tools, but for the same old urges. We’re still trying to play god, to take and hoard and monitor more than we should before we’ve divined a direction. Is anyone surprised that our inventions reflect our biases, avarice, and prejudices?

“Likely crash and kill us all.” — Wash

When we talk about making humane tech or human-centered AI, what we really mean is giving up God mode. We mean making decisions thoughtfully and responsibly.

Metering our activities doesn’t mean we throttle possibilities or become narrow-minded! It means that we become more of our best selves: communal rather than individual, explorers over conquerors, hunters instead of gatherers, creative in lieu of one-track-minded.

At Abzu, we’re for lean data and human-centric technology. We’re for self-organization. We’re not for control by external agents, which is why we’ll never ask for anything we don’t need.

We start with questions, and we answer through data, not by ineffectively digging through limitless data sets.

We listen to our users to get a sense of their needs. We conduct personalized interviews. We hand-hold, we white-glove, and we hug because we embrace our humanity. And we firmly believe we aren’t the crazy ones.
