
Your Data is Your Property

#OwnYourData

EDIT: The reason I deeply care about #OwnYourData: it’s central to the first steps of investing heavily in human improvement. It’s time-sensitive that we begin to radically improve ourselves, in every imaginable and unimaginable way, rather than letting companies such as Facebook extract our value and make us worse versions of ourselves. Commercial incentives need to align with human improvement. On the other side of radically improving ourselves may be the answer to all our other problems.

tl;dr

  1. Facebook is not trustworthy; they never have been. Let’s move on to something constructive.
  2. Your data should be your personal Property, like a house or car. You have the Right to own your own value.
  3. We need to take collective, coordinated action to have Facebook update their terms of service to acknowledge that we own our data, control it, can use it to improve ourselves, and profit from it.
  4. Once individuals are granted ownership of what they rightfully own, major market corrections will happen, creating The Data Crisis. It is as large as, or larger than, the 2007 Subprime Mortgage Crisis in size and scope. The second-, third-, and fourth-order consequences of a shift in data ownership to individuals will scramble economic and political power, concentration of engineers, revenue models, social norms, and incentive structures. EDIT: Facebook’s ~$500B market cap assumes our digital data is “ownerless”, allowing them to collect and monetize it without constraint. But, in fact, our digital data is our digital Property. We simply haven’t recognized this yet. There is directional legal precedent at the state and federal level, e.g., for medical and financial information.
  5. The economic incentives to improve humans are eroding, quickly. Facebook is exacerbating that problem, using their power to make us the worst versions of ourselves. We need to increase economic incentives to radically improve ourselves, in every imaginable and unimaginable way. On the other side, improving ourselves might be the solution to all of our other problems.

Decoding Facebook’s genome reveals psychological manipulation, fake news, Cambridge Analytica, and Russian meddling, to name a few. They are untrustworthy; they always have been. It’s time we move past this realization and on to a constructive next step.

What I’m reading, what I like, who I’m friends with, and where I’ll likely be later today should be legally classified as my personal Property, just like a house or car.

We’re just beginning to understand how valuable our data is. Facebook has built a ~$500B business mining and monetizing data from 2.2 billion people. In 2017, each of us generated ~$227 in value for them¹. Google’s market cap is over $700B, most of it built on our data. Twitter is ~$21B and Experian ~$20B. These 21st century mining companies figured out how to make trillions while we were watching cat videos, arguing politics, and seeking validation from others.
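The ~$227 figure is simply Facebook’s market cap spread across its user base, as footnote 1 notes. A quick back-of-envelope sketch, using the article’s approximate numbers:

```python
# Back-of-envelope: Facebook's implied value per user, i.e. market cap / users.
# Figures are the approximate numbers cited above (~$500B cap, ~2.2B people).
facebook_market_cap = 500e9   # ~$500 billion USD
facebook_users = 2.2e9        # ~2.2 billion people

value_per_user = facebook_market_cap / facebook_users
print(f"~${value_per_user:.0f} per user")  # ~$227 per user
```

The same division applied to Google’s ~$700B market cap yields an even larger per-user figure, which is why the article groups these companies together as data miners.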

In real estate, you buy land and improve its value. To Facebook, we’re an oil well from which they extract value.

And for the last decade, they’ve been monetizing our Property to make it rain….

It’s time we wake up.

We need to take specific and collective action, coordinating efforts to have Facebook update their terms of service to recognize that our data is our personal Property. We own it, control it, and should be the ones to profit from it.

Even though figuring out what “my data is my personal Property” means (and how we reduce it to practice and formalize it in terms of service agreements) is a massively complicated topic, it’s better we figure this out in the open market, not send it to Congress where it will get mangled and puppeted.

Following Facebook, let’s have the same conversation with Google, Apple, Twitter, Snapchat, Experian, Amazon, and thousands of other companies.

Which leads us to…the Data Crisis: the major market corrections that will ensue once individuals are granted ownership of what they rightfully own. It is as large as, or larger than, the 2007 Subprime Mortgage Crisis in size and scope. The second-, third-, and fourth-order consequences of a shift in data ownership to individuals will scramble economic and political power, concentration of engineers, revenue models, social norms, and incentive structures.

Prior to the 2007 global financial crisis, underwriting models assumed a prolonged drop in home values was not possible. This blind spot led to excess leverage and put extreme pressure on the very assets thought to be infallible. There are significant parallels in the market today: the assumption that your personal data will always be owned by the large data miners plays the role of ever-rising home values. Should you be able to take back ownership of your personal data, Facebook’s revenue potential will be significantly handicapped. The same will be true for Google, Twitter, Snapchat, Experian, Acxiom, and even Amazon.

The (Economic) Value and Mining of You

In monetary terms, how much are you worth? The traditional answer is your “net worth”: assets minus liabilities. But beyond what you think you own today, what if your data were added to your balance sheet? Specifically, what if you owned and controlled:

  1. Everything about you.
  2. Everything predictable about you.

1. Everything about you.

This includes what time we wake up, how we sleep, what kind of coffee we drink, what deodorant we use, what clothes we wear, how we drive, our writing style and vocabulary, the computers and smartphones we use, our internet activity, how we move our mouse, what we eat, how our bodies metabolize that food, our genome and microbiome — you get the idea: everything.

Facebook and other human data miners are in an arms race, scrambling to collect the most and highest quality data about you. And, the cat is already out of the bag — enormous amounts of data about you are already floating around, sitting as treasure chests of gold that others, not you, are seeking to capture and gain from.

Data is captured by our smartphones, credit cards, voice assistants, video surveillance, social media, email, messaging, reading devices, gaming systems, entertainment providers, fitness trackers², biosensors, and dozens of other technological mediums. We’re on a path to having our entire existence captured, sometimes on a microsecond scale and at a depth that has never been possible before.

The value of this data changes over time, which is good news for us if we own our data. Imagine a seemingly innocent piece of data: a 5-minute voice recording from 1988. What could we determine from it? The language spoken, approximate age, educational level, personality, and gender, among other things. Interesting, but not all that powerful. Today, the same data can be used to determine all those same things (orders of magnitude better) and even diagnose a person’s mental health. The data hasn’t changed; the algorithms just keep getting better.

The algorithms will continue to get unimaginably better on previously-existing data sets. What will the algorithms of tomorrow be able to tell about that voice recording from 1988? Whether you were telling the truth? Your personality broken down into 116 classifiers?

As we barrel forward into the future with little to no thought, we now have voice assistant devices in our homes, listening to EVERYTHING we say. Voice is an incredibly rich data set from which to extract insight and build predictive models.

Alexa or Siri are likely listening to some college kid somewhere, and that kid will one day be the President of a country. If that doesn’t give you pause, I’d invite you to reassess through a new mental frame: expect that no matter what Amazon or Google tells you, EVERYTHING is indeed being collected, and EVERYTHING you say can and will be used against you at the worst possible moment in your life, even if the company does no wrong and the data is simply hacked. Any other assumption would be naive.

2. Everything predictable about you.

After collecting EVERYTHING they can about you, Facebook and others sell your future needs, wants, and decisions to the highest bidder.

Facebook does not care if my 14-year-old goes to bed on time, does well in school, or is creating productive life habits. They do the exact opposite, systematically manipulating my 14-year-old based upon his predicted behavior to make more money at any cost. As a parent, this is infuriating.

What can be predicted with the data collected on us? What could Apple predict after combining the data from the accelerometer, GPS, our pictures, emails, phone calls, text messages, app usage, voice recordings, usage habits, heart rate, and exercise regimen? Many adherents to religion are used to being watched all the time by an omniscient God. Do we want our technology to enable the same opportunity for governments, businesses, criminals, and our fellow humans?

When Our Thoughts are Known

The right to own our own value has been weighing heavily on my mind lately because, as an entrepreneur, I’m building brain interfaces to read and write neural code. So is Facebook. If brain interfaces came online in our current environment, I’d be terrified.

Daily, I imagine the potential consequences of the tools my team and I are trying to create. A sad truth is that when a person, business, group, or government has information on someone else, they can be reliably expected to use that information to their advantage, and at the other person’s expense. The conflict is usually easily rationalized away (our brains are superb at that). We try to constrain such behavior with laws, regulation, and social norms, but exploitation still happens across social structures large and small. This is manifested in petty or harmful gossip, business strategies, and political campaigns. As much as I’d like to be positive, humans simply cannot be trusted. Exhibit A? All of history.

If men were angels, no government would be necessary
-James Madison

What’s harder to imagine and predict are the potential second and third order consequences of new tech in society. What are the many things that will unintentionally go wrong? I’ve wondered how deeply Steve Jobs thought about the intended and unintended implications his smartphone would have on the world. Did he foresee the uniquely powerful addictive tools his smartphone was giving psychology hackers? Did he contemplate other known and unknown negative externalities? Was it his duty to at least try? If he did see them and wanted to do something about it that lessened shareholder value, would he have been ousted?

Since its founding in 2004, Facebook has methodically exploited our psychological tendencies, aggressively pushing the boundaries of what we will and won’t give away about ourselves, and in the process mined our economic value. In doing so, it became a leading architect of our personal and societal privacy operating system. We cannot just walk into the future and let data mining companies like Facebook take advantage of our passivity when it comes to our value.

How did they architect their mining operation? Let’s hear from two former Facebook execs and then Zuckerberg himself:

Sean Parker, Facebook co-founder and President during the early years, reminisced in November 2017:

“It’s me, it’s Mark [Zuckerberg], it’s Kevin Systrom of Instagram, it’s all of these people, we understood this consciously, and we did it anyway…that thought process was all about how do we consume as much of your time, and conscious attention as possible. That means that we need to give you a little dopamine hit every once in a while because someone liked or commented on a photo, or a post, or whatever. That’s going to get you to contribute more content, and that’s going to get you more likes, and comments. It’s a social validation feedback loop…you’re exploiting a vulnerability in human psychology.”

Additionally, Chamath Palihapitiya, a Facebook user growth executive from 2005 to 2011, said in two different media appearances (November 10, 2017 at Stanford Business School, and December 12, 2017 on CNBC’s Squawk Box):

“We did that brilliantly at Facebook…exploiting psychology of mass populations of people…I feel tremendous guilt. I think we all knew in the back of our minds, even though we feigned this whole line of there probably aren’t any really bad, unintended consequences…we are destroying how society works…No civil discourse, no cooperation, mis-information, mis-truth…”

Chamath’s advice going forward: “If you feed the beast, that beast will destroy you. If you push back on it, we have a chance to control it and rein it in. I just don’t use these tools anymore…”

When news surfaced of fake news on Facebook and potential Russian meddling in the U.S. Presidential election, Mark Zuckerberg, given an opportunity to contemplate the intended and unintended consequences his company may have on the world, responded that such assertions were “crazy”. Was Zuckerberg’s comment a strategic PR move (that in the end backfired)? Or was it something worse: a lack of awareness, introspection, and openness about how his creation could affect the world?

It’s worth checking out Facebook on Russian Propaganda: from zero to 150,000,000.

VR, AR, and neural interfaces are coming next.

The Economics of Human Relevance & Survival

The Data Crisis, resulting from our data becoming our personal Property, is critically important to the future of the human race in that it highlights the most important economic metric in the world: the return on investment (ROI) of human ability.

With autonomous cars, the objective is not to make Uber drivers better — it’s to replace them. With radiology image-recognition software, the objective is not to make radiologists better — it’s to get rid of them.

When Facebook builds digital models of you to sell to advertisers, the purpose is not to make you better; it’s to maximally consume your attention so they can make more money.

The co-evolution of human and artificial intelligence is currently underway, and it is the most consequential development in our known universe. Whether we can solve the problems that threaten our existence, whether we can be relevant in the age of rapidly accelerating digital intelligence, and whether we can build a future we care to live in all depends upon our ability to co-evolve with AI. From an economics perspective, the value of AI is that it is cheap, efficient, and can scale and improve incredibly fast. Conversely, it takes 33 years to generate a single PhD.

The hard reality is that digital intelligence has several key advantages over humans in our current economic system, creating an insatiable appetite for capital and investment in these technologies. As a result, the difference between digital and human speeds of development and investment returns gives me pause.

In short: The economic incentives to improve humans are lessening — and quickly.

Facebook is a primary contributor to this by treating us like oil wells, not real estate to be improved.

Yes, humans are remarkable. Our accomplishments speak for themselves. We are still the most formidable general intelligence in the known universe, and data suggests the world is getting better in many ways.

However, over a long enough time horizon, in a world driven by return on investment, it’s possible that we have designed the perfect system to march ourselves into irrelevance as the compounded rate of returns on digital intelligence will make humans an increasingly unworthy investment.

We need to incentivize investment in human improvement by making ourselves real estate to invest in, not an oil well to be mined. If we own our own value (our data), companies can be incentivized to make money when we improve, instead of making us the worst possible versions of ourselves. Having data as our Property could lower the cost of human improvement while increasing investor returns, encouraging even more capital investment into direct human improvement.


Sitting on the other side of our own improvement may just be the answer to all our other problems.

Join me and #OwnYourData


Footnotes

[1] ~$500B market cap ÷ 2.2B users ≈ $227 per user

[2] Just recently, Strava data revealed the locations of US military bases. Strava didn’t imagine their anonymized data being used this way — guess what: if anonymized neural data is suddenly made available, who knows what a clever algorithm will discover?

[3] Malware covertly jumps air gaps using built-in mics and speakers.