Three Ways “Data For Good” Can Make All Data Better

Public Democracy
10 min read · Oct 19, 2019


“Three Ways ‘Data For Good’ Can Make All Data Better” first appeared in abridged form as a featured article on B The Change, B Corporation’s official site to “inform and inspire people who have a passion for using business as a force for good in the world.” Its publication coincided with a joint announcement by Public Democracy and LiveRamp, the nation’s leading data marketplace, that Public Democracy had become LiveRamp’s first B Corp partner.

Most of the data that companies gather on us is being used to get our attention and sell us things. The more companies know about what we’ll look at and buy, the easier it is to accomplish those goals. They’ve found increasingly effective ways to predict such behavior by having AI learn from data profiles about our physical characteristics, income levels, purchasing patterns, and the like.

Public Democracy has taken a different approach to what matters with data. Over the past decade, we’ve supported a wide array of civic, service, political, advocacy, and community building initiatives. Our Values Data™ is a collection of millions of points of connection and meaning, representing the moments when someone committed to join a cause or took time to learn more about something that mattered to them. From sharing a hopeful documentary or an inspiring political ad to serving at a soup kitchen or signing a petition advocating for others, each moment is a reflection of dreams, empathy, and agency.

By compiling those moments, we have learned about more than what people will buy. We’ve gained a deeper understanding of what motivates individuals to seek the common good and what they care most about. When data measures commitment over attention, reflects people’s beliefs over what they’ll react to, and prioritizes what matters over what sells, it provides a much better picture of who people are and what they really value.

Values Data is more effective for digital engagement and targeted marketing, primarily because it encourages the advertiser to meet the user’s needs rather than the other way around. It also teaches the data-driven AI systems that shape so much of what we see, and whom we connect with, entirely different lessons about what people care about.

Every single moment of commitment and contribution paints a much truer picture of who we all are; taken together, this data can teach algorithms to understand humanity in deeper and more meaningful ways, too.

As a Certified B Corp, Public Democracy is on a mission to align what is right with what works. And today, as we bring to market one of the largest datasets ever designed to achieve B Corp principles, we wanted to share a few lessons about how Data For Good can make all data better.

Lesson 1: Origins matter when it comes to data

When I was in college, Mark Zuckerberg was busily writing the first code for FaceMash, a precursor to thefacebook.com. He modeled his initial premise on HOTorNOT.com. The first version of Facebook (a corporation with over half a trillion dollars in market cap today) was designed to display photos from Harvard’s freshman class directory so users could choose who was more attractive. An early iteration of YouTube was also based on HOTorNOT.com; now the world watches YouTube for over a billion hours a day.

The code that powers the tech giants that have created incredible value for consumers around the world — and especially for their early investors — got started with, well, hormone-fueled pseudo-pranks. They were first designed to drive attention by feeding the impulses and insecurities of those judging others and those who kept checking to see how they were judged.

This is not a good thing, because origins matter when it comes to data. Back in the mid-2000s, most tech founders were asking, “What will people click on and what will keep them coming back?”

That remains their focus today.

Eric Sapp, the President of Public Democracy, intentionally took a different approach. He built Public Democracy’s data, from the very first byte, to discover, “What do people care about and what will they do together?” Trained as a pastor, Eric knew that people are best understood by their values and best engaged through empathy and listening.

Inspiring someone to act — as Eric and his team did on behalf of Oxfam, Bread for the World, and other impactful organizations — starts with understanding how to provide them with something they value. That creates better results, and it yields a different kind of data. When AI systems learn from those moments, they come to see people not just as consumers or purchasers, but as complete (and complex) individuals.

Lesson 2: Better data will train more meaningful algorithms

Early Facebook investor Sean Parker recently said that what originally motivated Facebook was finding ways to “consume as much of your time and conscious attention as possible.” When they released the like button and other similar features, Parker said the goal was “to give you a dopamine hit.” He explained, “that’s going to get you to contribute more content … [It’s] exactly the kind of thing that a hacker like myself would come up with.”

Netflix’s CEO once told investors that he does not consider Hulu or Amazon Prime to be Netflix’s greatest competitors. Instead, he said the real rival is anything else people do other than Netflix to “unwind, hang out, connect.” He went on to brag that “we actually compete with sleep. And we’re winning!”

For Public Democracy, each data point represents more than an attention-getter. Every engagement in our database reflects a point when a person decided that something we were offering them could make a difference and that their participation mattered. We built our database as a way to harness the inherent value in each of those moments — as a repository of intent, hope, and communal engagement by tens of millions of individuals over many years.

Values Data creates more effective digital engagement opportunities today, but it’s far more than that. What we train AI on right now will determine what is possible with tomorrow’s technology. As MIT’s Hala Hanna and Vilas Dhar of the Patrick J. McGovern Foundation have pointed out: “bias in, bias out.”

Most of the data out there right now is teaching AI that what we purchase, or the clickbait that catches our attention, must define who we are and what we care about. (In fact, most data categorized as “Interest” is entirely a reflection of what people buy.) Imagine a child who grows up in a household where the “good life” means consumerism, selfies, and the best burns on social media. That child would develop a very skewed idea of what matters in the world, but that is largely the data environment our machines are learning in currently.

We’ll all be much better off if AI learns more about humanity than what purchasing patterns and attention metrics alone can teach, because these algorithms will control so much of what we see and connect to in the future. But for AI to learn different lessons, we need alternative data input. As Hanna and Dhar rightly warn, “Algorithms are only as good as the data that train them.”
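The “bias in, bias out” point can be made concrete with a toy sketch. In the hypothetical example below (all topic names, logs, and numbers are invented for illustration, not Public Democracy’s actual models or data), the same trivial “interest” model is trained on two different records of the same person: one logging attention (clicks and impressions) and one logging commitment (petitions signed, volunteer signups). The model’s conclusion about that person flips entirely depending on which data trained it:

```python
from collections import Counter

def train_interest_model(interactions):
    """Toy 'interest' model: ranks topics purely by how often
    they appear in the training interactions."""
    counts = Counter(topic for _, topic in interactions)
    total = sum(counts.values())
    return {topic: n / total for topic, n in counts.items()}

# Hypothetical attention-driven log: dominated by clickbait impressions.
attention_log = (
    [("user_1", "celebrity_gossip")] * 8 + [("user_1", "local_food_bank")] * 2
)

# Hypothetical commitment-driven log: the same person, but recording
# actions taken (petitions signed, volunteer signups), not clicks.
commitment_log = (
    [("user_1", "local_food_bank")] * 4 + [("user_1", "celebrity_gossip")] * 1
)

attention_model = train_interest_model(attention_log)
commitment_model = train_interest_model(commitment_log)

# Identical algorithm, identical person, opposite "top interest" --
# the only difference is what the training data chose to measure.
print(max(attention_model, key=attention_model.get))    # celebrity_gossip
print(max(commitment_model, key=commitment_model.get))  # local_food_bank
```

The algorithm here is deliberately trivial; the flip in its output comes entirely from what the training data measures, which is the sense in which “algorithms are only as good as the data that train them.”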

Unlike most other data at work in the marketplace today, Public Democracy’s Values Data will teach AI what people commit to when engaging with more compassionate “parents,” such as Oxfam and Bread. Data focused on commitment, values, service, and community tells a more holistic story of what people care about, improving the data-driven AI systems that control so much of what we all see and whom we connect with. We’ve already used Values Data to train AI-driven behavioral models to help veterans with severe PTSD, develop new ways to identify and engage those struggling with addiction, and improve market intelligence about the needs and priorities of underserved communities.

We think the time is right to bring this data to market, because using technology to connect with people who believe their actions can matter is needed now more than ever.

Lesson 3: Better data and more meaningful algorithms will be more purposeful and profitable

You should watch “The Great Hack” if you haven’t already. This captivating David (Carroll) versus Goliath story is an entertaining, albeit terrifying, docudrama about Cambridge Analytica’s sinister approach to data. The truth is, Facebook wasn’t hacked — America was. Cambridge collected as much data as it could in order to profit from people’s information without their knowledge. That strategy was, in so many ways, the exact opposite of Public Democracy’s approach to business and the impact we create.

Cambridge’s bad-for-the-world approach, as it turned out, was also bad for business (causing them to file for bankruptcy while Facebook’s market value dropped by $50 billion). Last quarter, Facebook paid a $5 billion fine, the largest civil penalty in a data privacy case in U.S. history. Mark Zuckerberg took out full-page apology ads admitting “a breach of trust.” Ominously, he warned that his platform could still be used “to interfere in elections, spread misinformation, and incite violence.” Since then, first Instagram and now Facebook have begun experimenting with hiding like counts to foster “a less pressurized environment.”

It’s not surprising that data initially built on the objectives of HOTorNOT would ultimately be used to sow fear, isolation, and judgment. A company like Cambridge, with no scruples and seeking profit at any cost, simply followed the path paved by the data sources that already existed in the marketplace — data shaped by impulses and insecurities.

Chris Hughes, Zuckerberg’s Harvard roommate and a cofounder of Facebook, recently recalled, “I remember a ton of conversations in which the introduction of our tools was compared to the advent of the hammer, or the light bulb. We could have compared it to a weapon, too, I suppose, but nobody did.”

Because they ignored those inherent risks, we’ve all paid the price.

But there’s hope. Data built from empowering people to join together in service isn’t especially effective at sowing division and isolation. It does, however, have immense potential to give people agency and connect them in more meaningful ways.

Zuckerberg and Hughes are correct that data is a tool, but as I’ve heard Eric say to our team many times:

“Data is a tool, and any tool can be used to help or hurt. But some tools are better suited for one use over another. A rifle and a hoe can both be used to kill someone or to dig a furrow for planting crops. But each is better at — and more likely to be used for — one use than the other.”

That is why over the past several years, Public Democracy’s Values Data has powered award-winning civic engagement around bipartisan policy solutions and trained groundbreaking AI solutions. Those efforts are now attracting some of the biggest corporate players in data, as well as government agencies tasked with tackling our most pressing social challenges, to create new partnerships that will depend on the unique understanding that Public Democracy’s AI systems generate. We are confident that, together, we can chart a better course — and we hope we’ll have some tailwinds to propel us.

Stewarding data in the Silicon Age

Last month, 181 CEOs signed a letter that probably made Milton Friedman roll over in his grave. Going against the tenet that the sole purpose of business is to maximize shareholder profit, they supported the Business Roundtable in “modernizing its principles on the role of a corporation,” affirming that business practices should benefit multiple stakeholders, including employees, customers, local communities, shareholders, and the environment.

This position echoed a similar letter last year from the CEO of BlackRock, the world’s largest asset manager. Larry Fink wrote:

“Without a sense of purpose, no company, either public or private, can achieve its full potential. It will ultimately lose the license to operate from key stakeholders.”

That message resonated loudly, to the tune of $6 trillion under management.

But while the tectonic plates are now moving beneath us, let’s be clear that those shifts are only possible due to hundreds of tremors caused by the B Corporations that have advanced “business as a force for good” since 2006. That’s what Public Democracy stands for.

Our data doesn’t just make us different (which even Cambridge Analytica claimed). At Public Democracy, our data makes a difference. That’s what our company is built to do. As a B Corp, we have a fiduciary responsibility toward all of our stakeholders and greater accountability internally and externally for our actions.

Our profits feed our purpose; our purpose feeds our profits; and the flywheel turns.

Imagine if all early algorithms had been built from Data For Good, with more meaningful knowledge about all of us, rather than being designed to exploit our insecurities and impulses. Perhaps in such an environment, even a profit-at-all-costs company like Cambridge might have been directed down a different path. We’ll never know what might have been, but now we can choose what will be.

We hope other tech companies will join us by finding new ways — through better data, and the more meaningful AI it creates — to empower individuals, improve communities, and advance the common good. They will be right, and smart, to do so. If the Gilded Age has become the Silicon Age, and Standard Oil has been succeeded by Big Data, then it’s our duty as stewards of that data to do right by people as well as earn a profit.

We think that’s just better business, in both senses of the word.

Rob Lalka is chairman of the board of Public Democracy, Inc., and Albert R. Lepage Professor in Business at Tulane University’s A.B. Freeman School of Business, where he serves as Executive Director of the Albert Lepage Center for Entrepreneurship and Innovation.

Read more about Public Democracy’s partnership with LiveRamp and how Values Data can help counter the opioid epidemic in Grit Daily.


Public Democracy

Values Data™ can change our world. We believe in aligning what is right with what works by understanding data differently.