Building frameworks, setting standards for ethical data use: Our conversation with Natalie Evans Harris

Katherine Johnsen
Published in fwd50
6 min read · Jan 28, 2019

Natalie Evans Harris has dedicated nearly 20 years to advancing the public sector’s responsible use of data. She was the Obama Administration’s Senior Policy Advisor to the US Chief Technology Officer and led an analytics development center for the National Security Agency.

Today Natalie is the Co-Founder and Chief Operating Officer of BrightHive, a technology company that develops smart data collection, integration and governance products.

Natalie on the FWD50 stage last November.

Leading up to her sessions at FWD50 2018, we sat down with Natalie for a conversation about the process of setting standards for ethical data use.

How does your background in government inform your community-focused approach?

I grew up in an environment where data had to be collected and used to drive decision making.

The culture that I grew up in was concerned with how to be responsible with that data. When I had the opportunity to exercise my sociology degree by going over to the White House, I got to see the other side: Why there are so many challenges to using data effectively and responsibly.

It happened to be during the Obama administration, when data became a really big priority. How do we get the government to use data to drive decision making? At the same time, we were starting to see an increase in the number of breaches happening across agencies, across major private sector organizations, healthcare companies, you name it.

This raised the question: What are the implications of using data to drive decision making? What are the implications of using people’s personal information and not having the capacity to be able to protect it? It’s not just about protecting data, but about how to protect people and the services that government is delivering to them.

How can we make sure that data is being used and shared for the betterment of society?

What’s really exciting, especially with the adoption of GDPR, is that the conversation is bigger than technology. Responsible data use is not about finding the right tool. Responsible data use is about the culture that needs to be put in place to ensure that there’s trust between people and government. How do you protect that trust and do your due diligence with that trust?

Responsible data use isn’t about finding the right tool. It’s about the culture that needs to be put into place.

We’re starting to see more data ethics conversations around that trust-building: the culture, processes, and policies that need to be put in place. Before this point, ethics was always a privacy and security conversation. It was always, “If we wanted to protect data, then we siloed it off.” We put it over in this database or repository and it should never touch anything else.

We’re now recognizing that data ethics and responsible data use is not just about siloing data off and minimizing its value. It’s about being responsible and transparent about how people’s data is being used, then putting policies in place to show that you’re being intentional in protecting that data.

Right. A lot of that policy is left up to the individual organization.

You’ll see right now with GDPR that there are some organizations taking it not for what it explicitly says, but for what its intentions are. We’re starting to see more businesses put out policies that show they value not only delivering a key service, but also how that service is delivered.

You’re starting to see more manifestos come out, not just about what companies are doing, but about how they’re using data. You’re also seeing companies on the other side that read the GDPR and treat it as merely a legal exercise.

What I’m interested in seeing in the future is: Will the different approaches actually start to affect companies’ bottom lines? This is where people have a voice in the conversation. If you don’t like the way a company runs its business, you can walk away. You can put your money toward someone who is being responsible and ethically minded and using your data in the right way.

How can people educate themselves about what it means to share their data?

As individuals we need to educate ourselves as to what our data is and what it means for it to be shared and used. The answer isn’t that data shouldn’t be used. The answer is, “How do you make sure that both parties understand what’s happening?”

It’s really not that different from the relationship between a doctor and their patient. I trust the doctor because I know the education that they’ve had and I know that they’re held to a standard of “do no harm”. There are repercussions for them if they do harm.

This doesn’t exist with personal information yet. When I talk to companies, I ask, “It’s great if you have a manifesto, but if somebody violates the terms, what happens? How are you making sure that those principles are adopted and ingrained in the way you collect data, the way you design your product, and the way you share and use data?”

There’s an accountability part here that needs to be addressed before we can truly start to be ethically minded and responsible in the use of data. There are two sides to it. It’s a relationship.

Do you have any examples of how organizations are leading the cultural shift to ethical data use?

I joined with Bloomberg and Data for Democracy to start an initiative that built a crowdsourced community around defining, agreeing to, and evolving principles for responsible data use. We brought on about 100 volunteer data scientists to lead working groups that helped define a set of principles designed by the community, for the community. It grew to about 800 data scientists in the span of four months.

Now it exists as a formal body within Data for Democracy. It defined these principles and has been able to look at what tools are available to help data scientists apply these principles.

There was one group that used the principles to produce an ethical design checklist. That’s amazing. Now data scientists have a checklist they can use as they go through the process of collecting, processing, and analyzing data, and creating a tool based on that data. They have a process that helps them be ethical and thoughtful.

Do you foresee an ethical data standard, such as an ISO certification, that organizations can earn to show that they’ve followed the guidelines?

That’s exactly what I’m talking about. How do we get to the point where we can produce ethical frameworks? How can we get to the point where we can produce standards that can be agreed to and adopted on an international level?

GDPR is the first step toward that. There’s a lot of work that still needs to be done, but organizations such as the Institute of Electrical and Electronics Engineers (IEEE) and a body out of the United Nations have started to look at this as well. We’re going to start seeing more movement toward creating these standards, probably in the next three or four years.

This conversation wouldn’t have happened three or four years ago because the private sector was completely opposed to any type of regulation or standard, anything that might control their Wild West. Now they’re starting to see that the train is coming. They can try to be a part of defining where that train is going, or they can get run over by it.

FWD50 returns in November 2019.

Katherine Johnsen
Editor for fwd50

Director of Partnerships + Marketing for @startupfest and @fwd50. Sneakerhead. Carry-on queen. Catch me on the gram @beachykj.