5 Minutes with Laura Norén

“I wish more people knew how exciting and positive data science ethics is.” We catch up with ethical mastermind and postdoctoral fellow, Laura Norén!

NYU Center for Data Science
Apr 3, 2018 · 4 min read

Laura Norén is an Adjunct Professor at NYU and a Moore-Sloan postdoctoral fellow at the NYU Center for Data Science. Her research focuses on the impact of design on social behaviors, particularly within the context of technology. Most recently, her focus has turned to the ethics of data science, and her class on the same topic was covered by the New York Times. She is also the Managing Editor for the weekly Data Science Community newsletter.

1. Your work in data science ethics is becoming increasingly important to the scientific community. What are some key issues that you are excited to focus on this year?

Many professions, including business, journalism, and engineering, have long offered their members ethical training and codes of ethics. As data science becomes professionalized, there is increasing attention to teaching courses and developing codes of ethics that address the specific needs of the discipline while drawing on the wisdom of adjacent fields like computer science, as well as fields like moral philosophy that are not always considered adjacent. I’m very excited about the way this type of academic cross-pollination can accelerate data science towards a thoughtful, beneficent role in organizations and society.

2. Do you think that what we consider to be ethical — whether ethical behavior or ethical algorithmic design — can change over time? If so, how would this affect the larger goal to make the scientific community a more ethical space?

One of the great lessons we’ve drawn from science and technology studies is that science, like everything else, is context dependent. What kinds of information consumers, users, and employees are allowed to keep private from governments, companies, and, yes, even scientists depends dramatically on national context.

We all think we know what privacy means, but in fact, the concept varies quite a bit across national contexts.

Americans tend to be more willing to allow themselves and their information to be tracked, stored, and used to drive decisions about the experiences they have in the world. Continental Europeans are more invested in limiting the reach of institutions into their personal lives and behaviors. The EU’s General Data Protection Regulation takes effect in May 2018 and imposes much stricter rules, which presents major challenges to the way global companies conduct data science.

Still, the scientific community should not adopt an arrogant attitude towards privacy and assume that our ability to reproduce results is always more socially beneficial than our research subjects’ rights to delete or conceal their data.

3. What are some of the most memorable ethical dilemmas that have come up in the field in the last few years? Have these problems been solved yet — and, if so, what did you think of the outcome?

We are working hard to figure out how to make as much data as possible publicly available without revealing users’ identities or contributing to outcomes that do not advance the public good. There are exciting efforts to share clinical trial data in medicine, for example, or to share government data with scientists and the public. Yet we still don’t see commercial entities joining the data-sharing movement.

Within companies, the biggest questions are: How much data should be collected? How long should it be stored? And what types of insights should be drawn from it? Right now, we have a lot to worry about when it comes to predictive policing, but I think too few people are aware that the Minority Report scenario (where data predict someone will commit a crime before it happens, allowing people to be detained while still innocent) is heating up within predictive HR.

One other area of research that is quite novel concerns how humans are going to share decision making with semi-intelligent machines. As the machines get more advanced, it will really feel like we are sharing tasks with them, somewhat like the way we now share the task of navigating with GPS-enabled devices. Such tools may give people more confidence to explore new places without worrying they’ll get lost, but they may also atrophy our ability to navigate without that kind of assistance.

Figuring out how to get as many benefits as possible from these shared human-and-algorithm processes without losing the elements of life, discovery, and learning that we care about will be a growing area of research. What happens when our cultural products are recommended by Netflix and Spotify? Do we expand our taste horizons or amplify our starting tastes?

I call these questions the cultural economies of data.

4. What is something that you wish more people knew about data science ethics, and the work that you do?

I wish more people knew how exciting and positive data science ethics is. The class I’m teaching now is the BEST class I’ve ever taught, close to the Hollywood ideal of the Ivy League college course where everyone contributes, students challenge each other productively, and people are open to changing their minds. We are working on open questions, searching for ethical frameworks we can adapt to data science, and it feels good to contemplate how we can build the most beneficial, least harmful future.

NYU Center for Data Science
Official account of the Center for Data Science at NYU, home of the Undergraduate, Master’s, and Ph.D. programs in Data Science.