As technology advances, how do we avoid losing touch with our values?

Image: REUTERS/David W Cerny

Jennifer Kuzma, Distinguished Professor in Social Sciences, North Carolina State University

Could we grow corn without using pesticides? Is it possible to wipe out disease with terminator mosquitoes? The technology already exists, but what are the ethical consequences of deploying it? Jennifer Kuzma, Distinguished Professor in Social Sciences at North Carolina State University and a member of the Global Future Council on Technology, Values and Policy, explains why we need to approach technology development with our core values in mind, and how we can draft policy to reflect those values.

You are a member of the Global Future Council on Technology, Values and Policy. But what does looking at those three concepts together entail?

When we make decisions about technology, we like to think it’s all about the science, but there is always some level of values involved, and how we address the ethical questions will have an effect on how we draft policy and regulations.

Sometimes technologies are made for a particular purpose, but they don’t necessarily fit the societal context, or they have long-term impacts that we cannot foresee at the time of deployment. How can we be responsive and develop technologies that help address societal needs, taking into account the values of the communities in which those technologies are embedded?

What would be an example of where that has happened?

There is a certain type of genetically engineered corn that requires less pesticide to grow. That works for large-scale commercial farming in the USA, where it was developed. However, in certain regions of Mexico, indigenous groups have developed their own varieties of maize over thousands of years. Those varieties are already adapted to the micro-environments in which they live, and they carry important cultural significance.

So when a company comes in and tries to sell genetically engineered seed, it’s not the variety that these people want or need. Companies want to deploy a technology, but often in a context that doesn’t really fit. So what ends up happening is that farmers may grow a little of the corporate corn for export, but they keep growing their own varieties for local consumption.

Old models of development often involved people from developed countries going in and trying to fix problems in less developed countries. There are a lot of ethical as well as technical problems with that approach. Now we are seeing more efforts in which people work in partnership with local experts and practitioners to take into account the needs of those in the area.

What are some of the key trends that we should be aware of?

Bottom-up approaches to technology development are emerging, and I see them becoming more common. These efforts empower citizens by giving them a voice in technology development.

Right now, mosquitoes are being developed that could help fight diseases like Zika and malaria. For example, there is a genetically altered mosquito that, when it mates with a wild one, produces larvae that don’t survive. Eventually, the number of disease-carrying mosquitoes in the environment goes down. In some programmes, citizen engagement has been given priority prior to deploying these engineered mosquitoes. In the United States, people in the Florida Keys will be voting on whether these mosquitoes should be released. In other countries, technology developers have been engaging citizens in discussion about the engineered mosquitoes, asking for their opinions and input.

That’s in contrast to other projects in the past where the mosquitoes have been deployed without citizens in the area even knowing about it.

I do see some very positive trends towards an embedded, bottom-up approach that engages people in making decisions about emerging technologies, so that their values are at least considered.

Image: REUTERS/Paulo Whitaker

What is your Council contributing to the global conversation?

There are several reports and academic papers that examine these issues. We could compile the various principles that have already been proposed, add to them where necessary, and draw up a general framework for responsible technology development that could fit a variety of global or regional situations.

So I would like to see us developing principles, guidelines and best practices. You might not be able to predict the risk or all the impacts from emerging technologies, but you can at least follow a governance process that solicits and takes into account people’s values to increase legitimacy.

What other kinds of technology could be affected by these guidelines?

All kinds, really. Social robotics is a really important area. How far do we really want to go with artificial intelligence, especially as it comes closer to possibly replacing humans in the workplace?

Nanotech is also a big question: deploying machines you can’t even see raises its own ethical questions.

There are lots of other places as well: human enhancement, use of drones, the control and safety of the internet. Even global energy solutions and climate change will have value questions as we find new technological solutions.

We are at an inflection point where these technologies are converging and we are able to fundamentally alter human beings, social systems, and the ecosystem.

Values are defined very differently throughout the world. How does that affect the process?

I think that’s one of the hesitations people have about explicitly integrating values into technology. We tend to make decisions on utilitarian grounds: do the benefits outweigh the risks? We tend to think of that as very straightforward and universal, but it’s not.

You can’t avoid values, yet in risk/benefit analysis we try to pretend they aren’t there. Somehow, though, people are more comfortable with that than with acknowledging that they hold a cultural value and understanding its place in decision-making.

Where do you think values in technology will be by 2030?

Hopefully, we’ll have more societal conversations about where we want to invest our resources in technology development to solve global problems, and the results of those conversations will feed into decision-making.

I’m hopeful that we’re going to get better at not just imposing technologies on people or cultures. We’re going to get better at engaging communities and empowering them to make decisions. Right now the power is pretty concentrated but I see that changing.
