Ethical Deep Dive in the Dutch Senate

Ethics require serenity

Ronald Mulder
Mar 15, 2019 · 14 min read

By Krijn Soeteman (@krijnsoeteman) and Ronald Mulder (@ronaldmulder)

The atmosphere in the Senate is different from that in a normal debate room. Everyone present feels this and behaves slightly differently than usual: an ironed shirt or a jumper instead of the usual t-shirt or hoodie, whispering instead of talking out loud, while an audience from times long gone watches us from the old paintings on the ceiling.

Image: Rutger van Zuidam via Flickr

This is clearly not an ordinary tech conference. It is the ethical deep dive in preparation for the Odyssey Hackathon in Groningen, which starts in mid-April. The participants in this event on Friday, March 8, are here to talk about ethics: moral principles that control or influence a person’s behaviour, according to the Oxford Dictionary. Ethics, all connected with beliefs and principles about what is right and wrong.

Our host for the afternoon is Mr Lykle de Vries, a well-known advocate of blockchain sanity. After pointing out the grandeur of the room, he asks all participants whether they are sitting next to someone they know. If so, they have to move over and find someone they don’t know, in order to broaden everyone’s vision.


The main takeaways from the deep dive can be found at the end of this article.

Ethical compass

“We are gathered here to calibrate our ethical compass”, Lykle tells us. “We hope people go home with great questions which can guide you. Consider implications of your work. We are not here for debate, we’re here for dialogue.”

The first speaker is a representative of the Senate, Ms Pia Lokin. She takes us briefly through the history of the location and the Senate. The room itself is the oldest parliamentary meeting room in the European Union, and the first to use iPads. The old ink jars stand unemployed on the benches in memory of the past, their lids glued shut.

Lokin’s lecture touches briefly on many important events in the Dutch political system, such as the impact of Thorbecke, a 19th-century politician who thwarted anarchy and revolution by demanding more influence for the people in the constitution. The Dutch king William II was smart enough to meet the demands.

Enough Dutch history, from a state run by nobility to some form of democracy. Lokin brings us back to our times, talks about the importance of internet access for everyone, and speaks about some huge internet companies that are too powerful and use their power to make addicts of us all.

Founding father

Following Lokin’s speech, Lykle introduces Rutger van Zuidam, founding father of the Odyssey Hackathon, and calls him ‘possibly the Thorbecke of this era’, though he quickly adds he’s ‘just joking’. Kidding or not, it shows we’re looking for a bigger story for the future, one that does not dwell in the past.

Rutger explains where the idea of adding an ethical deep dive to the hackathon preparations came from. One important eye-opener for Rutger was the appearance of our next guest on Dutch television in a show called Summer Guests (Zomergasten): Marleen Stikker. She planted a ‘big flag on ethics’ in Rutger’s mind.

After Marleen, we’ll listen to Aron van Ammers about running an enterprise. Ethically. Alexander Rinnooy Kan, former member of the Senate, will close the afternoon.

Rutger hopes to create a new story with the upcoming hackathon. “The internet gave us a world with a multitude of truths”, he says. “The fabric of our society and how to include biosphere, include relations and include identity. At Odyssey we are creating that new story. No one tells us what to do, we cook it up and should have an open discussion about it.”

Marleen Stikker

Without further ado, Marleen takes the stage. The rules of the day are simple: seven minutes to talk, some time for reflection for the audience, questions and a reply by Marleen. Then a second round of contemplation, followed by questions and a final seven minutes.

Image: Robèrt Guérain

The most important takeaway of the day is: humans are biased and technology is man-made, therefore technology is biased. All technology is a cultural artefact, reflecting values and attitudes. Marleen emphasises this by telling the public she would not have been allowed in the Senate 100 years ago because she’s a woman.

Technology is created by humans, is biased, is not neutral and ultimately technology is culture. “We are all cultural workers”.

“What are we optimising for?” asks Marleen. As an example she gives digital identity, where we humans are external agents of a system: we optimise the system and have to behave according to it. The moment you decide what to optimise a system for, you have already chosen a certain bias. You choose to optimise for the owner of the system.

The question is not ‘can we optimise for the future of humanity’, but ‘what are my presumptions?’

This brings us to the biggest challenge, namely how we teach technology. “People think you have to be very good at math to be a beta [the Dutch term for someone in the exact sciences], so those who are not that good at math write policies. Those two seem like separate worlds. Both should be more humble and share knowledge.”

The bottom line is that when you don’t understand a thing about technology, you no longer understand our society. People who are not techies cannot be naive about tech anymore. Marleen ends her first seven minutes with a plea for more inclusiveness in the space: “The more inclusive the space is, the better”, which sounds more than reasonable, especially on International Women’s Day.


After a minute of contemplation, the audience is invited to ask questions at the interrupt microphone in the middle of the room. The microphone has an interesting feature: you have to keep a button pressed to make it work. This results in short, well-formulated questions.

The first question Marleen answers is the final one asked: ‘what should we do when people cannot be eloquent about technology?’ She adds a question of her own: what are we optimising for? “We optimise for the people; we should be able to trust the creators, as with food in the supermarket”, Marleen says. “We can be sure the chicken is fresh. That’s because of the rule of law. There is no rule of law for tech. Not every citizen needs to be fully tech literate, but they should be able to trust the law, and that tech companies abide by the law. We should put dignity back in technology.”

Another question addresses how we quantify our databases and how we recognise equivalence between each other. Marleen sums up the four categories we commonly define these days: heterosexual, homosexual, bisexual and transgender. Then she states it would be easy to make up 20 definitions instead of four. This shows how much power the person who builds the database actually has, and it shows this is not a technical debate but a cultural one.

‘Should ethics be in everything we do?’ can only have one answer: ethics should be embedded in the process. This is the principle of ‘privacy by design’.

Next up: the huge regional differences on our little blue planet. “You cannot take cultural differences out of technology, as tech is inherently a social construct. China has a social justice system where what the state says is good. In Silicon Valley technology is god, and in the EU we constitute ourselves through society. It is a huge challenge for Europe to come up with a next-generation internet that is more based on the commons principle.”

‘Should machines have rights?’ For Marleen it’s very clear: no. She does not believe in the narrative where machines are smarter than us and robots take our jobs. “It’s humans using robots. It’s companies using these robots taking our jobs. The moment you see this, the whole narrative goes away and is only there to scare us. Therefore machines should not have rights. Animals should have rights.”

Marleen sees a clear narrative: humans are messy, technology is not. “It enriches us when we realise technology does not solve all our problems. Problems don’t go away when we think we solve something with technology; we add problems.” She quickly states she’s not against technology but critical of it, while loving it at the same time.

‘Do we need a deeper understanding of technology’s influence on society?’ Marleen sees many positive signs in society because of events such as the Cambridge Analytica scandal. It opened the eyes of politicians and many other groups in society. It showed clearly that social networks aren’t as nice as they appeared to be in the beginning.

Marleen’s replies clearly stirred things up in the audience, as a queue quickly formed for the second round of questions. As many of those questions build on questions and remarks from the first round, it is easier to summarise them.

Equality makes for better technology

One of the questioners suggests that men may generally be better than women at certain types of work. This remark goes against Marleen’s ideas: that, too, is a cultural phenomenon. She switches to identity as one of the most difficult subjects, and thinks everyone has many more sides, such as men having a feminine side.

Identity is more than just a type of gender or colour of your skin. It’s also the fact that you are a person. “Sovereign identity is a question of who defines you as a person. I don’t need a state to tell me I’m a person”, says Marleen.

“The ruling parties in this country [The Netherlands] are very white and male. This is not okay for anyone. But it all starts with education”, Marleen says.

Education is also an important part of the centralised world of (computer) technology. Marleen points out that a ‘decentralised’ blockchain is much more centralised than we might want to believe: not because of the number of nodes, but because knowledge of how the technique works is concentrated among the few who understand it. One solution to this problem is demystifying technology by using more common terminology.

For the second time she adds that technology in itself is biased, which is not necessarily a bad thing, as long as you are aware of it.

Aron van Ammers

“Founding a company is a venture, but the mythical ethical venture, does it exist?” Aron starts his talk after the coffee break. “Most don’t start from an ethical point to set out the journey”, he continues. “In general people start a company because they see a business opportunity, and it’s easy to see what is not ethical, like Uber, Facebook and the like.”

He stresses the ‘sharing economy’ turned out not to be there, at least not within Uber and AirBnB.

Image: Robèrt Guérain

Whether or not you decide to use artificial intelligence in your company or business, you should always be aware of the biases of this type of software. Aron gives an example of an AI that turned out racist and sexist because of the data used to feed its algorithms.

He jumps quickly from subject to subject, touching lightly on many different problems and solutions, from blockchain as a supply chain solution to ransomware hijacking your files, showing that the same idea can be used for good and for evil.

A venture, to Aron, is a way to do something bigger than what one human alone can do. If done well, the business model is sustainable and the business can grow. The question you should ask is: have all founders of such businesses started out with ethics in mind? Probably not. Eventually, though, ethics should be reflected in a business.

His final part deals with decentralised autonomous organisations, or DAOs: crypto-networks that exist only as software. These types of organisations can be used for good and for evil. How can we make sure such systems are ethical? He doesn’t really answer that question, but invites everyone to think about it.

Question time

Aron answers the questions grouped by the themes that formed during the round of questions. As we are talking about companies, he uses his own firm, Outlier Ventures, as an example. He says, for instance, that they have never invested in unprofitable businesses, and he thinks something can very well be both ethical and profitable, though it is ‘challenging to get there’.

The follow-up question is why you would choose a company over an NGO. Aron thinks NGOs are a great way to achieve things, but a business can sometimes get more done. He gives the example of a company in the carbon-offsetting world that makes it ‘fun’ to support the climate.

You should know well whom you bring into your company, and whether they agree with your view of what the business should be about. This brings Aron to the question of whether we should bring more women into tech. Aron: “My answer is yes! We all have different voices, and more voices from different backgrounds, also geographical ones, will lead to more well-informed work.”

Then, scaling ethics: how can you make sure a company still does good when it grows as big as Google or Facebook? Google’s initial motto, after all, was ‘don’t be evil’. This is about ethics and scale. Aron doesn’t think Google is entirely ‘good’ or ‘evil’, as it does a lot of things that are not evil.

“Be critical on business models, as most internet businesses right now sell your data. How do we want to design these systems?” Aron says he doesn’t have the answers to these questions, but we should certainly think about it.

He closes his reply with the trias politica in networks, or rather the lack of it: the people who build the networks effectively rule them.

The second set of questions starts with a challenge, namely ‘how to be a worldwide business?’ Or, to narrow it down: ‘Is it unethical for Google to censor its search engine in China? What is ethical here, what is ethical now? What is ethical in the future?’

Social capital should not become financial capital

One of the questions is about tokenizing social value. Aron thinks that’s not the way to go; you might even end up paying for a date. But it also shows there’s a clear path people seem to like: monetizing everything.

When social capital becomes financial capital, people tend to become less creative. Aron gives an example: when someone draws something very beautiful on a piece of paper and is paid for it afterwards, the next drawing is going to be less creative.

The bad thing is that right now we have surveillance capitalism. What we create today will have downsides in the future that we don’t know about yet. Ultimately, an ethical venture is a continuous practice, concludes Aron.

Closing words

After a brief moment in which the audience can come up to the microphone again to share interesting books, documentaries or films, Alexander Rinnooy Kan comes up to the lectern for a final calibration of our ethical compass.

Image: Rutger van Zuidam via Flickr

Noting that we are responsible for an age drop of ‘at least thirty years’ compared to the regular audience in the room, Alexander tells us there is unfortunately not much space for reflection on thoughts and ideas here, even though the Senate is often called a ‘chambre de réflexion’, which makes us a welcome change of scenery.

“We are in a period of rapid unpredictable transition. The one concern we all have is whether or not the rate of change our democratic systems are able to accomplish can catch up with the rate of change technology confronts us with. We need help with that. Your help!”

“When we look back at the previous industrial revolution, it started to reach the Netherlands around 1850, when William II was in power. When there is a revolution, there is a big shift in power. All this change creates newly powerful and newly powerless people, which is exactly what happened in the late 19th century. The powerful are powerful because they could adapt quickly to the changes. What we should do is look at the powerful and make sure they are properly encouraged, and on the other hand properly protect the powerless. After that, both categories will organise themselves: the powerful become employers and the powerless form trade unions.”

After those heavy words, Alexander states that capitalism is changing once again, and that we should draw some lessons from the last time it did. On the one hand you had the enlightened innovators, on the other the visionary protectors. Both sides should try to find a long-term goal.

“That coalition has to happen today where you are the enlightened innovators”, says Alexander.

What can you do to become an enlightened innovator? Read books, converse, but also: step back occasionally from the ‘rush of innovation’ and ask whether a product is harmful, whether it violates laws, and how confident we can be that the power rests in the right hands.

“You have to answer those questions with your team, and I would like to suggest that the team you work with can gain both relevance and quality if you open it up to a larger diversity of people than is generally found. Technology is much too important to be left to technologists”, says Alexander. “A diversity of opinions can help us to accomplish more.”

He closes his talk with: “Ethics is a conversation, not about everything, but about uncomfortable questions. The more uncomfortable you feel thinking about them, the more necessary it is to think about them.”


Why are ethics relevant, and even important, for techies?

  • Humans are biased. Technology is man-made. Therefore technology is biased. Tech is not neutral. Technology, e.g. blockchain, is a cultural artefact. It reflects values and attitudes, e.g. towards concepts like “trust” or “identity”, that are not universal or neutral.
  • All humans are social cultural beings, even techies.
  • Technology changes power structures. There will be “newly powerful” and “newly powerless”, and new forms of inequality. (And this, by the way, is cause for government intervention, because — in Europe at least — we accept that the state has a responsibility to protect the powerless.)
  • Ethical behavior cannot be delegated to machines. It always begins with human choices. What are we optimizing for? What data do we use to train the AI? What are our biases?
  • Technological innovation will change (or reinforce) power structures and other social relations, both intended and unintended. This gives developers a responsibility to at least think this through.
  • In early stages of new technologies, the ethical responsibility lies almost exclusively with the development teams, because they are the only ones who understand the tech. In later stages, users can rely on laws and institutions and be “naive” again.

What are ethics?

  • Ethics are not about black or white, good or evil. They are not absolute. Ethics are “in the eye of the beholder”: cultural and contextual.
  • Questions, not answers, are important. Ethics are a conversation. It is about asking the uncomfortable questions, like:
      • what are we optimizing for?
      • who defines success?
      • what are our assumptions and what cultural biases do they contain?
  • Ethics are not an add-on, they should be at the core. A venture is about creating something that you want to see in the world, at scale. So you better make sure you really want to see it, and its consequences.
  • Ethics are everywhere; all decisions have ethical dimensions. The way one defines a database (m/f, m/f/n, m/f/other) defines how people can see themselves. Again, it is not about good or bad, it is just important to be aware of this.
  • Because ethics are cultural, more diverse teams have more interesting conversations. Bringing people from diverse backgrounds into your team makes it a lot easier to spot biases, unproven assumptions and unintended consequences.

Inspiration / further reading and watching

  • Saifedean Ammous — The Bitcoin Standard (book)
  • Jaron Lanier — Who Owns the Future? (book)
  • Ideo — Listening for Unintended Consequences (Medium post)
  • William MacAskill — Doing Good Better (book)
  • Ruth Chang — How to Make Hard Choices (TED Talk)
  • James Glattfelder — Who Controls the World (TED Talk)
  • Elinor Ostrom — Governing the Commons (book)
  • Murray Rothbard — What Has Government Done to Our Money? (book)
  • Tom Palmer — The Morality of Capitalism (book)
  • Stephen J. Dubner and Steven Levitt — Freakonomics (book, and also a podcast)
  • Friedrich Nietzsche — Beyond Good and Evil (book)
  • Kate Raworth — Doughnut Economics (book)
  • Benjamin Bratton — The Stack (book)
  • Yann Arthus-Bertrand — Home (documentary)
  • Ray Anderson — The Business Logic of Sustainability (TED Talk)
  • Adam Curtis — The Century of the Self (documentary)


We have tried to capture the event to the best of our abilities. However, we too have our biases, and these are inevitably reflected in our interpretations and omissions. Luckily, there is also a video recording:
