Men who made machines who want what they decide — Feels like Summer — An AI Ethics Perspective

Ben Gilburt
3 min read · Oct 11, 2018


“Men who made machines who want what they decide” — Childish Gambino, Feels like Summer.

Childish Gambino’s latest song reflects on the endless changes happening in our world, which always seem to be heading in the same direction: hotter, faster, more, exhausting our resources and even killing the bees we depend on, all while we try in vain to slow down and wish the same for our children.

Feels like Summer — Childish Gambino / RCA Records

One line in the song stood out to me: “Men who made machines who want what they decide.”

I see this line in two ways. In the first, the word ‘men’ is literal: it is men, not women, who decide what the AI wants. In the second, ‘men’ means people, or humanity, which turns the line into a commentary on AI alignment.

Men

The line is not that machines want what men want, but what men decide. This is a strong reflection of what we see today. The developers and others working on AI and value alignment are predominantly men. I fall into that category. As few as 24% of technical staff at Google are women, and despite efforts to make the workforce more diverse, progress is slow. Some of us are mindful of the issue this presents: we are not a representative slice of society, so our personal values are not fully representative, and an AI capable of generating dramatically more value and change than we could alone would entrench that inequality.

A solution may be to make the AI community more diverse, not just in gender but across all areas of society, but achieving this has significant latency. We will need education in certain areas (I appreciate that the education requirement itself creates exclusions!), that education often takes many years, and the societal change needed to make it happen would likely take decades. A hard take-off for strong AI could happen in a fraction of that time.

The best solution at hand, then, is for the existing community to do its best to ascertain what the convergent values of society are. Critically, men will still be the gatekeepers, deciding what those convergent values are.

Humanity

The second interpretation, that we’re referencing humankind, not men, is just as interesting.

The line is not ‘men who made machines to want what they want,’ but machines that want what they decide. Under the first interpretation we may merely be trying to build more broadly representative values into AI; under the second, humans have collectively made an AI want something different from what they themselves want.

This may be a positive thing. Our values today may in some ways be better than those of the past. We no longer burn innocent people branded as witches, and society is becoming less violent, and these changes show no sign of stopping. It’s entirely possible that ‘better’ is subjective, that there is no universally right or wrong thing to do, and that our values are just a reflection of our society and its needs at any given time. Making an AI do what we decide, rather than what we would do ourselves, helps to future-proof it, at least for some time. We may be able to forecast what our values will be at a particular point in the future and make the AI want that.

Thanks, Gambino. For making us think, again.
