MAMI Lectures, Part 2

In our last installment, we explored two conjoined activities: generating with neural nets and investigating their inner states. The lectures featured below introduce further modes of understanding machine learning.

Rebecca Fiebrink of Goldsmiths treated us to some Mego-worthy industrial noise, controlled by her highly playable Wekinator software.

Hannah Davis walked us through her TransProse project, which translates literature into music through MI-based emotion mapping.

Michael Tyka presented an outline of art history, observing the accelerated nature of kitsch and the absorption of novelty by art viewers.

Chris Olah demystified neural nets so we could understand them as simple high-dimensional manipulations of geometry.
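To make that geometric framing concrete: a single layer of a neural net is just a linear map (rotate, scale, shear), a translation, and a pointwise nonlinearity that bends the space. A minimal sketch of that view (the names and numbers here are illustrative, not from the lecture):

```python
import numpy as np

def layer(x, W, b):
    """One neural-net layer, read geometrically:
    W @ x   -- linear map (rotate / scale / shear the space)
    + b     -- translation
    tanh    -- pointwise nonlinearity that bends the space
    """
    return np.tanh(W @ x + b)

# A 2-D point pushed through one layer: each step is plain geometry.
W = np.array([[1.0, -1.0],
              [0.5,  2.0]])
b = np.array([0.1, -0.2])
x = np.array([1.0, 0.5])
y = layer(x, W, b)
```

Stacking such layers composes these geometric moves, which is the sense in which deep nets are "simple high-dimensional manipulations of geometry."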

Artist Tivon Rice capped off the session with drone photogrammetry of urban architecture, paired with text generated from those images by neural-storyteller, trained on corpora of city-planning submissions and public responses.