# EXPERIMENTS IN ENTROPY:

Two chambers and their contents mixing, digitally, visually.

Lately, I’ve been trying to understand the rather difficult concept of “Entropy” (in both thermodynamics AND information theory). To try to apply what I thought I understood about entropy, I decided to do a simple exercise or experiment in a visual medium.

I decided that I would take an image (256 pixels by 256 pixels) that is half-white and half-black. I would then apply a “function” and keep applying that function for “n” number of steps/iterations.

The function I chose was essentially a “sprayed strokes” filter in Photoshop (under “Brush strokes”), but any function would work, so long as it “mixed” the two sides of the image together in some way.

The idea is to take a “seed image” and “operate” on it with a function that “modulates” it a little, and then keep applying the same function, so that you move further and further away from the original image.
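The seed-and-iterate loop can be sketched in a few lines of Python. This is a minimal sketch only: `mix_step` below is a hypothetical stand-in for the Photoshop "sprayed strokes" filter (random swaps of neighboring pixels), not a reproduction of it.

```python
import random

def seed_image(n=256):
    # half-white (255) and half-black (0), split down the middle
    return [[255 if x < n // 2 else 0 for x in range(n)] for _ in range(n)]

def mix_step(img, swaps=2000, rng=random):
    # hypothetical stand-in for the "sprayed strokes" filter: randomly
    # swap horizontally adjacent pixels, slowly mixing the two halves
    n = len(img)
    out = [row[:] for row in img]
    for _ in range(swaps):
        y, x = rng.randrange(n), rng.randrange(1, n)
        out[y][x], out[y][x - 1] = out[y][x - 1], out[y][x]
    return out

def iterate(seed, func, steps):
    # apply the same function over and over, drifting away from the seed
    frames = [seed]
    for _ in range(steps):
        frames.append(func(frames[-1]))
    return frames
```

Any `func` would do here, so long as it mixes the two sides; the point is the repeated application, not the particular filter.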

In any case, here is the result. I think it speaks for itself. I called it “Entropy 0.01” since this is an ongoing series of experiments in design.

To recapitulate a little, I began this series with a simple image: A 256 pixel by 256 pixel square that is half-white and half-black on the horizontal axis.

What I was imagining was a kind of thought experiment inspired by the famous experiment called Maxwell’s demon. Essentially, I was imagining two “chambers” if you will, one with white pixels, the other with black pixels, separated by an invisible wall. What I did, in the thought experiment, was “remove” the wall between them, and then the two sides would magically “mix” together.

You can see what I mean by seeing the second step in the series/experiment.

As you can see, the two sides begin to mix AT the location of the “invisible wall” which, in my imagination, separated the two “chambers”. Now, recall that this is a visual experiment in trying to understand the concept of “entropy”. What I was imagining was two chambers, one with a very high temperature, the other with a very low temperature. I imagined that when you opened the “door” between them, that the temperatures would “mix” if you will, or the atoms or molecules or whatnot, and that eventually one would reach a state of “thermal equilibrium”.
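That equilibration can be sketched as a toy numerical model (hypothetical numbers and rate, not a physical simulation): each chamber relaxes toward the mean temperature until the difference between them vanishes.

```python
def equilibrate(t_hot=100.0, t_cold=0.0, rate=0.1, steps=50):
    # toy model: once the "door" opens, each chamber moves toward the
    # mean temperature at a fixed rate, approaching thermal equilibrium
    hot, cold = t_hot, t_cold
    history = [(hot, cold)]
    for _ in range(steps):
        mean = (hot + cold) / 2.0
        hot += rate * (mean - hot)
        cold += rate * (mean - cold)
        history.append((hot, cold))
    return history
```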

Again, we see the two sides “mixing” further.

Ideally, what I was trying to show was the iterative ("repeated") application of a function to the original image, adding more and more "disorder" with each step. I began with something very "ordered" in that regard, i.e. a binary image with half of the pixels (on one side) white, the other half black. I would take this image and slowly modulate it to add more and more "disorder", which in practice amounts to adding "noise".

Through this iterative process, I couldn't help noticing that the file sizes grew with each iteration. At first, I thought file size could be a good proxy for the "entropy" of a given image, but it turns out that entropy is not measured this way. I had assumed that by adding "noise" to each subsequent image I was adding "randomness", and that a measure of this randomness would show up in the file sizes. What I really wanted was a measure of "image complexity", if you will. In any case, it's not that simple. It's genuinely non-trivial, and something I will be investigating, though I know very little about it at this moment. Take this as a footnote: I will be investigating image complexity / entropy and getting back to you with any findings.

In any case, back to our sequence of images. As you can see, the image begins to look more and more like some sort of "noise". Now, "noise" is generally characterized by "randomness", or the appearance of "patternlessness". Technically speaking, our image is becoming less and less "redundant", less and less predictable: "uncertainty" is increasing in the image.

One can see a certain "texture" emerging from the repeated application of the filter. I ended up calculating the so-called "Shannon entropy" (in bits) of each of the eight images. For what it's worth, I got these numbers: 2.585, 2.752, 2.849, 3.111, 3.330, 3.466, 3.478, and 3.470. The entropy grows with each iteration, though it dips slightly at the final step. File sizes, for the record, are (in bytes): 14977 B, 24930 B, 33596 B, 62284 B, 93050 B, 117996 B, 122052 B, and 123593 B.
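For reference, here is a minimal sketch of the standard histogram-based Shannon entropy computation. It won't reproduce the exact figures above, since those depend on the particular gray levels the filter introduced, but it is the same formula.

```python
import math
from collections import Counter

def shannon_entropy(pixels):
    # H = -sum(p * log2(p)) over the observed pixel-value histogram
    counts = Counter(pixels)
    total = len(pixels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

Note that a pure half-white, half-black image (two equally likely values) gives exactly 1 bit; the values above exceed 1 bit because the filter introduces intermediate gray levels.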

So that’s that. I will be returning to this subject. I just wanted to get a few of these ideas on paper.

[FINIS OPERA]

Post-Scriptum:
The idea here is to explore concepts, like the concept of entropy, through experiments in design, visual design in this case. The actual math and theoretical computer science is secondary. That is, this is an art-based investigation, through series, in this case a series of images made by the repeated application of what is called an iterative function.

I’m interested in exploring how these kinds of concepts (scientific, and in this case thermodynamic) TRANSLATE to aesthetic representations. The goal is not to be rigorously scientific. It is to explore the abstract spaces of these concepts and try to come to new and deeper understandings of them. The objective is to do it through the tools of a mature artistic practice.

If anything, I learned that the concept of ENTROPY, in both thermodynamics AND information theory, is very complex and non-trivial. The basic idea is that “the entropy of an isolated system never decreases”. To experiment visually and aesthetically with that idea, I tried to see whether I could make the “entropy” of an IMAGE increase through the repeated application of a FILTER, or image-processing function.

That is to say, I start with an image I call a seed image, apply a function to it iteratively, and carefully observe what happens. I have done this many times and learned something new each time. For instance, I recently ran the same procedure through over 400 iterations. What I noticed was that the file sizes would quickly increase, then eventually “stabilize” around a sort of maximum. It was then that I realized that if I plotted the effect of these successive “changes” on file size, I might “see” something in the graph that I could “relate” to some visual aspect of the images in question, to their visual modification over time via the iterative function.

I made a “tentative” plot of the growth in file size over 50+ iterations of a simple function I applied repeatedly to a binary image.

I now know that this is a very poor proxy for “entropy”, or even for “complexity”. I had hoped to measure the “growth” in complexity of an image over time, through the successive modifications made to it via filtering. I have experimented with both lossy AND lossless compression, with different image file formats, different image sizes, and with both binary (black-and-white) and colored images.
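One crude alternative worth noting: the size of the losslessly compressed raw pixel buffer (here via Python's zlib), which at least isn't confounded by a lossy encoder's quirks. Still only a rough proxy, but noisier data does compress worse, so the number grows with "disorder".

```python
import zlib

def compressed_size(pixels):
    # bytes needed to losslessly compress the raw pixel buffer (zlib);
    # a rough complexity proxy: less redundancy -> larger output
    return len(zlib.compress(bytes(pixels), 9))
```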

The idea of working in “series” or “sequences” like this is that it is a form of “evolutionary” art. That is to say, one starts with a “seed” image, then modifies it in an iterated fashion, creating “populations” of images that are judged or selected according to a “fitness function”. The main constraint for now is that the images that survive to the next generation all have a certain level of “interestingness”.

What happened here was interesting. Once the file size “peaked”, further modifications of the image, using the chosen operation/filter/function, seemed to cause the file size to “oscillate”, growing and then shrinking, back and forth.

This is mainly how I discovered that I WAS NOT measuring the “entropy” of the image, or that I poorly understood what entropy was, and how it related to complexity and randomness. In this case, I also poorly understood how JPEG compression works. It turns out that in certain cases, compression like this actually causes the file size to GROW instead of shrink. The near-total lack of regularity can have that effect: JPEG exploits spatial redundancy and smooth regions, and a patternless, noisy image offers almost none of either, so the encoder has little left to throw away.

In this experiment, I took a white image and began adding “noise” to it, iteratively, over the course of 96 “generations”. Here one gets a certain sense of “linearity”: the image slowly grows darker and darker, with more and more black pixels, a.k.a. “specks”.
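That procedure can be sketched as follows (the parameters — speck count per generation, image size — are hypothetical, chosen only to illustrate the shape of the loop):

```python
import random

def add_specks(img, count, rng):
    # flip `count` randomly chosen pixels to black (0)
    n = len(img)
    out = [row[:] for row in img]
    for _ in range(count):
        out[rng.randrange(n)][rng.randrange(n)] = 0
    return out

def generations(size=256, steps=96, count=100, seed=1):
    # start all-white and darken, generation by generation
    rng = random.Random(seed)
    frames = [[[255] * size for _ in range(size)]]
    for _ in range(steps):
        frames.append(add_specks(frames[-1], count, rng))
    return frames
```

Since pixels are only ever flipped toward black, the darkening is monotonic, which is where the sense of “linearity” comes from.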

Lastly, a simple plot of the values I gave earlier for the apparent growth in Shannon entropy of my sequence of 8 images, image by image, iteration by iteration.