from the 19th of April to the 27th of July Fletch Graham and I worked on making one piece of art (almost) every single day. Fletch’s were 100 animated loops, mine were 100 days of toolbreaking. For those who don’t know, toolbreaking is pushing a platform or application in a way it was not intended to be used to test its boundaries and attempt to create something unexpected. Part of the reason toolbreaking is so important is that as more and more platforms and applications are built, and software engineers get better and better at defining what we can and can’t do with those platforms, there will be less and less room for experimentation—everything “experimental” you’ll be able to do will have already been thought of and accounted for by the architects of the platform.

Here I’ll recount what I did over those 100 days.

(If you followed the project from the beginning and just want to skip to my *~*~thoughtz~*~* on the project, jump to the bottom)

001, 002

001
002

the first two days were done with scripting and Illustrator, a technique I picked up from Keetra Dean Dixon (& JK Keller) when Keetra came and spoke about toolbreaking in a workshop at CCA in spring 2015. Since Illustrator has native scripting functionality, my requirement for these was that the script had to push Illustrator into a fatal crash, and then I could only post whatever was recovered when the file was reopened.

003

003

on day three I made a daily from a bar with the selfie-improving app FaceTune, as I would go on to do quite a few times over the next 100 days.

004

004

inspired by Fletch’s usage of Sculptris, I decided to mess around with it and found that inverting the flatten tool created some very interesting shapes. I then rendered it in Blender (again, Fletch’s process). Pretty sure this is the most work I put into any daily in the entire 100 days.

005

005

Using the then-new snapchat face-swap feature I put my face onto my own face 100 times expecting there to be substantial distortion over many generations. There was distortion (generally it turns the center of your face red and blurry) but not as much as I expected.

006

006

More of 004.

007

007

On day 7 I started to explore how you could work computer vision into some of the otherwise normal programs I use regularly. After a bit of research I found SikuliX, which “uses image recognition powered by OpenCV to identify and control GUI components.”

I gave the script relatively open-ended parameters (find a white square, mouseDown, drag to another white square, release & repeat). I also wrote in a couple lines to have it switch tools after several turns with each tool.
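
For the curious, here’s a rough sketch of what a script like that can look like (not the one I actually ran). SikuliX scripts are basically Python (Jython) with the image-matching functions available as globals, so findAll, dragDrop, wait, and type below come from SikuliX itself; “white_square.png” is a stand-in reference screenshot and the tool shortcuts are whatever keys your app of choice uses.

import random

tool_shortcuts = ["b", "p", "n"]          # stand-in keyboard shortcuts for switching tools
turns_per_tool = 5

while True:                               # stop it from the SikuliX IDE when you've had enough
    for _ in range(turns_per_tool):
        squares = list(findAll("white_square.png"))   # match every white square on screen
        if len(squares) < 2:
            break
        start, end = random.sample(squares, 2)
        dragDrop(start, end)              # mouseDown on one square, drag, release on the other
        wait(1)
    type(random.choice(tool_shortcuts))   # switch tools after several turns with each one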

007.1

008, 014, 016

008
014
016

For a long time I’ve been interested in how GUI elements and default colors/fonts affect our design choices. Spurred on by the New Aesthetic of back when, I designed several posters that were essentially just meticulously recreated vector screenshots. So for day 8 I decided to explore creating things entirely out of UI elements in InDesign—frames, lock icons, etc. I started making InDesign Quilts, as I took to calling them.

009

009

Day 9 combined the previous two days. I equipped my SikuliX drawing script with the same set of tools I used in the non-printing InDesign script and let it go to town. I could definitely write code that would produce more structured results like what I was making myself, but I didn’t!

010

010

So our rule for the 100 days was that every 10th day Fletch and I would collaborate, since we were doing these side by side. For our first collab we did what seemed most natural: datamoshing one of his loop animations.

011

011

On day 11 I started to explore the .raw filetype. I’ve messed with it before while doing photography projects but never in a methodical scientific way. So on 11 (and in later days) I started to really examine how the settings you open a .raw file with change the image. For a detailed explanation of how this works, jump to day 22/23.

012

012

In early April, Fletch and I worked on visuals for a dance performance, one of which included falling fabric set up in Blender. On my first pass through (since I’m a total noob with 3D programs) I applied the cloth modifier and a subdivision surface modifier in the wrong order, creating a weird broken object. I quickly fixed it at the time, so I decided to revisit it for this project.

013

013

My most liked daily. Probably for mocking Trump. I converted a photo into a vector brush and outlined type with it. I think the underlines are what made this so 👌

015

015

In an attempt to freshen their look, a lot of office applications are trying to incorporate flashy things into their design options. So 15’s process comes from Apple’s Numbers application. There’s a whole suite of color-coordinated rock textures that you can decorate things with. Also, you can write scripts to randomly fill in tables. Combining those two things, I created a painting out of rocks with a spreadsheet-generated graph.

017

017

A study related to the InDesign quilting but not an InDesign quilt in and of itself. This is a color study of what happens when you interlace the frames on different layers, using only the default color values assigned to layers. The shapes are layers 1 & 2, 2 & 3, 3 & 4, etc. After 32 layers InDesign reuses the first default color, so using the default layer colors there are actually 1,024 combinations if you’re only combining 2 of them. If you combine 3 (which I never really did) you get up to 32,768 combinations, and that’s just nonsense. I didn’t pursue this direction much through the remainder of the 100 days because each quilt was fairly labor intensive. I still really like this method and intend to return to it though.

018, 019

018
019

Two more mobile-created dailies. The first one was done after 6 margaritas while waiting for Fletch at a Mexican restaurant, the second from the house of a girl I was seeing.

020

020

Collab number 2. Since the .raw method I was messing with earlier in the project seemed to naturally lend itself toward the creation of textures, I created two different images for Fletch to use: the background made out of (jargon warning) an interleaved raw that was opened as interleaved and skewed, and the subject made out of an interleaved raw that was opened as uninterleaved.

021

021

A good friend of mine sent me a video of a tornado that was taken in Fairdale, Illinois in 2015. The town was all but destroyed; video of the aftermath shows wide swaths of empty ground littered with debris alongside mostly undisturbed houses. In an effort to better understand the video of the aftermath, I pulled up the town in google streetview to follow the person capturing the video as they moved through the rubble. A little while later I realized that the most recent imagery of the town available on streetview is from before any of this destruction occurred. We so often see streetview and google earth as an informational representation of the current moment in reality. Though there are timestamps attached, quite a few people still believe that streetview is updated on a monthly, weekly, or even shorter basis. “Walking” through a town in streetview that you know to be destroyed reignited a fascination with streetview and google earth that I hadn’t felt since 2013, and it would continue to show up through the rest of the project.

But this was before I knew how I wanted to use streetview and google earth in the whole project so instead I explored iPhone apps that purported to take a 3D scan of an object.

022, 023

022
023

Using what I’d learned about automating interaction with SikuliX earlier in the project, I decided to use it to start automating ideas I’d had long before but hadn’t had the patience to create. In 23 and 24 that was opening a .raw file and specifying an increasingly smaller and smaller width. Here’s my attempt at explaining how this works:

When you save a .raw file, you specify the width. However, you have the option to tell it how wide the file is when you reopen it. So if you save a 19x9 raw, it writes the data like this:

1111111111111111111
2222222222222222222
3333333333333333333
4444444444444444444
5555555555555555555
6666666666666666666
7777777777777777777
8888888888888888888
9999999999999999999

Which creates an image that looks like, for example, this:

a picture I took on vacation circa 2013 or so

Now, if you open the .raw file and tell it that it’s not 19x9, it’s actually 18x9, the data is then interpreted like this:

111111111111111111
122222222222222222
223333333333333333
333444444444444444
444455555555555555
555556666666666666
666666777777777777
777777788888888888
888888889999999999
(the remaining 9s are truncated)

You can see the dividing line extend through the data:

.111111111111111111
1.22222222222222222
22.3333333333333333
333.444444444444444
4444.55555555555555
55555.6666666666666
666666.777777777777
7777777.88888888888
88888888.9999999999
which marks where image data that would be on the right ends up on the left instead.

Leading to an image like this:

\

as you specify smaller and smaller widths the offset becomes greater:

\\\\\

skewing the image more and more.
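
If you want to play with this without babysitting an open-as-raw dialog, here’s a minimal sketch of the same trick in Python with numpy: write some pixel data at one width, then reinterpret the identical bytes at a narrower width. (This is an illustration of the idea, not the SikuliX script I actually ran.)

import numpy as np

# build the 19x9 example from above: nine rows of 19 identical digits
width, height = 19, 9
rows = np.repeat(np.arange(1, height + 1, dtype=np.uint8), width)
data = rows.tobytes()                                  # a headerless .raw file is just these bytes

# "reopen" the same bytes at a narrower width
new_width = 18
usable = (len(data) // new_width) * new_width          # drop the leftover bytes (the truncated 9s)
skewed = np.frombuffer(data[:usable], dtype=np.uint8).reshape(-1, new_width)
print(skewed)                                          # each row starts one value later: the diagonal skew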

024

024

More datamoshing, now with gabe the dog.

For a great explanation on what datamoshing is, how it works, and how to do it go here.

025

025

A sketch from some indesign quilting since on that day I was unable to make anything particularly compelling.

026

026

A selfie imported into illustrator, then live-traced and recolored using a random-color script.

027, 028, 029

027

The earlier streetview post by now had me considering the archival aspects of streetview and similar technologies. While writing a brief history of my life as catalogued by streetview imagery, I noticed that the 2011 imagery of my street had a leaf stuck to the camera, so I followed it all the way back to where the camera ran through some low hanging branches and caught the leaves, then made an animation of it.

028

The branch served as a reminder that every time streetview imagery is captured, it is actually a specific moment in time. There was a person at the wheel of a car actually driving through an area and experiencing a timeline of moments, and thus all streetview imagery is actually a timeline. So I wrote a script that would take a screenshot in streetview, step forward, take another screenshot, etc., then compile them into an animation. This lets us experience, at least somewhat, the moments the drivers captured as they actually appeared.
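
Here’s a minimal sketch of that sort of script (not the exact one I used), assuming the browser is focused on a Street View panorama where the up arrow steps you forward, with pyautogui handling the keypresses and screenshots and imageio assembling the gif:

import time
import numpy as np
import pyautogui
import imageio

frames = []
for _ in range(30):                                   # number of steps to record
    frames.append(np.array(pyautogui.screenshot()))   # capture the current view
    pyautogui.press('up')                             # advance to the next capture point
    time.sleep(2)                                     # give the imagery time to load

imageio.mimsave('drive.gif', frames, duration=0.2)    # compile the screenshots into an animation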

029

streetview has routinely been held up as an example of digital traveling: how we can see into areas we will likely never physically visit within our lifetime. To me this heightens it even more, as we can now get a temporal experience almost anywhere we want.

030

030

datamoshing is a really natural jumping-off point for Fletch and me to collaborate, and we started down that road for 30, but realized we should switch things up a bit. So instead of just using the datamosh outright, we took a frame out of it and used it as a displacement map for a plane in Blender.

031

031

More bar dailies. Probably (definitely) the one I put the least amount of work into.

032

032

While looking for a new exotic location to streetview-drive, I ended up dropping the view into a tunnel, where I noticed a visible banding effect created by the lights in the tunnels. This is because high-pressure sodium vapor lamps (the orange-colored lights often used as street lights) actually strobe really fast, so if you combine that with the fact that the streetview car is moving and using a rolling shutter, you get banding distortion. Once again, artifacts present in technologies that try to present themselves as contextless remind us that this was a time and place, captured with a technology specific to a point in time. This spurred on a further investigation of where artifacts can be found:

033, 034, 035, 036, 037

033

I started to look for areas where I knew I could find these traces. The first was the indicator to move forward or back, where you can also see traces of how google tries to hide the car from the image using a kind of content-aware fill.

034

Road led to sky, where you could see how individual timelines were pieced together based on the appearance and disappearance of grime on the lens.

035
036

in some places it’s easier to spot the ghost of the car.

037

and another in which I was not actually trying to catch any elements, but was rather focused on the clouds. In most cases, if your cursor is positioned high enough that it would be above buildings and trees, there will be no indicator it’s there. However, from what I can tell, this scan data thinks there is a distant plane in the sky, and that’s why the wall-selector cursor you can normally see on buildings and structures is visible far off in the clouds.

038

038

In an attempt to get away from streetview drives, I decided to repurpose my .raw skew script to explore the dither settings of .gifs coming out of photoshop instead. So here the script just saves increasingly smaller gifs of the same image with standard dither settings.
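
If you’d rather skip the SikuliX-drives-Photoshop setup, you can approximate the idea with Pillow: keep shrinking the image while quantizing it down to a tiny palette with Floyd–Steinberg dithering, so the dither pattern takes over. Not my original workflow, just the same effect; ‘source.jpg’ is a stand-in filename.

from PIL import Image

src = Image.open('source.jpg').convert('RGB')          # stand-in input image
for i, width in enumerate(range(1080, 40, -40)):       # progressively smaller widths
    scale = width / src.width
    small = src.resize((width, max(1, int(src.height * scale))))
    dithered = small.convert('P', palette=Image.ADAPTIVE, colors=8,
                             dither=Image.FLOYDSTEINBERG)
    dithered.save('dither_%02d.gif' % i)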

039

039

Bar daily! This one is actually a portrait of Kinsey Zaire.

040

040

More displacement maps. I was on an insane Dark Souls bender, so I decided to see how we could use elements from that in something.

041

041

Dark Souls selfie mobile app distortion!

042, 043, 044, 045

042

a while ago I had seen someone who had replaced all of the data of a .jpg with lorem ipsum in a text editor. I was curious how this would work with other text and other methods, so I set out to explore that for a couple of days. This first image is replacing the data of a white 1080 x 1080 .jpg with the entire script of Bee Movie (because memes). It made a pleasant pink/grey/white gradient I’m compelled to use in some future design.

042.jpg open in TextEdit, the .jpg header kept intact
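
For anyone who wants to try it without hand-scrolling through TextEdit, here’s a rough sketch of the same surgery in Python: keep the first chunk of the .jpg (the markers and tables live up front) and overwrite the rest with text. How much header to keep is a guess, and the filenames are stand-ins.

header_bytes = 1024                              # rough guess at how much of the file to leave intact

with open('white_1080.jpg', 'rb') as f:          # stand-in source image
    original = f.read()
with open('bee_movie_script.txt', 'rb') as f:    # stand-in text file
    text = f.read()

body_size = len(original) - header_bytes
body = text[:body_size].ljust(body_size, b' ')   # trim or pad the text to fit the original size
with open('042_sketch.jpg', 'wb') as f:
    f.write(original[:header_bytes] + body)

Open the result in an image viewer and you should get garbage along the lines of the gradient above.
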
043

043 used the same method, only this time with the script of Shrek (also because memes). An oddly fitting color, I might add.

Of note: if you invert the color of this jpeg you get something vaguely similar to the color of 042. I looked briefly into information as to how .jpgs are encoded and how you could actually type out a .jpg in a text editor, but there was little information available on the subject so I decided to table that until later.

044

while trying to type out a .jpg from scratch, I ended up with something completely broken and little time to spare for the day, so instead I just posted the #1 google image search result for .jpg: a .png.

045

045 involved opening a .raw file in Audacity, and then applying reverb, echo, and other noise effects to the data and re-opening it as an image. This specific pattern resulted from a sine wave decreasing in pitch over time with a slight echo applied.
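
A rough sketch of that last one, minus Audacity: synthesize a falling sine sweep with a quiet echo in numpy and dump the samples straight to a .raw you can open as a square grayscale image. The numbers here are made up for illustration, not the settings I used.

import numpy as np

side = 1080
t = np.linspace(0, 1, side * side)
phase = 2 * np.pi * (2000 * t - 750 * t ** 2)    # instantaneous pitch falls over the clip
sweep = np.sin(phase)

echo = np.zeros_like(sweep)
delay = 5000                                     # echo delay, in samples
echo[delay:] = 0.4 * sweep[:-delay]
signal = sweep + echo

pixels = ((signal / np.abs(signal).max() + 1) * 127.5).astype(np.uint8)
pixels.reshape(side, side).tofile('045_sketch.raw')   # open as 1080x1080, 8-bit, 1 channel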

046, 047, 048

046
047
048

For these three days I was camping with almost zero cell signal (thanks T Mobile). So I utilized my bar-daily method and posted them through Fletch’s phone. 047 is the only piece in the entire 100 days that is an entirely unedited photo.

049.1, 049.2

049.1
049.2

At this point I’d used the selfie app on images to distort their 3D mesh often enough that I decided I should try to utilize a different method available within the app. There’s a “detail” brush that basically emphasizes colors and sharpens edges at the same time. Starting with a completely white .jpg I applied the detail brush hundreds of times to the same spot in the image, where, after a while, it finally revealed some color that was present in the image, perhaps pulled from artifacts as the .jpg was saved repeatedly.

049.1 is just the effect of the detail brush, 049.2 includes the 3D mesh distortion because I just couldn’t get enough of that.

050

050

For the halfway mark I decided to repurpose my methods from day 4 onto one of Fletch’s sculpts.

051

051

In school I took a type design course and had always intended on revisiting it someday. So I decided to check out Glyphs. I challenged myself to make a typeface in 24 hours, which I managed to accomplish thanks to mekkablue’s Noodler plugin for it. Afterwards, I decided to experiment with making some scripts of my own. They weren’t pretty, but they weren’t meant to be.

052, 053

052
053

For 52 and 53 I got really into the exploration of image feedback, particularly as it relates to the moire pattern that exists when you photograph a screen. So I explored taking a photo, importing it, taking a photo of that again, and repeating.

054

054

On 54 I made a dumb joke that is (and even was at the time) already out of date because instagram updated their UI.

055

055

More photo feedback experiments.

056

056

On day 56 I accomplished the most broken tool of all: I broke my phone. This resulted in a period of downtime where I wasn’t able to post to instagram.

060

060

I actually borrowed Fletch’s phone for this one. Taking what I’d learned from the photo-feedback process, I decided to apply it to a loop that he made and see if it was as interesting as the photo versions (it’s not, lol). This “loop” lost him a lot of followers.

061

061

lol. messing with the stock photos that came with my selfie app.

062, 063, 064, 065

062

There were news stories around this time about how there might have been a kraken spotted on google earth (sadly it’s not a kraken, just Sail Rock off the coast of Deception Island). While looking around Antarctica I noticed that quite a bit of the area was made out of noticeably different satellite images. So I started to get really obsessed with finding these google maps collages.

063

They occur most often in areas toward the peripheries of the planet, where I imagine it’s harder to get satellite images.

064
065

066

066

While looking for more found collages I found this beautiful pattern near an atoll and couldn’t pass it up.

067

067

Another friday, another bar daily

068, 069

068

I went back to looking for found-collages because it was simply so captivating for me.

069

070

070

The map imagery even infiltrated our 7th collab.

071, 072

071

After looking at enough of these found collages I realized that the entirety of google earth is actually a collage; it’s just that in most places the seams are less visible, the tiles that make up the collage cover a larger area, and they disappear when you zoom out. So I wrote a simple script that made those tiles stay visible as you zoomed out further and further, allowing you to see more tiles across a wider expanse of the world.

072

073

073

Realizing that I was not doing 100 days of found google earth collages I decided to try to switch it up by returning to an earlier method: the snapchat faceswap.

074

074

074 is the best bar daily I made over the course of the entire thing. Admittedly in earlier ones I took a much less meticulous approach to them, kinda just going with what felt right. But by 74 I started to have a better grasp on how to push the limits of the app—what happens when you pull the 3D mesh from an area beyond the canvas, from a corner, etc.

075

075

For 075 I started trying to learn how to use the Unity engine. One of the tutorials on their site explores cellular automata. I found that really interesting, so I made a slightly modified version that better fit what I was looking for and fit instagram’s constraints.
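
The Unity version is C#, but the underlying idea fits in a few lines of Python. Here’s a minimal Conway-style automaton in numpy; not the tutorial’s exact rules, just the same family of system.

import numpy as np

def step(grid):
    # count each cell's eight neighbours by summing shifted copies of the grid
    neighbours = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0))
    # birth on exactly 3 neighbours, survival on 2
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(np.uint8)

grid = (np.random.random((64, 64)) < 0.3).astype(np.uint8)   # random starting state
for _ in range(100):
    grid = step(grid)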

076

076

Another phone daily, trying to continue my tactful usage of what I’d learned over time.

077

077

Fourth of July! Flags! Was it toolbreaking? Nope.

078

078

080

080

Juno had just reached Jupiter, so in the spirit of that, Fletch and I found a collage-like image of the photos it had taken (in the spirit of my maps obsession) and projected it onto a sphere.

083

083

It’s day 83 and Pokemon Go has been released! My apartment building is also a pokestop, so there’s plenty of pokemon to be found. Taking from my earlier experiments with photo feedback I decided to utilize the AR functionality of the app to create pokemon-filled photo feedbacks.

084

084

Mostly I use the script that shows more google earth tiles in mountainous areas around the north or south pole because that’s where the tiling is most obvious. For 84 I decided to seek out other locations that might result in interesting effects.

085, 086

085
086

As Pokemon Go was still a newish app, there were a couple of interesting glitches you could get out of it. At day 85, if you entered and exited the app while looking at a pokemon egg, the game would glitch and instead display what appeared to be a sprite of typography.

Also, by launching the app in airplane mode after server maintenance your avatar would be teleported to a watery void.

087, 088, 089

087

By this point it was becoming a lot harder to find new tools to break, so I began to scour the default apps within OS X. One that I found was Grapher, an application for creating 3D graphs. It’s a powerful tool when used for what it’s intended for, but the aspect I was most interested in was the styling elements available to the user. The user has control over material color and background color (both of which can be gradients!), but no control over the shaders used, resulting in that slightly flat, slightly shiny early 3D program look.

088

Another element of my Grapher usage was a SikuliX script that had my computer enter the equations itself by randomly selecting from equation elements. It would randomly pick between things like x, y, cos(), tan(), sin(), *, +, -, /, etc. Afterwards I would style the creations.
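
The equation-roulette part is easy to sketch in plain Python: randomly glue together variables, trig functions, and operators. The pools below are stand-ins, and the real script clicked the pieces into Grapher via SikuliX rather than printing a string.

import random

variables = ['x', 'y']
functions = ['sin', 'cos', 'tan']
operators = ['+', '-', '*', '/']

def random_term():
    # either a bare variable or a trig function wrapped around one
    var = random.choice(variables)
    if random.random() < 0.5:
        return '%s(%s)' % (random.choice(functions), var)
    return var

def random_equation(terms=4):
    parts = [random_term()]
    for _ in range(terms - 1):
        parts += [random.choice(operators), random_term()]
    return 'z = ' + ' '.join(parts)

print(random_equation())    # e.g. z = sin(x) + y * cos(y) - x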

089

090

090

A while ago I changed the default style of my terminal window for kicks, and ever since then I’ve thought of it a little more as an extension of artistic possibilities. Much like how Skrekstore sells “dockgems” that are just decorative icons for your dock, I thought about creating something that could just exist as an ever-moving decorative object to look at when bored at your computer. So I wrote a short Python script that decorates a terminal window with a mini “snowstorm”:

from random import randint
import time

# specify x as the width of your terminal window for best effect
# y/10 = number of seconds the script will run
def flurry(x=50, y=50):
    for i in range(0, int(x*y)):
        if i % x == 0:
            print('\n', end='')
            time.sleep(0.1)
        decider = randint(0, 7)
        if decider == 0:
            print("~", end='')
        elif decider == 1:
            print('*', end='')
        elif decider == 2:
            print('°', end='')
        elif decider == 3:
            print('.', end='')
        elif decider >= 4:
            print(' ', end='')
    print('\n')
    time.sleep(3)

flurry()

This requires Python 3, by the way; print doesn’t accept an end argument in older versions.

Fletch and I both managed to forget to collab, so that happened on 91.

091 (actual collab)

091

inspired by deep fried memes and the various .jpg deep friers that can be found on the internet, I decided to make a video deep frier. Fletch made a nice crisp sleek animation so naturally I had to toast it lightly.

092

092

The video deep frier uses the FFMPEG command line tools to convert .mp4s into .avis and back many, many times, until the result is pixelated and slightly datamoshy and the sound is blown out. Just all around good times.
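
The core of it is just a loop shelling out to ffmpeg. A stripped-down sketch looks something like this (not the exact frier, but the same bounce); ffmpeg’s defaults re-encode on every pass, which is where the toastiness accumulates, and ‘loop.mp4’ is a stand-in input.

import subprocess

clip = 'loop.mp4'                           # stand-in input clip
for generation in range(20):                # more passes, more toast
    subprocess.run(['ffmpeg', '-y', '-i', clip, 'fried.avi'], check=True)
    subprocess.run(['ffmpeg', '-y', '-i', 'fried.avi', 'fried.mp4'], check=True)
    clip = 'fried.mp4'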

093

093

I started seeing more and more posts on instagram and elsewhere of pictures that had been processed with an app called Prisma, and decided to investigate it. The primary draw of the app is that it applies painterly qualities to a photograph that mimic an artist/style from history. I noticed a “Roy” filter, obviously named for Lichtenstein, and decided to see what a Lichtenstein filter would do to an actual Lichtenstein.

094

094

After rediscovering my love of Pokemon because of Pokemon Go and totally not downloading an emulator to play old Gameboy Advance games, I got really interested in the file structure of .gba files and trying to learn more about how the data is encoded compared to other filetypes. Because of their ubiquity it is easy to explore and attempt to understand image and video filetypes, but it’s really interesting to probe something a little less common like this.

So I downloaded this app (I don’t really like the name because it seems like it’s trying to make a joke about CSA, so heads up if that’s something that might be triggering for you) and just pored through the sprite data. This image is from the sprite data of Pokemon Sapphire.
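
If you’d rather skip the viewer app, you can get a crude version of the same view in Python: treat every byte as two 4-bit pixels (GBA sprite graphics are generally 4 bits per pixel, low nibble first, as far as I can tell) and dump the whole file out as grayscale. This ignores the 8x8 tiling and the palettes, so it’s a rougher picture than the app gives you; the ROM path is a stand-in.

import numpy as np
from PIL import Image

with open('rom.gba', 'rb') as f:            # stand-in ROM path
    data = np.frombuffer(f.read(), dtype=np.uint8)

pixels = np.empty(data.size * 2, dtype=np.uint8)
pixels[0::2] = (data & 0x0F) * 17           # low nibble first, scaled 0-15 up to 0-255
pixels[1::2] = (data >> 4) * 17

width = 256                                 # arbitrary viewing width
rows = pixels.size // width
Image.fromarray(pixels[:rows * width].reshape(rows, width)).save('romscan.png')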

095

095

The final bar daily! Now with a purposefully limited color palette and generally more consideration.

096

096

Another sprite-rip. This time from WarioWare Inc.

097, 098, 099

097

For the last couple of dailies I realized that the script I had been using to find areas of tiling in google earth would still work in the 3D view of google earth, allowing for semi-realistic seemingly aerial photos to be taken of these satellite artifacts.

098
099

You can also flip back and forth between a GPU-intensive application and google earth (in browser) to de-load some of the tiles, ending up with broken landscapes like this.

100

On day 100 we combined elements from our more popular dailies for a send off. Fletch’s sculpted head and a tiled view of Mount Crosson. Done and done.

Reflections

Anybody who’s participated in something like will tell you, it’s hard to do something every day for 100 days (and the people who’ve done 365 day projects are laughing at us). Making something of quality is even harder. It’s a good meditation though, if only because keeping yourself moving by creating and sharing something regularly keeps one loose and, at least for me, more likely to explore stranger directions. The work I shared immediately after periods of time where I fell behind was always more stilted. Even now, it’s been two weeks since I finished the project and already I can feel myself returning to a more stiff way or working and a reticence to share what I see/what I am up to. Everybody likes to knock oversharing but there’s something we can learn from it: a looseness in the approach to what we make, how we display it, and how we talk about it.

Another thing that I noticed, which is rooted in the impetus for this project to begin with, is how hard it is to find ways to break rules in a digital environment. I frequently tried to find new ways to bend or break an existing digital tool, and as the project went on I found it harder and harder to find new tools to break. What this made me realize is just how few tools are available for creating things in the digital environment without resorting to straight-up code. For those who aren’t technologically literate enough to know a code language or two, the pool of tools to draw from to create imagery is frighteningly small. It gets bigger when you test its peripheries, but not by a lot. Not by enough. And that wiggle room is only going to decrease in size as software engineers tighten up shit they don’t intend to happen.

(Ī̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̄̄̄̄̄̄̄̄̄’Ṃ̣̣̣̣̣̣̣̣̣̣̣̣̣̣ Ļ̧̧̧̧̧̧̧̧̧̧̧̧̧̛̛̛̛̛̛̛̛̛̛̛Ō̱̱̱̱̱̱̱̱̱̱̱̱̱̱̄̄̄̄̄̄̄̄̄Ō̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̄̄̄̄̄̄̄̄̄K̨̨̨̨̨̨̨̨̨̨̨̨̨̨̨̛̛̛̛̛̛̛̛̛I̛̛̛̛̛̛̛̛̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱N̨̨̨̨̨̨̨̨̨̨̨̨̨̨̨̉̉̉̉̉̉̉G̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̉̉̉̉̉̉̉̉ Ạ̣̣̣̣̣̣̣̣̣̣̣̣̄̄̄̄̄̄̄̄̄̄̄̄̄T̨̨̨̨̨̨̨̨̨̨̨̨̨̨̨̉̉̉̉̉̉ Ỷ̨̨̨̨̨̨̨̨̨̨̨̨̨̨̈̈̈̈̈̈̈̈̈̈̈̈̈Ō̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̉̄̄̄̄̄̄̄Ū̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̄̄̄̄̄̄̄̄̄̄̄̄̄̄̄, M̧̧̧̧̧̧̧̧̧̧̧̧̧̉̉̉̉̉̉̉̉̉̉Ẻ̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱́́́́́́́́́́Ḍ̣̣̣̣̣̣̣̣̣̣̣̣̉̊̊̊̊̊̊̊̊̊I̛̛̛̛̛̛̛̛̛̛̛̛̛̛̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱Ū̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̉̈̈̈̈̈̈̈̈̈̈̈̈̈̈̉̉̉̉M̧̛̛̛̛̛̛̛̛̛̛̛̛̱̱̱̱̱̱̱̱̱̱̱̱̱̱̱̉̃̃̃̃)

and that’s great when the exploits being fixed could cause legitimate security issues and all, but what does that mean for the weird, unintentional consequences of fucking up? And what we can learn from fucking up? Pretty much every b̸o̸g̸u̸s̸ inspirational speech is about learning from fuck ups. Making from fuck ups. Experimenting from fuck ups. What if in the future fucking up just gives you an error message, kicks you back to a functioning workflow, and nothing comes of it for anyone involved? I worry that it’ll close gaps through which we can see a new result, one we wouldn’t even have known we wanted until seeing it in the distance with our own eyes.

That’d be fucked up.

--
