Published in MICROGRAPH STORIES
Above: the 30-second festival trailer of IDFA 2020. Real Emotions. Real Film.

IDFA x Imaginarium of Tears

Every tear tells a different story. Real emotions. Real Film.

Starting this week, November 2nd 2020, IDFA’s new festival campaign commercial can be seen in theaters. It’s safe to say that most of us experience a wide range of emotions while watching IDFA films. Examining tears under the microscope thus reveals a kaleidoscope of feelings: a unique close-up of the diverse emotions we experience when watching different documentary films. In this way, the images of the IDFA 2020 festival campaign are, quite literally, taken from the real effects that films have on us.

This campaign, created together with creative agency We Are Pi, takes the purest expression of emotion as its starting point: our teardrops.

Intro:

In this article I’m going to take you through the (thinking) process of creating these artworks for the festival campaign: a beautiful collaboration between We Are Pi x IDFA x Imaginarium of Tears.

As you may know, I have been working with tears for years now, although this festival commercial has a different look and feel than you would normally expect from Imaginarium of Tears.

On the left you see an interpretation of the campaign with original project images created by Imaginarium of Tears; on the right, the IDFA campaign look and feel.

For a long time I have been experimenting with showing tears in different and unique ways. I have always relied on the dark-field technique for the creation of these artworks, chosen for its technical and optical advantages, its way of imaging, as well as for artistic reasons. Dark-field gives the best optical representation when a tear is imaged through a microscope at high magnification: details of the crystals pop out against a slightly dark blue background and show complex, beautiful, fractal-like shapes and patterns. This is especially true when the technique is combined with taking hundreds up to thousands of high-resolution images in a comprehensive grid that are later stitched together, resulting in one super-high-resolution image of a crystallised teardrop as we are familiar with from the project. Other techniques, such as oblique, cross-polarised or DIC illumination, cannot compete with this when combined with comprehensive grid imaging.

A simplified example of comprehensive grid imaging: 10 rows on the X axis and 10 columns on the Y axis, resulting in 100 images that need to be stitched together.
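The grid scan described above can be sketched in a few lines of code. This is a minimal illustration only; the 10 x 10 size matches the example, but the 0.5 mm step size is a made-up value, and real scanning software also has to allow some overlap between neighbouring frames for stitching.

```python
def grid_positions(cols, rows, step_mm, serpentine=True):
    """Generate X-Y stage positions for a comprehensive grid scan.

    A serpentine (boustrophedon) order reverses every other row,
    which minimises stage travel between consecutive shots.
    """
    positions = []
    for r in range(rows):
        cs = range(cols)
        if serpentine and r % 2 == 1:
            cs = reversed(range(cols))
        for c in cs:
            positions.append((c * step_mm, r * step_mm))
    return positions

# The simplified 10 x 10 example: 100 images to stitch together.
pts = grid_positions(10, 10, step_mm=0.5)
print(len(pts))  # 100
```

After each position is reached and the camera has fired, the stitching software reassembles the 100 frames back into one large image.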

However, this photography process cannot be translated directly into video, because the photography is done after the tear has crystallised. With video we want to capture the whole process, from liquid drop to its solid, crystallised form. It would not be possible to film individual zoomed-in sections of the grid without missing the data in all the other sections while the tear crystallises live under the microscope.

Therefore, we need to use a lower-magnification 4x objective instead of the 10x objective, and at the same time make smaller drops, so that the whole tear fits our 35mm cinematic view.
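As a back-of-the-envelope check, the horizontal field of view is roughly the camera’s sensor width divided by the total magnification, which is why dropping from 10x to 4x lets a whole (smaller) drop fit in the frame. The sensor width and relay magnification below are illustrative assumptions, not the production setup’s actual values.

```python
def field_of_view_mm(sensor_width_mm, objective_mag, relay_mag=1.0):
    """Approximate horizontal field of view: sensor width / total magnification.

    relay_mag stands in for any c-mount adapter or relay optics in the path.
    """
    return sensor_width_mm / (objective_mag * relay_mag)

# Hypothetical ~25 mm wide cinema sensor:
fov_4x = field_of_view_mm(25.0, 4)    # ~6.3 mm across with the 4x objective
fov_10x = field_of_view_mm(25.0, 10)  # ~2.5 mm across with the 10x objective
print(fov_4x, fov_10x)
```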

At this point my first custom-built microscope scanning stage became obsolete because of its original design: it moves in steps following a specific programmed grid, and was not built to create accurate, smooth and freely programmable cinematic movements. The software was designed by me specifically for that job, and the hardware implementation had already hit its limitations.

I needed to redesign the old scanning stage to enable both photography and cinematography, without losing the compatibility and functionality of my first, photography-focused build. Since the stage was last updated in 2016, many desirable upgrades could be implemented in this new "2.0" version, improving the photography system along the way.

Technical Pros & Cons: photography vs. cinematography:

The interchangeability of photography and cinematography methods and technologies was already briefly discussed above. Before we go into the details of redesigning the old scanning stage, I want to cover some pros and cons related to the technical process of crystallising tears under a microscope.

Photography generally only captures the final result, since the motion of the crystallisation does not show up in a photograph. It does, however, allow us to take multiple pictures at high magnifications once the tear is completely crystallised. That is how I create the artworks we know, made up of hundreds to thousands of images stitched together, showing the tear in its full glory with the highest detail possible. The dark-field technique gives us the freedom to create larger drops, zoom in further and keep a uniformly coloured, “stable” background without the shifts in light intensity or colour that would occur with, for example, the DIC technique.

Pro:
More detailed artworks, by using higher magnifications and stitching multiple shots together.

Con:
Due to the higher magnifications, stitching is needed to create a complete overview of the tear. Multiple shots are therefore required, with a uniform light and colour distribution across the specimen. Imaging with, for example, the DIC technique does not meet these requirements.

Cinematography is different: we can capture a continuous flow of frames and show the motion and crystallisation process of the tear without having to pick one specific “state”.

Pro:
Since we don’t need to stitch surrounding frames together, we are free to use all techniques provided by the Nikon TMD microscope, such as DIC, without time-consuming or (almost) impossible post-processing. We don’t need to worry about inconsistent light or colour distribution, and can capture the whole process from liquid drop to solid crystallisation.

Con:
Lower magnification, because everything has to be framed “all at once”, and smaller drop sizes, effectively resulting in less resolved detail in the tears displayed.

In conclusion, both techniques have their own use. We are using cinematography for this campaign, as it gives us the ability to use more than only the dark-field technique.

Plutchik’s Wheel of Emotions

Due to the “colouring” option the DIC technique gives, we had more creative freedom to implement our concept of showcasing different tears and emotions in unique cinematographic ways. Plutchik’s Wheel of Emotions served as a guide to match the tears and stories of our donors with the colours used in the festival campaign commercial.

Plutchik’s Wheel of Emotions

In the next chapter I will take you on the journey of how these steps were translated into a technical design and execution: building the new system to control the microscope for our cinematography.

Redesign of the custom microscopy stage

An automated, smooth, accurate and repeatable stage had to be built to create dynamic cinematic shots while working with tears. The changes consist of the following:

  • Redesign of the current X-Y scanning stage, including mounts, hardware and software, to get more accurate and smoother movements on the X and Y axes of the microscope table.
  • Adding a Z axis, including mount, hardware and software, to control the focus of the microscope and enable smooth focus ramping.
  • Adding a so-called A axis, including mount, hardware and software, to control the second Nomarski-modified Wollaston prism. In simple terms, this allows us to “add to and change the colour of the background and specimen” seen in the festival campaign commercial.
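The axes above can be summed up in a small overview; the dictionary below is just my paraphrase of the list, for reference while reading on, not an actual configuration file from the project.

```python
# Hypothetical overview of the redesigned stage's motorised axes
# (illustration only, paraphrasing the list above).
AXES = {
    "X": "stage movement, left-right (lead screw stepper)",
    "Y": "stage movement, front-back (lead screw stepper)",
    "Z": "micro focus knob, for focus ramping (stepper + timing belt)",
    "A": "second Nomarski-modified Wollaston prism, for the DIC colour shift",
}

for name, role in AXES.items():
    print(f"{name}: {role}")
```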

And so it was time to acquire hardware. Most of the parts needed are commonly used in robotics and 3D printing, and are easily accessible nowadays. This simplified the iteration and building process, which in turn saved time, since I only had two weeks to achieve all of this.

Disclaimer: in this post you may see some discrepancies in the designs, since I did not always have time to retake images when changes occurred. Most of these designs changed during building and testing, and even after the cinematic production of the commercial.

X-Y axis mount:
The first step was to redesign the current X-Y stage with the new lead screw stepper motors. Creating a new microscope stage mount was essential in this process. The goal was more accurate, smoother movements, with stricter tolerances than the previous design of the custom microscope stage.

Latest version of the X-Y stage mount, holding 2 lead screw motors and 2 end stops that limit the travel and “calibrate” its positioning to 0,0 at the start.

Z axis:
For the Z axis it was a bit simpler to implement a system to control the micro focus knob of the microscope. We only need a mount that holds a stepper motor with an idler pulley; a timing belt was attached and strung around the micro focus knob of the microscope.

Latest version of the Z axis mount, holding the stepper motor that controls the microscope’s focus knob.

A axis:
For the A axis I had to find a way to connect the turning knob of the second Nomarski-modified Wollaston prism of the DIC, located under the stage, to a stepper motor. This was done by printing a flexible shaft coupler and attaching them together. The stepper motor itself sits in a mount that attaches to the previously made support mount of the DIY climate chamber.

Latest version of the “A” DIC axis mount, holding the stepper motor that controls the shift of the second Nomarski-modified Wollaston prism.

Endstop:
To “calibrate” the microscope and achieve consistent movements, endstops are key in this system. They prevent the X, Y, Z and A axes from moving out of bounds.

Control:
To address all axes individually, I needed to connect the stepper motors and sensors to a controller board. The decision was made to use two Arduino Unos, each with a CNC Shield V3 and two motor drivers. Since I did not have time to write my own program, as I did with the scanning stage (to automate the imaging process), I had to get a bit more creative with my options. This combination of hardware is widely supported in the laser-cutting community and offers several choices of compatible software.

The first controller is used to connect to and interface with the X and Y stage/axes. This allows me to “simply” use the LightBurn software to control the stage manually, or to “draw” shapes and execute them at different speeds as the cinematic movements needed.
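Under the hood, LightBurn drives the CNC Shield’s GRBL firmware with G-code. As a sketch of what a single manual stage move looks like at that level, the helper below only builds the command strings; actually streaming them over the Arduino’s serial port is left out, and the coordinates and feed rate are arbitrary example values.

```python
def jog_gcode(x_mm, y_mm, feed_mm_min):
    """Build a G-code sequence for one absolute stage move (GRBL dialect).

    G90 selects absolute positioning; G1 is a linear move at a set feed
    rate, which is what makes the motion speed controllable for
    cinematic work.
    """
    return ["G90", f"G1 X{x_mm:.3f} Y{y_mm:.3f} F{feed_mm_min}"]

# Example: move the stage to X=5 mm, Y=0 mm at 100 mm/min.
print(jog_gcode(5, 0, 100))  # ['G90', 'G1 X5.000 Y0.000 F100']
```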

With some trial and error, calibration and practice, this would hopefully give me creative freedom while recording the crystallisation process. You could compare it with moving a stabilised dolly or robot arm through a scene.

Separately connecting to and interfacing with the second controller was needed to control the Z and A axes, since LightBurn did not offer an easy way to control Z or A independently while running a “scene” on the X and Y stage/axes. This was done with square waves “drawn” at different speeds, each representing a specific distance and direction for the stepper motors to turn. In this way I was able to pull off focus racking and/or changing the “colour spectrum” of the DIC.
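The square-wave trick can be sketched as follows: each “drawn” wave is just a polyline whose vertical excursions translate into a fixed travel (and direction) of whichever motor is mapped to that axis, while the drawing speed sets how fast the motor turns. The dimensions below are made-up illustration values.

```python
def square_wave(cycles, amplitude_mm, period_mm):
    """Vertices of a square wave to be "drawn" by the second controller.

    Each rise or fall moves the mapped motor a fixed distance in a
    fixed direction; the speed at which the shape is traced sets the
    motor speed.
    """
    pts = [(0.0, 0.0)]
    x = 0.0
    for i in range(cycles):
        y = amplitude_mm if i % 2 == 0 else -amplitude_mm
        pts.append((x, y))          # vertical rise or fall
        x += period_mm / 2
        pts.append((x, y))          # horizontal hold for half a period
    pts.append((x, 0.0))            # return to the baseline
    return pts

wave = square_wave(cycles=4, amplitude_mm=1.0, period_mm=2.0)
print(len(wave))  # 10 vertices
```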

Above you can see the first version of the setup at work while calibrating and experimenting with the hardware and software (climate chamber disassembled for easy access to all the different parts).

As you may understand, getting to this result was not a straight line, but rather a process of big and small iterations: 3D-printing parts, tweaking hardware, matching the software, and figuring out creative ways to get all these things working together in a short period of time. Once it was all good enough, I started the calibration.

Calibration & setup:
These tests were done on a microscope calibration slide, calibrating hardware and software to perfectly match real-life movements, and making sure we can make repeatable, accurate movements with good tolerances. The first step was homing/zeroing our starting point at the bottom right of the slide. The second was setting our work field to 75 mm by 25 mm (the size of a microscope slide). The third was making sure our step sizes would let us move accurately in steps of at least 0.01 mm on both axes. Once done, the workspace in the software should align with the stage workspace of the microscope slide.
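The step-size requirement comes down to simple arithmetic: a lead screw axis moves one screw lead per motor revolution, so the smallest possible move is the lead divided by the (micro)steps per revolution. The motor and screw values below are common hobby-hardware numbers used as assumptions, not the project’s measured ones.

```python
def steps_per_mm(full_steps_per_rev, microsteps, lead_mm):
    """Steps per millimetre for a lead screw axis."""
    return full_steps_per_rev * microsteps / lead_mm

# Assumed hardware: 1.8 deg stepper (200 steps/rev), 1/16 microstepping,
# 2 mm lead screw.
spm = steps_per_mm(200, 16, 2.0)
print(spm)        # 1600.0 steps per mm
print(1.0 / spm)  # smallest move: 0.000625 mm, well within the 0.01 mm target
```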

X-Y workspace in LightBurn, showing the microscope slide (75 mm x 25 mm) with a circle to be “drawn”.

With this simple setup we can achieve the following movements of the microscope stage. The first fast movement is the “white space travel”, as we move from our home position (bottom right) towards the drop in the middle of the slide. Once at the right coordinate, it starts to draw the circle above.

X-Y axis being controlled resulting in the table / slide moving.

The next step was to interface with the second controller through a second instance of LightBurn, to control the Z axis for focus ramping and the A axis for the DIC “colour shift” movements. For both it was important to find the maximum allowed travel, to make sure none of the microscope’s hardware would be damaged in the process.

For the Z axis, which controls the focus, this means we only allow it to ramp a few steps “up or down”, just enough to create a blur; the speed takes care of the smoothness of the transition.

For the DIC “colour shift” this was a bit more complicated. We needed to figure out how many micro steps (1700) it takes to go from the beginning of the DIC prism’s range to the end. With this value we could “soft limit” our movement in the software and start to figure out ranges of the colour spectrum.

Bias retardation from 10 to 1: an example showing the full-range colour shift of the DIC prism.

With the micro-step count of the full-range DIC colour spectrum known, we can narrow down the specific colour spectrums we want to use. For example, if we want to record a tear of sadness, Plutchik’s Wheel of Emotions points us to a green-to-blueish tone. Having found this tone spectrum, we can then shift back and forth between soft green and blue at different speeds, creating a unique cinematic effect.
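Given the 1700-micro-step full range measured above, narrowing down to a colour band amounts to mapping fractions of that range to step limits and oscillating between them. The band fractions for any particular colour (such as the green-to-blue of sadness) are assumptions here; in practice they have to be found by eye on the microscope.

```python
FULL_RANGE_STEPS = 1700  # measured travel of the DIC prism, start to end

def colour_band_to_steps(band_start, band_end):
    """Map a fraction (0.0-1.0) of the prism's colour range to step limits.

    The A axis can then oscillate between these two soft limits at
    different speeds to create the colour-shift effect.
    """
    if not 0.0 <= band_start < band_end <= 1.0:
        raise ValueError("band must lie within 0.0-1.0")
    return (round(band_start * FULL_RANGE_STEPS),
            round(band_end * FULL_RANGE_STEPS))

# Hypothetical green-to-blue band for a "sadness" tear:
print(colour_band_to_steps(0.55, 0.75))  # (935, 1275)
```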

Above you can see a small color shift with a low turning speed of the step motor driving the DIC prism.

With the setup “ready” we can now combine the most important techniques for our cinematography and test X-Y movement and the DIC together. Focus racking will only be used on rare occasions, and is therefore left out of this next test phase.

X — Y stage movement together with DIC prism full range color shift

Time to experiment

The positive outcome of these tests gave me the confidence to start learning how to use the new system in live situations. It was time to get some of the preserved tears out of the tear database and start experimenting, learning about the system’s possibilities and limitations. But the time to experiment and get the basics of this new system down was short: in two days the IDFA commercial would be recorded with it, and the unique tears and stories donated for this project needed to be visualised perfectly.

The process of the “Campaign tears”

To bring the campaign to life, hundreds of tears were collected with the Tear Collection Kits that were sent out by IDFA & WeArePi, asking people to capture their moments of real emotion. Each tear represents a unique human experience, ranging in emotion from joy to sorrow, pride to heartbreak.

All these tears and stories were carefully sorted by emotion and matched with a colour in Plutchik’s Wheel. Once sorted, it was time to follow Imaginarium of Tears’ main SOP for preparing the tears (the images below give a quick overview of these steps).

An overview of steps that are normally taken before tears are placed and recorded under the microscope.

Normally this process would be done in bulk, with all available tears turned into unique works of art. With cinematography, however, we can’t process them all at once, since we want to capture the full process from liquid drop to solid crystallisation. Therefore all tears were processed up to step four and stored at room temperature. This would now be our "IDFA tear database".

Steps five and six of the SOP are executed once a tear is ready to be recorded under the microscope. The first part of this process is matching the DIC “colour shift” with the colour we matched to the emotion of the tear using Plutchik’s Wheel. Once the microscope and its control system are set, we can start with steps five and six.

Once a slide with a teardrop is placed under the microscope, it is all about "the long waiting game" of waiting for the tear to crystallise. The crystallisation usually takes 5 to 30 minutes to start, depending on variables such as humidity, temperature and your unique physiology.

"Unfortunately" you can't time or estimate when the first "nucleation" of a crystallising tear will occur. It was therefore, naturally, a time-consuming process that demanded persistent focus.

At any moment, in a split second, the tear could start crystallising. Once it does, many things need to happen as quickly as possible: start the camera recording and the movement of the DIC "colour shift", and at the same time make an educated guess about the crystallisation speed and direction and set it in the software that controls the movements of the microscope stage, hoping the programmed direction and speed will match the tear's actual crystallisation.

All in all, a huge challenge to overcome every time a new tear is placed under the microscope to be recorded. But it was more than worth the time, effort, focus and dedication to get this result, and I can't wait to take on other challenges like this one in the future, especially when I look back at the collaboration and the end result shown in the introduction of this story.

CREDITS

Client: IDFA
Agency: We Are Pi
Music & Sound Design: Antfood
Colour Facility: De Grot

If you enjoyed reading this, please click “Recommend” below.
This will help to share the story with others.

Maurice Mikkers
Independent Photography Professional from The Hague (The Netherlands) http://www.mauricemikkers.nl #Micrographs #Science #Tech #Art #Creative #Concepts
