Landsat 8, MATLAB, and the California Drought (Part I)

Dan Bridges
Published in Parallel Thinking
4 min read · Jan 28, 2018

Here in California we have experienced a significant drought in recent years. Thankfully, last year we finally received a decent amount of rainfall, but before that we had 5–6 years of below-average rain.

The main reservoir for the Santa Barbara area is Cachuma Lake, and it was severely affected by the drought. I was interested to see by just how much, and thought historical satellite imagery would be an ideal source of data.

Landsat 8 was launched in 2013 and provides a wealth of imaging data, with usable images available roughly once a month for a given area. First I looked up Santa Barbara in USGS’s EarthExplorer and downloaded all of the available Natural Color images for the area.

These images have a fairly large field of view, so my first task was to crop them to Cachuma Lake, which I did with ImageMagick:

mogrify -crop 800x800+3200+3660 *.jpg

I manually removed images with extreme cloud cover, and was left with 62 cropped images. Here is a sample:

One of the cropped original images from Landsat 8.

My goal was twofold: first, to measure the surface area of Cachuma Lake over time, and second, to compile the images into a time-lapse movie showing the effects of the drought.

To calculate the surface area of the lake we need to segment the image, and for this I chose to use MATLAB and the Image Processing Toolbox. First we convert to grayscale.

IP = rgb2gray(I);
Grayscale image.

In this example image it would be fairly easy to binarize it directly, but many of the other images were very low contrast, forcing me to perform some additional steps to properly extract the lake from the background land.

With the grayscale image, we then estimate the luminance of the background by morphologically closing the image with a structuring element large enough to fully occlude the lake. A morphological closing performs a dilation followed by an erosion, leaving us with an image representing the average intensity in each region, but with the lake removed.

IPBackground = imclose(IP, strel('disk', 25));
The extracted background levels created using a morphological closing.
The extracted background levels created using a morphological closing.

We subtract the original image from the generated background image to create an inverted image of the lake, with a dark background level. This step is necessary to generalize the algorithm due to the varying contrasts of the input images, and to easily extract the lake using binarization.

IPSubtracted = imadjust(imsubtract(IPBackground, IP));
Inverted and adjusted image after background subtraction.

We then binarize the image at an appropriate threshold. As a final touch, we remove any small components that remain, generally caused by cloud shadows creating additional dark spots in the original image (these appear as white patches in the lower-right corner of the image above), and we fill holes in the lake. I chose to use bwareaopen to fill the holes instead of imfill because I only wanted to fill very small holes: occasionally, when the water is sufficiently high, one of the bends at the end of the lake becomes an island, and we don’t want to fill that in.

BW = imbinarize(IPSubtracted, 0.85);
BW = bwareaopen(BW, 1200); % Remove small components outside of lake
BW = ~bwareaopen(~BW, 30); % Remove small holes in lake
The final binary image, extracting only the lake.

We now have a binary image on which we can run bwconncomp to segment it and extract measurements:

CC = bwconncomp(BW);
props = regionprops(CC, 'Area');
area = props.Area;
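This gives the area in pixels. To express it in physical units you would multiply by the ground area each pixel covers. The scale factor below is an assumption for illustration: Landsat 8’s multispectral bands are 30 m/pixel at full resolution, but the EarthExplorer natural-color JPEGs may be resampled, so the true factor should be checked against the image metadata.

```matlab
metersPerPixel = 30;                       % assumed scale; verify against metadata
areaKm2 = area * metersPerPixel^2 / 1e6;   % pixel count -> square kilometers
```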

We run this algorithm on all of our images, compute the areas and view the output. It’s clear the lake recedes dramatically from 2013 until late 2016. Then an impressive amount of rain in early 2017 filled the lake above its 2013 level. Unfortunately, it is now receding again due to our current low rainfall.
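Putting the steps together, the batch run can be sketched along these lines. This is a minimal sketch, not the original script: the folder name is an assumption, and it counts lake pixels directly with sum rather than via regionprops.

```matlab
files = dir('cropped/*.jpg');          % assumed folder of cropped images
areas = zeros(numel(files), 1);
for k = 1:numel(files)
    I = imread(fullfile(files(k).folder, files(k).name));
    IP = rgb2gray(I);
    IPBackground = imclose(IP, strel('disk', 25));
    IPSubtracted = imadjust(imsubtract(IPBackground, IP));
    BW = imbinarize(IPSubtracted, 0.85);
    BW = bwareaopen(BW, 1200);         % remove components outside the lake
    BW = ~bwareaopen(~BW, 30);         % fill small holes in the lake
    areas(k) = sum(BW(:));             % lake area in pixels
end
```

Sorting the files by acquisition date before the loop yields the time series of surface areas directly.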

In the next part we’ll look into creating a time-lapse movie directly from the original images to visualize the surface area change.

Dan Bridges
Parallel Thinking

Software developer at Beezwax Datatools and former researcher in Physics & Neuroscience.