Teaching Google Earth Engine

Reflections on developing a workshop series

Erin Dawn Trochim
Geospatial Processing at Scale
4 min read · Oct 21, 2020


Overview of a Google Earth Engine workshop series

Good teaching benefits from awareness of both yourself and the people you are trying to teach. Learning new knowledge and skills is personally challenging. Teaching can be even harder because it involves other people. Their learning becomes the main target. You need to be able to adjust your approach and understand where and when pivoting is necessary.

All this is great. You know something. You get really excited about it. Other people want to learn it. You teach them. It goes awesome.

That’s the ideal.

Teachers are made, not born. I start by thinking in-depth about the messages I’m trying to convey. Then I imagine the end of the session. People will want different things than me. What should they take home? And what are they likely to ask?

Translating this to a subject like Google Earth Engine (GEE) is extra tricky. GEE spans a variety of realms: the technology, the underlying science, and the societal considerations of how the information gets used. The key is to identify your students’ interests and familiarity in these areas. And then make it engaging.

With all this in mind, I decided to host a three-part GEE workshop series. Rather than subjecting others to my same two-year journey, this was my opportunity to make it **better**.

I chose to frame the series around generating strategic insights. First was how to formulate problems in a big data ecosystem. The main message was that it was no longer business as usual; GEE had played a big role in democratizing access. Next was how to apply functions and simple classifications. Our focus was picking the right tool for the job, and understanding how different types of data could make your job easier or harder. Finally came visualizations. Using form and function, the emphasis was on our audience being able to explain the results rather than us doing it for them.

The format was an hour of lecture followed by a two-hour lab with time for questions. This was a familiar format for my participants, who were primarily graduate students, researchers, and professionals. A few people requested more lab time and examples, but overall everyone was satisfied. Sessions were offered both in person and remotely over Zoom until COVID-19 meant that the last session was remote only.

Participants attended from a range of organizations. As I’m based in Alaska and the series was supported by the Alaska Climate Adaptation Science Center, our participants included those with a climate and/or Arctic focus from universities, state agencies (Alaska Fish and Game, Alaska Division of Geological & Geophysical Surveys), and federal agencies (Department of Defense, USGS, US Fish & Wildlife Service, National Park Service, and Bureau of Land Management). The only requirement was a basic knowledge of geospatial principles, like knowing how to use a map. Each workshop built on the last, and the latter two spent more time going over code. But if you really didn’t want to code, you could follow the concepts and still understand the workflows.

The lectures were designed to take complex material and reduce it down to key tractable concepts. I termed this “Things you really need to know”, and there were about four or five things every time. It was important to provide a conceptual framework for why these were highlighted, and then follow it up with a research example from either myself or the graduate students I’ve worked with. This helped illustrate the speed bumps in translating theory to real applications.

Then the fun would start in the lab. The focus was on building the real skills that make up projects, starting with knowing how to find data and then look at it. The GEE data catalog has code examples for every dataset; I’d pointed out how useful this would be at the user summit in 2018, and they magically appeared a short while later.

The first lab focused on filtering and reducing, culminating in calculating monthly fire areas in Alaska using the simplest app I could design. This was strategic, as I went over how to build that app in the final visualization session. The second lab covered how to apply functions and simple classification like thresholding. Rather than leaping into the deep end of image processing and classification, we started with basic examples, which are a key foundation for mastering complex techniques. The trick to all of this is to understand how the data structure interfaces with the process, and how that structure is relatively consistent between datasets, which makes it easier to modularize and reuse processes. Sketches of both lab patterns follow below.
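To give a flavor of the first lab’s filter-and-reduce pattern, here is a minimal sketch in the Earth Engine Python API. The dataset choices (MODIS MCD64A1 burned area, TIGER state boundaries), the year, and the variable names are my illustrative assumptions, not the exact workshop code.

```python
import ee

ee.Initialize()

# Alaska boundary from the TIGER states table (a convenient stand-in boundary).
alaska = (ee.FeatureCollection('TIGER/2018/States')
          .filter(ee.Filter.eq('NAME', 'Alaska'))
          .geometry())

# MODIS monthly burned-area product; pixels with BurnDate > 0 burned that month.
burned = (ee.ImageCollection('MODIS/006/MCD64A1')
          .filterDate('2019-01-01', '2020-01-01')
          .select('BurnDate'))

def monthly_burned_area(image):
    # Reduce: sum pixel areas over Alaska where the burn mask is set.
    stats = (ee.Image.pixelArea()
             .updateMask(image.gt(0))
             .reduceRegion(reducer=ee.Reducer.sum(),
                           geometry=alaska,
                           scale=500,
                           maxPixels=1e13))
    return ee.Feature(None, {
        'month': image.date().format('YYYY-MM'),
        'burned_km2': ee.Number(stats.get('area')).divide(1e6),
    })

# Map the per-image reduction over the collection: one feature per month.
fire_areas = ee.FeatureCollection(burned.map(monthly_burned_area))
print(fire_areas.getInfo())
```

Because MCD64A1 is already a monthly product, each image in the collection maps cleanly onto one row of the result, which is exactly the kind of data-structure-meets-process fit the lab emphasized.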
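And for the second lab’s pattern of mapping a function over a collection and then applying a simple threshold classification, a similarly hedged sketch; the Sentinel-2 dataset, the NDVI band pair, the point location, and the 0.4 cutoff are all illustrative assumptions.

```python
import ee

ee.Initialize()

# Sentinel-2 surface reflectance over one summer near Fairbanks
# (dataset, dates, and location chosen for illustration).
point = ee.Geometry.Point(-147.72, 64.84)
collection = (ee.ImageCollection('COPERNICUS/S2_SR')
              .filterDate('2020-06-01', '2020-09-01')
              .filterBounds(point))

def add_ndvi(image):
    # The same function applies to every image because the band
    # structure (B8 = NIR, B4 = red) is consistent across the collection.
    ndvi = image.normalizedDifference(['B8', 'B4']).rename('NDVI')
    return image.addBands(ndvi)

with_ndvi = collection.map(add_ndvi)

# Simple threshold classification: NDVI above 0.4 flagged as vegetated
# (the 0.4 cutoff is an assumed value for demonstration).
vegetated = with_ndvi.median().select('NDVI').gt(0.4)
```

The design point is that the mapped function never changes as the collection grows, which is what makes these building blocks easy to modularize and reuse.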

Overall, it was a super rewarding experience for me to offer the workshop series. Everyone was excited. The people who were more familiar with coding picked it up faster. I’m sure some will be happy to keep their interactions with GEE at knowing how to look up data and asking for apps to be built for them. And that’s ok. The participants could see how to start rethinking their problems out of the old paradigms and into the new ecosystem. And since that was really my main learning objective, it made the whole thing awesome.

My big reflection was how critical it is not to make complicated things more complicated before learners can really understand why the added complexity might be useful. Especially in remote sensing, there is an established pattern for teaching the material. Re-examining this pedagogy is important for considering how and where diving deeper is useful. Developing synthesis skills for readily exploring problems creates more opportunities for digesting abstract concepts.

Big data and cloud computing need to be accessible. This means a large community generating and using information, where teaching is a critical piece of knowledge sharing. And learning makes you excited about what you are going to solve.
