Project ‘WildTrackAI’ Applies Computer Vision to Animal Footprint Classification

WildTrackAI is the Summer 2020 winner of the Hal R. Varian MIDS Capstone Award.

Berkeley I School
Aug 31, 2020 · 7 min read

Species are going extinct at nearly 10,000 times historical rates. Climate change, development, and illegal poaching are among the primary factors negatively impacting our natural resources, increasing human-wildlife conflict, and contributing to overall biodiversity loss.

Improving the monitoring of wildlife behavior can make us better equipped to protect endangered species, reduce biodiversity loss, and contribute to sustainable coexistence between humans and wildlife. With this goal in mind, MIDS students Jonathan D’Souza, Jacques Makutonin, Dan Price, and Michael Reiter created WildTrackAI, a tool that applies computer vision technology to enhance the speed and accuracy of animal footprint image classification.


Michael: In developing this project we partnered with WildTrack, a nonprofit organization dedicated to the protection of endangered species. WildTrack applies non-invasive wildlife monitoring techniques based on indigenous knowledge to track endangered species using footprints. Within this context, WildTrackAI applies cutting-edge computer vision to improve the speed and accuracy of footprint image classification, delivering an integrated end-to-end solution for footprint tracking.

Dan: WildTrackAI is an extension of work developed by the WildTrack organization. WildTrack had noteworthy success developing a Footprint Identification Technique (FIT) to identify individual animals from images of footprints with a high degree of accuracy, but the process is time-consuming and requires expert knowledge to implement. A connection was established between the I School and the WildTrack organization to see if deep learning could help enhance and accelerate their current capabilities as well as broaden their potential user base.

Jonathan: Two things drew us to this project: 1) The novelty of the task at hand — to our knowledge, no one had applied machine learning techniques to determine species and individuals from images of footprints; and 2) The mission — knowing that we were contributing to a global initiative to conserve wildlife and reduce biodiversity loss.

Footprint Identification

Michael: We started collaborating with WildTrack in a previous course (Data Science W251: Deep Learning in the Cloud and at the Edge), developing an initial proof of concept to test if computer vision technology could be effectively applied to classify animal footprints by species and individual. This collaboration began in March 2020 and took roughly 6 weeks to complete (mid-April). The proof of concept successfully simulated the capture and classification of an image on an edge device, which then was forwarded to cloud-based storage and displayed on a basic front-end interface.
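The proof-of-concept flow described above — classify a footprint image on an edge device, then forward the result toward cloud storage and a front end — can be sketched at a high level. Everything in this sketch is illustrative: the species labels, the logits, and the record format are assumptions, not WildTrackAI’s actual models or infrastructure, and the real system uses a trained convolutional network rather than this toy softmax head.

```python
import math

# Illustrative species labels only; the real system covers many more.
SPECIES = ["cheetah", "leopard", "lion"]

def softmax(logits):
    """Convert raw scores into probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify_footprint(logits):
    """Toy stand-in for a CNN's classification head: softmax + argmax."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return SPECIES[best], probs[best]

def build_record(image_id, logits):
    """Package a prediction the way an edge device might forward it
    to cloud storage for display on a front-end interface."""
    species, confidence = classify_footprint(logits)
    return {"image_id": image_id,
            "species": species,
            "confidence": round(confidence, 3)}

# Hypothetical logits for one footprint image from a model's final layer.
record = build_record("IMG_0001", [2.1, 0.3, -1.0])
```

In the actual pipeline, the logits would come from a deep network running on the edge device, and the resulting record would be uploaded rather than held in memory; the sketch only shows the shape of the hand-off between classification and storage.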

Front-End Display

Jonathan: When we picked up the project again in May (with Capstone), our initial focus was to outline a tentative target scope for an end-to-end product. As part of our initial research, we interviewed 10 potential users (primarily conservation biologists and wildlife trackers) in various parts of the world (from Israel to Namibia to the Brazilian Amazon). These conversations helped us home in on a prioritized set of use cases and user personas on which to focus our product concept. We worked quickly to establish an alpha minimum viable product (MVP), integrating a third-party image and data collection platform, cloud-based inference, and a revamped front-end display.

Once we had the MVP working as an alpha (around 5 weeks in), we shifted to a weekly sprint rhythm where we made incremental enhancements, incorporating new features and automation and attempting to deliver a production-ready product each week. The discipline of weekly deliverables, along with our check-ins, served as a way to regularly assess progress and course-correct, and helped us progress consistently throughout the term. We were able to make a beta site available to almost 40 users by week 9. This accelerated the feedback coming in and helped us further fine-tune the user experience based on what we were hearing.

Dan: There was a time during capstone, due to COVID-19 travel restrictions, when GitHub commits to our repo came from four continents. I’m in North America, Mike lives in South America, Jacques manages to roam various spots in Europe (from Luxembourg to Ukraine), and Jonathan was in Asia before he managed to get back to the States. Calendar coordination for meetings was a chore in itself. Technology and a liberal sleeping schedule helped lower the barriers. We lived between Slack, Google Drive, Zoom, and Asana to manage workflows and maintain communication. But it was perhaps more fascinating how we managed to self-organize in our roles and seamlessly blend responsibilities across our entire technology stack. Jonathan’s project management experience and acumen for software and model development were core. Jacques’ agility in moving between modeling and software helped us reach even our stretch goals. Mike’s ability to liaise with stakeholders on data modeling while simultaneously handling back-end development came in clutch. And my software background helped me navigate the front and back end.

Jonathan: The tight collaboration with our partners from WildTrack, Zoe Jewell and Sky Alibhai (based in North Carolina, so add a 5th timezone to the mix), was a unique dimension that was critical to the success of this project. We had weekly check-ins with them throughout the course of the project, and they had access to and were active on our Asana dev boards/task lists as well as on Google Drive.

Integrated and Automated Data Flow

Jacques: The project incorporated all aspects of the I School curriculum. In retrospect, the coursework for the degree felt like a crescendo of conditioning that culminated with Capstone. The highest credit goes to Data Science W251: Deep Learning in the Cloud and at the Edge. This course introduced us to the project and the WildTrack organization through a connection established by Darragh Hanley, one of the course instructors. We used valuable teachings from nearly all our classes: W201 on research design, W205 and W207 for data engineering and ML principles, W251 for practical applications of deep learning, and W209 for data visualization. Finally, Data Science W210: Capstone provided extremely valuable mentoring and allowed us to utilize and consolidate what we had learned on a concrete project.

Jacques: We are all looking forward to continuing to support WildTrack by transferring knowledge and assisting with future developments. We are currently gearing up to help WildTrack present our work to potential funding sources. On a personal level, each of us looks to generalize the framework/pipeline we developed with WildTrackAI to broader applications. Jonathan will be working on turtle tracking using imagery of shells. Mike will be exploring how to apply the framework to agriculture through pest and disease identification. Dan will be working on real-time edge detection of various forms of cryptic ground evidence. And I will be working on trail detection models deployable on drones, since the exploration and identification of footprints in the first place can be very time-consuming. Leveraging drone technology for this task will be a game-changer.

Michael: The specific problem that this project attempts to address is whether cutting-edge computer vision technology can enhance the speed and accuracy of animal classification based on footprint imagery. By improving the monitoring of wildlife behavior in this way, we are in a better position to protect endangered species, reduce biodiversity loss, and even reduce the risk of disease transmission to humans, which has become increasingly relevant in recent months.

Dan: In the course of developing the project, we identified several personas that we were seeking to serve. First and foremost, we wanted to improve the process for the team at WildTrack who administer the program. WildTrack needed a centralized and automated system to manage footprint images and AI models, as well as a mechanism for providing feedback to field-based users. Second, we wanted to create a tool that engaged the wildlife researchers who contribute images and data to the project from locations across the globe. Finally, we wanted to create tools to draw in support from novices, citizen scientists, and the general population through footprint classification on an edge device. Ultimately, we saw this as a tool that had the potential to serve the entire planet by maintaining biodiversity.

Dan: We owe a deep debt of gratitude to Zoe Jewell, Sky Alibhai, and the WildTrack organization for their tireless efforts in their mission to study and protect wildlife. Developing a partnership with an established organization for our course project and capstone provided a unique and practical learning experience. We hope that momentum continues for future partnerships between WildTrack and the I School as there is much work to be done in this domain. We also highly encourage establishing relationships with non-profit organizations whose missions would benefit from data science.


Voices from the UC Berkeley School of Information