California recently emerged from a historic, four-year drought, thrusting the state’s system of water supply monitoring into the spotlight. Where does California’s water originate? Mostly in the mountains, particularly the Sierra Nevada range. Snowpack accumulates in the winter, then melts into rivers and reservoirs; some of it, collected in the Owens Valley, is carried by aqueduct to Southern California. The state tracks the depth of this snowpack carefully via a network of sensors, supplemented by occasional manual measurements from snow scientists. Unfortunately, these sensors are sparse, leaving wide swaths of snowpack unmeasured.
Here’s what a sensor looks like:
They are essentially scales that weigh the snow resting on them. Because they weigh the snow, varying snow density doesn’t throw off the measurement, though the nominal ratio is about 11 inches of snow to 1 inch of water. Thus, these sensors don’t really measure inches of snowpack, but inches of water equivalent, with water being of constant density. The sensor locations are marked with the red pins below. The question-mark ellipses mark a couple of regions where sensors don’t exist, but where there is plenty of snow.
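To make the 11:1 figure concrete, converting a snow depth to its water equivalent is a one-liner. The ratio parameter here is the nominal value only; real snow densities vary widely with age and temperature:

```python
def swe_inches(snow_depth_in, snow_to_water=11.0):
    """Snow water equivalent implied by a snow depth, at a nominal
    snow-to-water ratio (11 in of snow per 1 in of water)."""
    return snow_depth_in / snow_to_water
```

So 22 inches of fresh snow at the nominal density corresponds to about 2 inches of water equivalent.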
Can we fill in the gaps between the sparse sensors and the infrequent human measurements? What if we used NEXRAD weather radar, which has coverage almost everywhere? With machine learning, we might infer snowfall amounts better than physical modeling does. This would be useful not only for water supply monitoring, but also for things like planning backcountry recreational trips.
A brief discussion on weather radar. The primary quantity a radar returns is reflectivity: the strength of the radio signal scattered back to the antenna. What weather phenomena reflect radar? Mostly clouds and precipitation. Weather radar thus tells us where there are clouds, and can also give us information on the type and intensity of precipitation (light rain vs. heavy), if any.
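To make reflectivity concrete: it is reported on a log scale (dBZ), and classic empirical Z-R relations convert it to a precipitation rate. A rough sketch using the Marshall-Palmer constants for rain (snow requires different and less settled relations, which is part of why a learned model is appealing):

```python
import numpy as np

def rain_rate_mm_per_hr(dbz, a=200.0, b=1.6):
    """Invert the Marshall-Palmer relation Z = a * R**b,
    where Z = 10**(dBZ / 10) is linear reflectivity."""
    z = 10.0 ** (np.asarray(dbz, dtype=float) / 10.0)
    return (z / a) ** (1.0 / b)
```

At 40 dBZ this gives a rate of roughly 11-12 mm/hr, i.e. heavy rain; light drizzle sits down around 10-20 dBZ.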
First off is getting the data. Two datasets are needed: ground snow measurements from the sensors, and weather radar data describing atmospheric conditions. The California Department of Water Resources makes its snow sensor data public. Here’s what one day looks like. To get many days, I scraped the site using Beautiful Soup. Plotted up, eight random snow sensors look like this:
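The scrape itself can be sketched as below. The endpoint and query parameter are placeholders rather than the exact CDEC URL, but the shape of the loop (one request per day, one parsed table per response) is the idea:

```python
from datetime import date, timedelta

# Hypothetical endpoint: the real CDEC page and its query parameters differ.
BASE_URL = "https://cdec.water.ca.gov/snow/current/snow/index.html"

def daily_dates(start, end):
    """Yield each date from start to end, inclusive."""
    d = start
    while d <= end:
        yield d
        d += timedelta(days=1)

def scrape_day(d):
    """Fetch one day's snow sensor table and return its rows as lists of strings."""
    import requests
    from bs4 import BeautifulSoup

    resp = requests.get(BASE_URL, params={"date": d.isoformat()})  # params are illustrative
    soup = BeautifulSoup(resp.text, "html.parser")
    rows = []
    for tr in soup.select("table tr")[1:]:  # skip the header row
        cells = [td.get_text(strip=True) for td in tr.find_all("td")]
        if cells:
            rows.append(cells)
    return rows
```

Being polite with a sleep between requests is worth it here; the site is a public service, not an API.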
There was a lot of accumulation last winter! And some great skiing…
For the radar, I relied on the NOAA Big Data Project. With Boto, I downloaded 288 radar scans from the KHNX radar station, covering 15 January through 8 February of this winter. The radar itself is located in the city of Hanford, in California’s Central Valley, about midway between Los Angeles and San Francisco. I selected it because it appeared to have the best coverage of the Sierra Nevada. Scans were taken at approximately 2-hour intervals, and the data totaled 3.3 GB on disk. Due to rate limiting and coding hiccups (watch out for that Python garbage collection), collection took about two days.
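The Level II archive lives in the public `noaa-nexrad-level2` S3 bucket, keyed by date and station, and is readable anonymously. A sketch of the download loop (I used an older Boto; this version uses boto3):

```python
from datetime import date

BUCKET = "noaa-nexrad-level2"  # public NOAA Big Data Project bucket

def day_prefix(station, d):
    """S3 key prefix for one station-day, e.g. '2017/01/15/KHNX/'."""
    return f"{d:%Y/%m/%d}/{station}/"

def download_day(station, d, out_dir="."):
    """Fetch every Level II scan for one station-day via anonymous S3 access."""
    import os
    import boto3
    from botocore import UNSIGNED
    from botocore.client import Config

    s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
    listing = s3.list_objects_v2(Bucket=BUCKET, Prefix=day_prefix(station, d))
    for obj in listing.get("Contents", []):
        key = obj["Key"]
        s3.download_file(BUCKET, key, os.path.join(out_dir, key.split("/")[-1]))
```

Looping `download_day("KHNX", d)` over the dates of interest reproduces the collection step.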
A single radar scan’s reflectivity looks like this (generated with Py-ART):
We can see a large storm over the Sierra Nevada, part of what amounted to a phenomenal ski season. As you can tell, the radar sweeps over a large area. In fact, the scan is volumetric; the antenna scans at a series of elevation angles in addition to azimuthal directions. All of these data are contained in the scan files, and you can extract values at a specified latitude, longitude, and altitude. Because we are estimating current conditions rather than forecasting, I used values interpolated over a 5 km horizontal square above each sensor, at five altitudes: 2,000, 3,000, 4,000, 5,000, and 6,000 m. I used the Py-ART module to extract these values, and let it run over multiple days in stages, totaling around 40 hours of processing time.
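The extraction can be sketched with Py-ART as follows. Averaging all radar gates that fall inside a horizontal box and altitude band is a simplification of true interpolation, and the field names ('reflectivity', etc.) follow Py-ART's NEXRAD conventions:

```python
import numpy as np

EARTH_R = 6371000.0  # mean Earth radius in metres

def horiz_dist_m(lat1, lon1, lat2, lon2):
    """Equirectangular distance approximation; fine at the ~5 km scales used here."""
    x = np.radians(lon2 - lon1) * np.cos(np.radians((lat1 + lat2) / 2.0))
    y = np.radians(lat2 - lat1)
    return EARTH_R * np.hypot(x, y)

def sample_field(scan_path, field, sensor_lat, sensor_lon, alt_m,
                 box_m=2500.0, alt_tol_m=500.0):
    """Mean of one radar field over gates near a sensor at one altitude.

    Assumes Py-ART is installed; a gate counts if it lies within `box_m`
    horizontally of the sensor and within `alt_tol_m` of the target altitude.
    """
    import pyart

    radar = pyart.io.read_nexrad_archive(scan_path)
    lat = radar.gate_latitude["data"]
    lon = radar.gate_longitude["data"]
    alt = radar.gate_altitude["data"]
    data = radar.fields[field]["data"]
    near = (horiz_dist_m(lat, lon, sensor_lat, sensor_lon) < box_m) & \
           (np.abs(alt - alt_m) < alt_tol_m)
    return float(data[near].mean()) if near.any() else np.nan
```

Calling `sample_field` once per scan, per field, per altitude, per sensor is what makes this step add up to tens of hours.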
What values are we looking at? Just four: differential phase, reflectivity, velocity, and cross-correlation ratio. I settled on these based on a Kaggle competition on predicting rainfall from radar, which parallels this project.
With the radar and snow data extracted, we have the task of merging the two datasets. For each combination of radar field and altitude (e.g. reflectivity at 2,000 m), we have a 1-D time series over the period of interest. Radar scans were fetched at approximately 2-hour intervals, so I compared the time of each radar reading to the times of the daily snow readings: any radar reading within the 24 hours before a snow reading (taken at 8 a.m.) was associated with that sensor reading. In effect, we break a multi-week time series into 24-hour chunks that line up with the daily sensor data.
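This chunking can be done with a timestamp shift in pandas: move every radar timestamp back 8 hours, floor to the day, and each radar reading lands on the 8 a.m. snow reading that closes its window. The frame and column names here are illustrative:

```python
import pandas as pd

def attach_daily_labels(radar_df, snow_df):
    """Label each radar reading with the 8 a.m. snow reading that follows it.

    radar_df: DatetimeIndex at ~2 h intervals, one column per (field, altitude).
    snow_df:  DatetimeIndex of daily 8 a.m. stamps, with a 'swe' column.
    """
    # Shifting back 8 h makes each "day" run from 8 a.m. to 8 a.m.; flooring
    # then maps every radar reading in that window to a single calendar day.
    day = (radar_df.index - pd.Timedelta(hours=8)).floor("D")
    out = radar_df.copy()
    # The snow reading that closes the window sits at day + 1 day + 8 h.
    out["swe"] = snow_df["swe"].reindex(day + pd.Timedelta(hours=32)).to_numpy()
    return out
```

Rows whose window has no matching snow reading come back as NaN and can be dropped before training.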
We feed this into XGBoost and get results. The first run made it clear that the dataset was very skewed: most snowfall readings clustered near a single value, so the model was “learning” to predict just that value to achieve a low error rate, while the extreme values, being few, were ignored. On the plus side, tree-based models tend to handle such imbalances well, though it can take some parameter tuning. I also changed the error metric to mean squared error (previously mean absolute error), which further amplifies larger disparities between predicted and expected values. After some parameter tuning (mostly via grid search), I arrived at a fairly accurate model. Below are the averaged results for all 31 sensors analyzed.
This was more accurate than I anticipated, working from radar alone: there is no temperature input here, and no correction for radar artifacts. A few things jump out, though: the predicted value is always below the expected value, and the error is greater near the middle of the domain. The per-sensor results suggest the model misses short bursts of snowfall but does well when accumulation is gradual. Still, we can predict snow reasonably accurately from radar and terrain information, and the model can be applied at any lat/long covered by radar, filling in some of the gaps left by the sensors!
Feel free to connect with me on LinkedIn.