Dog Runs and Flying Cars
Urban planners have long advocated dog runs for their health benefits. Allowing dogs to exercise in a pack setting reduces pet complaints from neighbors, and it gives otherwise anonymous city-dwellers a chance to socialize and build neighborhood bonds without risk or commitment.
More established dog runs sometimes hold an informal weekly ‘yappy hour’ with adult beverages and conversation. Some even have their own websites. For my dog Bubba, the best experience has been the off-leash area at Brooklyn’s Fort Greene Park after 9pm, where the local canine fashion trend is LED-colored collars and there’s plenty of space to run. The community is real. Everyone’s smiling. Even non-dog owners come to enjoy the vibe.
So I want more New Yorkers visiting dog runs. With or without dogs. They contribute to the city’s collective happiness, if used properly. How to find them? As a software engineer, I was surprised at the dearth of options. So I built Dog Runs of New York, an interactive map of New York City’s dog runs and off-leash areas. As with most worthwhile projects, it was no walk in the park (cheesy puns aside), but I learned tons that will definitely speed up my workflow on the next mapping project.
As a predecessor discovered, the dog run data provided by NYC Open Data was far too colloquial to be useful without data-shaping. For example, addresses were often described in terms of bordering streets, or as ‘behind the ball diamonds’. Helpful from inside the parks, but while navigating the grid they sound more like a Pepperidge Farm commercial than actual addresses. After a few rounds with Google’s Geocoding API, I decided to reshape the data in Mapbox Studio and render it with the Mapbox GL-JS library, after seeing it in action in their store locator tutorial.
GL-JS has excellent documentation and tons of useful examples, including a fun ‘flyTo’ method. The animation is very cool as far as maps go, like a short-distance flying car commute. And the feature set allowed me to scale quickly, with easy data-shaping and a community of user templates you can borrow and share based on use case, like satellite views and 3D buildings.
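For a feel of the API, a minimal `flyTo` call might look like the sketch below. The coordinates are approximate, and since a real `mapboxgl.Map` only runs in the browser, a stub object that records the call stands in here:

```javascript
// Stub that records flyTo calls; in the browser this would be a real
// mapboxgl.Map instance.
const map = { flyTo(options) { this.lastFlight = options; } };

// Fly the camera to (roughly) Fort Greene Park's off-leash area.
map.flyTo({
  center: [-73.976, 40.691], // [longitude, latitude]
  zoom: 16                   // street-level view
});

console.log(map.lastFlight.zoom); // 16
```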
Back to the data set and how to clean it up. Mapbox houses its data layer in GeoJSON, a standard file format for building anything geographic. At a very basic level, it’s a JSON-format file with a minimum required shape, and it informs the canvas how to visualize its data on a series of vector tiles. Vector graphics are preferable across resolutions because they are drawn from data and the functions that operate on it. In short, vectors know how to scale without losing resolution.
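That minimum shape is a FeatureCollection wrapping one or more Features, each with a geometry and a free-form properties bag. The coordinates and property names below are hypothetical examples, not the actual NYC Open Data values:

```javascript
// A minimal GeoJSON FeatureCollection, sketched as a plain JS object.
const dogRuns = {
  type: 'FeatureCollection',
  features: [
    {
      type: 'Feature',
      geometry: {
        type: 'Point',
        coordinates: [-73.976, 40.691] // [longitude, latitude]
      },
      properties: {
        // free-form metadata that rides along with each point
        name: 'Fort Greene Park Off-Leash Area',
        borough: 'Brooklyn'
      }
    }
  ]
};

console.log(dogRuns.features[0].properties.borough); // "Brooklyn"
```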
At this point, I noticed Studio provides the kind of workflow where marketing, product, and graphic-arts team members could easily edit store locations, map colors, or icons without breaking anything. Changes can be saved, staged, and published by multiple team members, almost like a git workflow. The style then connects to the live app through a custom CDN link in your project’s main HTML file.
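As a sketch of that connection: the published Studio style is referenced by its style URL when the map is created. The username and style ID below are placeholders, and a plain factory stands in for `new mapboxgl.Map` so the options are visible outside the browser:

```javascript
// Stand-in for `new mapboxgl.Map(options)`; it just echoes the options
// so the shape of the config is clear.
const createMap = (options) => ({ ...options });

const map = createMap({
  container: 'map', // id of the <div> the map renders into
  // Placeholder style URL; the real username/style ID come from Studio.
  style: 'mapbox://styles/your-username/your-style-id',
  center: [-73.98, 40.72], // rough NYC center
  zoom: 10
});

console.log(map.container); // "map"
```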
Back in index.js, regular DOM element selection methods wouldn’t work for adding popups at each location on the map canvas. But GL-JS provides the queryRenderedFeatures method for selecting elements on the canvas, which I used to generate a filtered array of just the dog-icon features. (Other features include roads, streets, tunnels, schools, etc. — recall each point is a ‘Feature’ type in our GeoJSON file.) Since it’s invoked as the callback on the Mapbox mousemove event, the array can only contain one dog icon or none. And because each result is a GeoJSON vector-tile element, it contains all the data needed to drive a jump, zoom, and detailed popup.
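The selection logic might be sketched like this. The `'dog-runs'` layer id and the handler body are assumptions about the setup, and a stub map returns canned features so the filtering behavior of `queryRenderedFeatures` is visible:

```javascript
// Stub: in the browser, queryRenderedFeatures returns every rendered
// feature under `point`, filtered to the layer ids in `options.layers`.
// This stand-in fakes one dog icon plus a road label.
const map = {
  queryRenderedFeatures(point, options) {
    const rendered = [
      { layer: { id: 'dog-runs' }, properties: { name: 'Fort Greene Park' } },
      { layer: { id: 'road-labels' }, properties: {} }
    ];
    return rendered.filter((f) => options.layers.includes(f.layer.id));
  }
};

// mousemove callback: yields either one dog-icon feature or null.
function onMouseMove(e) {
  const features = map.queryRenderedFeatures(e.point, { layers: ['dog-runs'] });
  return features.length ? features[0] : null;
}

const hit = onMouseMove({ point: { x: 10, y: 20 } });
console.log(hit.properties.name); // "Fort Greene Park"
```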
So far so good. But scrolling through 128 dog runs still seemed…a drag. Adding a UI menu for selecting dog runs by borough was a natural use case, so I styled the menu items using the color values of the various MTA subway lines and the Staten Island Ferry. I then built out a boroView object with ideal zooms and centering coordinates, plus flyToBoro and setBoroView functions. So we fly to the perfect zoom and location over a selected borough to view all its dog runs, then filter the map icons, and the sidebar location list, down to just that borough.
Then flyToBoro gets called in a ternary expression when we build the navigation panel.
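The borough navigation might be sketched as below. The zoom levels, center coordinates, and function bodies are reconstructions from the description rather than the project’s actual code, and a stub map stands in for `mapboxgl.Map`:

```javascript
// Rough per-borough camera presets (coordinates and zooms are guesses).
const boroView = {
  Brooklyn:  { center: [-73.95, 40.65], zoom: 12 },
  Manhattan: { center: [-73.97, 40.78], zoom: 12 },
  Queens:    { center: [-73.80, 40.72], zoom: 11 }
};
const cityView = { center: [-73.98, 40.72], zoom: 10 };

// Stub that records the last flight; in the browser this is mapboxgl.Map.
const map = { flyTo(options) { this.lastFlight = options; } };

function flyToBoro(boro) {
  const { center, zoom } = boroView[boro];
  map.flyTo({ center, zoom });
}

// Nav-panel click handler: a ternary flies to the selected borough,
// or resets to the citywide view when nothing is selected.
function handleBoroClick(boro) {
  boro ? flyToBoro(boro) : map.flyTo(cityView);
}

handleBoroClick('Brooklyn');
console.log(map.lastFlight.zoom); // 12
```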
Some notes on optimizing performance: Mapbox’s vector tiles allow for sharp rendering at high resolution and smooth animations, but you can easily break them by not triangulating the various zoom settings, which I of course did while planning my flight paths with glee.
So after some testing and browsing the issues page, I implemented these optimizations, which lessened the data viz overload and momentary blank tiles.
1) Removed expensive and redundant layers like highway/city labels, which were just visual noise to our user. Select the layer in Studio and click the trash icon.
2) Activated ‘zoom functions’ on the opacity of detail layers like street labels, so when zooming down from sky to street view, or jumping to a dog run in another borough, labels fade in and out in tandem at a logical rate.
3) Maxed out the new mapboxgl.Map buffer parameter to 256 after testing both 56 and 128; the bigger the buffer the better, I always say.
4) Then ensured the same mapboxgl.Map parameters included a maxZoom property (18) greater than the deepest set zoom level (16) and deep enough into the zoom function’s opacity range specified above (13–22).
So street names that fade in at higher zooms aren’t washed out or transparent when zooming in to explore a dog park view, and the maximum zoom for the entire map is set higher than the dog park view. (I’m still learning how to animate and optimize styles, and will report back in the comments with any new findings.)
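A zoom function on a label layer’s opacity might look like the sketch below, in the older `stops` syntax; the layer id and exact stop values are assumptions, not the project’s real style:

```javascript
// Opacity ramp tied to zoom level: fully transparent at zoom 13 and
// below, fading up to fully opaque by zoom 22.
const streetLabelPaint = {
  'text-opacity': {
    stops: [
      [13, 0], // [zoom, opacity]
      [22, 1]
    ]
  }
};

// In the live map this could be applied with setPaintProperty, e.g.
// map.setPaintProperty('street-labels', 'text-opacity',
//                      streetLabelPaint['text-opacity']);
console.log(streetLabelPaint['text-opacity'].stops[0][0]); // 13
```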
Async pre-loading of vector tiles during camera movement is a known issue, and it showed up in the project as occasional blank rectangular areas when moving from location to location. The buffer adjustment, combined with a slower flyTo speed and a higher curve setting, seems to mitigate most of it: the camera zooms up in more of a parabolic arc from the departure point, so the tiles around the destination have more time to load. It’s definitely a workaround hack; I’ll add any learning on this point as it comes.
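The tuning might look like this; the exact numbers are illustrative, not the project’s values, and a stub map again stands in for the browser instance:

```javascript
// Stub that records flyTo calls; in the browser this is mapboxgl.Map.
const map = { flyTo(options) { this.lastFlight = options; } };

// A slower speed and a higher curve arc the camera up higher and give
// the destination's tiles more time to load before the camera lands.
map.flyTo({
  center: [-73.80, 40.72], // destination (roughly Queens)
  zoom: 16,
  speed: 0.8, // slower than the library default
  curve: 1.8  // steeper arc than the library default
});

console.log(map.lastFlight.curve); // 1.8
```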
Plans include adding Google reviews and jumping to the closest dog run based on device location, and I’ve already been asked to add New Jersey. (Note to self: buy domains that scale.)