Understanding Our Users Through Analytics

Taylor McGinnis
Published in NYC Planning Tech
Jul 25, 2018 · 6 min read

Last week, shortly after Planning Labs’ first birthday, I was tasked with combing through the Google Analytics attached to five of our applications: ZoLa, Metro Region Explorer, NYC Street Map, Population FactFinder, and Community Profiles.

Comparing these apps would prove more challenging than I had initially anticipated. While each app has a specific goal, quantifying these goals with data requires much more than just comparing numbers. Before diving into the figures and charts that Google Analytics offers, I made a list of each app’s purpose and intended audience. Losing sight of each product’s distinct objectives could lead to misunderstanding which analytics data are useful and which are better left alone.

[Table: Audience and purpose of each Labs app]

I broke up the analysis into three main parts:

  1. An overall breakdown of our users over the past several months
  2. How users were able to find the apps (acquisition sources)
  3. How users were navigating through the interface (bounce rates, exit rates, and events)

Users

The apps had a rough overall range of 50 to 1,200 daily users and 700 to 16,900 monthly users. ZoLa (the most well-known and intricate Labs app) had the most monthly and daily users, at 16,857 on average per month and 1,122 on average per day 🎉. In recent months, all apps had shown a decline in monthly users. Maintaining a user base for these apps can depend on the app’s purpose and audience, how we publicize it, the ease with which a user can find it (acquisition), and the level of user satisfaction (behavior).

[Chart: Users per Month for each app]
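For context, here’s a minimal sketch of how figures like these can be pulled programmatically rather than read off the dashboard, using the Google Analytics Reporting API v4. The view ID, credential file, and date range below are placeholders, not our actual configuration:

```python
# Minimal sketch: pull monthly users for one app from the
# Google Analytics Reporting API v4. VIEW_ID and credentials.json
# are placeholders standing in for a real view and service account.
from google.oauth2 import service_account
from googleapiclient.discovery import build

VIEW_ID = "XXXXXXXX"  # hypothetical view ID for one of the apps

creds = service_account.Credentials.from_service_account_file(
    "credentials.json",
    scopes=["https://www.googleapis.com/auth/analytics.readonly"],
)
analytics = build("analyticsreporting", "v4", credentials=creds)

response = analytics.reports().batchGet(
    body={
        "reportRequests": [{
            "viewId": VIEW_ID,
            "dateRanges": [{"startDate": "2018-01-01", "endDate": "2018-06-30"}],
            "metrics": [{"expression": "ga:users"}],
            "dimensions": [{"name": "ga:yearMonth"}],
        }]
    }
).execute()

for row in response["reports"][0]["data"].get("rows", []):
    month = row["dimensions"][0]            # e.g. "201801"
    users = row["metrics"][0]["values"][0]  # users that month
    print(month, users)
```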

Each app had a similar split between new and returning visitors (around 70–80% new and 20–30% returning). Whether we want to see a higher percentage of new versus returning visitors depends again on the purpose and audience of the app. ZoLa, for example, is often used in the field to understand zoning for planning purposes. In this case, we want ZoLa to show a higher share of returning visitors, as this means that users are satisfied and using the app often on the job. Metro Region Explorer is more of a story map than a tool, so we expect a higher share of new visitors, as users are less likely to visit multiple times after they have read through its information.

Acquisition

Each app had its own combination of acquisition sources, with referral and direct visitation as the predominant routes. Nearly 90% of Metro Region Explorer’s referral traffic came from news websites, while the main referral sources for the four other apps were city government websites. Metro Region Explorer and NYC Street Map had direct visitation as their most popular acquisition route, implying that users already knew about them from other sources. Over 90% of search keywords were listed as “not provided” in Google Analytics, which meant we were missing useful data about our visitors’ intentions when they began their search.

[Chart: Acquisition Source]

Because there’s a lot of missing information in this process, we may never know where a user learned about the app before visiting it directly or searching for it online. But we can make assumptions about what types of users accessed which acquisition routes. For example, a Department of City Planning employee is more likely to visit the app directly or to click on a link through a government website, whereas a user who reached the app through a news website (e.g. technical.ly or nytimes) is more likely to be a member of the public.
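As a rough illustration of that reasoning, here’s a hypothetical heuristic for bucketing acquisition sources into likely audiences. The function name and source lists are ours for illustration, not anything Google Analytics provides; the source strings follow the format of GA’s ga:source values:

```python
# Hypothetical heuristic: map an acquisition source to a likely
# audience segment, mirroring the assumptions described above.
GOV_SOURCES = {"www1.nyc.gov", "nyc.gov"}       # assumed gov referrers
NEWS_SOURCES = {"technical.ly", "nytimes.com"}  # assumed news referrers

def likely_audience(source: str) -> str:
    """Guess the audience segment behind an acquisition source."""
    if source == "(direct)":
        return "likely staff or repeat users who already know the URL"
    if source in GOV_SOURCES:
        return "likely DCP staff or other city employees"
    if source in NEWS_SOURCES:
        return "likely members of the public"
    return "unknown"

print(likely_audience("(direct)"))
print(likely_audience("technical.ly"))
```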

Behavior

While we’re building an app, we meet with our customers once a week through the Scrum process to hear what they think about our progress and how we can better meet their goals. After we launch the app, users are encouraged to open GitHub issues noting any bugs or improvements they believe are important. But many users will never tell us about their experiences with our products. Because of this, we need to analyze the behavioral data we gather from analytics to understand the routes that users follow as they engage with the interface.

Because we want users to be able to find information quickly, bounce rates and exit rates prove to be better measures of functionality than session duration.

Bounce rate, the percentage of sessions in which a user exits the app without interacting with the interface, can be high for a number of reasons. One possibility is that the app did not immediately meet the user’s needs; this is more a problem of acquisition, as the referral or search route that the user followed brought them to the wrong place. Extremely low or extremely high bounce rates can also raise questions about the validity of the data itself. One of our customers on the Population FactFinder team pointed out a suspiciously low 0.4% bounce rate on our FactFinder report. We enabled the Google Analytics debugger through our environment.js config and found that some events weren’t being logged at all! It was a common but important mistake to learn from as we continue keeping track of our apps.

[Chart: The FactFinder bounce rate compared to number of users over a 5-month period. The bounce rate has a large gap in data after the initial months.]
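A gap like that is easy to miss on the dashboard. One way to catch it from the reporting side is to query total events per day and flag days with none; here’s a sketch, reusing the assumed `analytics` service object and placeholder VIEW_ID from the earlier snippet:

```python
# Sketch: flag days where an app logged no events at all, the kind of
# gap that tipped us off on FactFinder. Assumes the `analytics` service
# object and placeholder VIEW_ID defined in the earlier snippet.
response = analytics.reports().batchGet(
    body={
        "reportRequests": [{
            "viewId": VIEW_ID,
            "dateRanges": [{"startDate": "2018-02-01", "endDate": "2018-06-30"}],
            "metrics": [{"expression": "ga:totalEvents"}],
            "dimensions": [{"name": "ga:date"}],
            "includeEmptyRows": True,  # keep days with zero events
        }]
    }
).execute()

silent_days = [
    row["dimensions"][0]
    for row in response["reports"][0]["data"].get("rows", [])
    if row["metrics"][0]["values"][0] == "0"  # GA returns values as strings
]
if silent_days:
    print(f"{len(silent_days)} days with no events logged - check the tracking config")
```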

Exit rate is defined as the number of exits from a page divided by its number of page views. If a user is visiting Community Profiles, they ideally won’t exit the app until they’ve clicked on a community district and viewed its profile. If the user exits before doing so, either by leaving immediately or by only clicking on the “about” tab, we can assume that they either found the app difficult to navigate or realized that the app did not serve their purpose. Exit pages, the pages on which users leave the app, can be a tricky metric to draw conclusions from, because some people may use the app simply to “discover” (click around, read about the app, look at the data) while others may aim to download an entire report.
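To keep the two metrics straight, here are both definitions as plain arithmetic, with made-up numbers purely for illustration:

```python
# The two definitions side by side, with invented example numbers.
def bounce_rate(single_page_sessions: int, total_sessions: int) -> float:
    """Bounce rate: share of sessions with no further interaction."""
    return single_page_sessions / total_sessions

def exit_rate(exits: int, pageviews: int) -> float:
    """Exit rate: exits from a page divided by its total pageviews."""
    return exits / pageviews

# e.g. a profile page viewed 2,000 times with 300 exits
print(f"exit rate: {exit_rate(300, 2000):.1%}")      # 15.0%
# e.g. 120 one-and-done sessions out of 1,000 total
print(f"bounce rate: {bounce_rate(120, 1000):.1%}")  # 12.0%
```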

Another important metric to observe is an app’s most popular events. More complicated apps are likely to have more events per session. Examples of events include toggling a map layer, entering text in the search box, or clicking on a tab. ZoLa had an average of 22 events per session (among sessions that had events at all), whereas Community Profiles had only 5. This makes sense, because ZoLa builds more possible interactions into the app. The most popular event by far was “search input”, demonstrating that users are most interested in finding exact locations on the map.
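A query like the following sketch is what sits behind an observation like that, ranking event actions by volume. It again assumes the `analytics` service object and placeholder VIEW_ID from the first snippet:

```python
# Sketch: rank an app's most popular event actions, the query behind
# observations like "search input is the top event". Assumes the
# `analytics` service object and VIEW_ID from the earlier snippets.
response = analytics.reports().batchGet(
    body={
        "reportRequests": [{
            "viewId": VIEW_ID,
            "dateRanges": [{"startDate": "2018-01-01", "endDate": "2018-06-30"}],
            "metrics": [{"expression": "ga:totalEvents"}],
            "dimensions": [{"name": "ga:eventAction"}],
            "orderBys": [{"fieldName": "ga:totalEvents", "sortOrder": "DESCENDING"}],
            "pageSize": 5,  # just the top five actions
        }]
    }
).execute()

for row in response["reports"][0]["data"].get("rows", []):
    print(row["dimensions"][0], row["metrics"][0]["values"][0])
```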

What to Expect from Analyzing an App’s Analytics

Trying to draw strict conclusions from analytics for data-sharing apps may leave developers feeling at a loss for a precise measurement of their product’s performance. While expecting to quantify progress may be ambitious, working with analytics can certainly help developers build a better picture of what users might not be able to tell them directly. We can learn our most important app features (the map search function), how users access our apps (city websites), and where we may need to improve (if no users are clicking on a button, we may need to make it more noticeable or remove it altogether). Keeping track of this data will help civic tech developers better connect with their users’ needs and keep those needs in mind while continuing to build on and improve their products.

Taylor McGinnis
Mapper, web developer, and data analyst focused on city planning issues