How we mapped mobile data in South Bend

Patrick McGuire
Innovation in South Bend
Mar 21, 2023

Trash trucks, 30 phones, and a pilot to assess network strength in our city

As the City of South Bend’s Civic Innovation team works to better connect residents in our community, we need to first understand residents’ current experiences across connectivity options. This understanding is especially critical as we prepare for historic investments in internet infrastructure and assess where we should target this funding in our city.

The new FCC National Broadband Map offers important data on advertised speeds for the fastest data plans available to consumers. In addition to participating in the challenge process to improve the accuracy of the FCC map, our team has been collecting data at the community level to better assess the speeds and performance residents see every day. For home broadband, this has involved a home internet speed test and survey campaign to measure internet speeds and costs. For mobile broadband, the type of internet that 12.4% of South Bend residents rely on exclusively (2021 American Community Survey), we took a different approach.

While we can’t analyze a resident’s home network without being inside their home, mobile networks are publicly accessible. All you need is a phone with a data plan and you can perform rigorous, repeatable tests for any wireless carrier.

With this in mind, in the late spring of 2022 we turned to Dr. Monisha Ghosh and Dr. Nick Laneman, faculty in Notre Dame’s Wireless Institute and members of the South Bend Connectivity Coalition. In her previous roles as a faculty member at the University of Chicago and Chief Technology Officer at the FCC, Dr. Ghosh pioneered innovative methods of testing mobile networks. She worked with a former student to launch SigCap, an app that passively measures cellular and Wi-Fi signal type and quality. She also worked with the FCC to pilot the collection of mobile data on US Postal Service vehicles. This pilot showed the great potential in using vehicles that reach almost every address in our communities to conduct mobile data testing. This ubiquitous testing would ensure that all parts of a community, especially areas with lower incomes that may see under-reporting in speed test surveys, are represented with meaningful data.

As our team considered similar assets we might be able to leverage for a pilot in South Bend, we realized that the City deploys its own vehicles to nearly every address in South Bend on a weekly basis: our solid waste trucks. Thanks to the participation of the City’s Solid Waste and Public Works teams, we would be able to test mobile data speeds at almost every residence along our trash truck routes.

Initial Deployment + Troubleshooting

To capture the current state of South Bend’s mobile networks, our team worked with Dr. Ghosh to develop this approach:

  • We deployed 30 Android devices (all 5G-capable Google Pixel 6 models) across South Bend’s 10 trash pickup trucks.
  • A test kit with an AT&T phone, a Verizon phone, and a T-Mobile phone was installed in each truck. The kits were housed in a 13-inch hard shell laptop case.
  • The kits would be deployed on the routes for a minimum of 12 weeks. As trash trucks would cover the entire city each week, this would allow us to gather 12 data points per residence in South Bend.

More specifically, we set up our phones as follows:

  • We equipped each phone with an unlimited 5G data plan, the FCC Speed Test App, SigCap, and MacroDroid, an app which uses macros to run and export measurements from SigCap and the FCC Speed Test App on a daily basis.
  • The FCC Speed Test App would perform a speed test every minute during trash pickup times (6 AM to 4 PM), Monday through Thursday, to capture download speed, upload speed, and latency, as well as information about network technology and signal quality (the sketch after this list illustrates this schedule).
  • SigCap would run continuously in the background, measuring signal strength, connection type, and other key network variables.
  • The phones would be plugged into the trash trucks’ standard accessory ports for constant charging.
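For illustration, the scheduling logic behind our macros looked roughly like the sketch below. The actual configuration lived in MacroDroid's visual macro editor, so this Python is purely a stand-in; the function and constant names are ours, not anything from the apps themselves.

```python
from datetime import datetime

# Purely illustrative: the real logic lived in MacroDroid macros, not Python.
TEST_START_HOUR = 6        # trash pickup begins around 6 AM
TEST_END_HOUR = 16         # and wraps up by 4 PM
TEST_DAYS = {0, 1, 2, 3}   # Monday through Thursday (Monday = 0)
TEST_INTERVAL_MINUTES = 1  # initial cadence: one speed test per minute

def should_run_speed_test(now: datetime, minutes_since_last_test: float) -> bool:
    """Return True if a speed test should be triggered at this moment."""
    in_pickup_window = TEST_START_HOUR <= now.hour < TEST_END_HOUR
    on_pickup_day = now.weekday() in TEST_DAYS
    interval_elapsed = minutes_since_last_test >= TEST_INTERVAL_MINUTES
    return in_pickup_window and on_pickup_day and interval_elapsed

# Example: 10:30 AM on a Tuesday, one minute after the last test -> True
print(should_run_speed_test(datetime(2022, 6, 14, 10, 30), minutes_since_last_test=1.0))
```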

With this strategy, we planned to cover all of South Bend with mobile speed tests by the early fall of 2022.

The outside of one of our testing kits
The inside of one of our testing kits
A testing kit (circled) deployed on a solid waste vehicle

Following successful initial tests, we prepared and installed the 30 phones and 10 kits.

After several weeks, some phones were uploading up to 100 daily tests, but others were not uploading at all. As it turned out, many of the phones were powering down during the day for several reasons:

  1. There was insufficient power to keep the phones charged. Though the phones were plugged in, the trucks’ accessory ports often could not supply enough power to charge the batteries faster than the tests drained them. Speed tests are battery-intensive, and an accessory port provides much less power than a standard outlet. Moreover, we were often splitting a single accessory port’s power across three phones.
  2. Phones were overheating in summer months. Many of the trucks in use had no air conditioning, and the phones in these trucks often shut down in the summer heat.
  3. Phones were running when trucks were not. Trash trucks often need to go in for repairs or maintenance. When this happened, the phones were unable to charge and their batteries would drain. Summer holidays posed a similar challenge: if a phone was programmed to run on a Monday that turned out to be the 4th of July (a trash holiday), it would run tests in a powered-down truck, depleting its battery within a few hours.

To address these challenges, our team decreased the frequency of our speed tests — from once every minute to once every 6 minutes — to reduce the risk of battery depletion and overheating. We also adjusted our macros to run tests only when the phones were charging and, by extension, the trucks were powered on.
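For illustration, the revised trigger condition amounted to something like the following sketch. As before, the real logic lived in MacroDroid macros; the `is_charging` flag stands in for the charging-state check the phone exposes.

```python
from datetime import datetime

TEST_INTERVAL_MINUTES = 6    # reduced from 1 to limit battery drain and heat
PICKUP_HOURS = range(6, 16)  # 6 AM to 4 PM
PICKUP_DAYS = {0, 1, 2, 3}   # Monday through Thursday

def should_run_speed_test(now: datetime, minutes_since_last_test: float,
                          is_charging: bool) -> bool:
    """Revised trigger: same pickup window as before, a 6-minute cadence, and
    a requirement that the phone be charging (i.e., the truck is running)."""
    return (now.hour in PICKUP_HOURS
            and now.weekday() in PICKUP_DAYS
            and minutes_since_last_test >= TEST_INTERVAL_MINUTES
            and is_charging)
```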

These adjustments and the arrival of cooler weather helped the phones stay powered on for longer and, by the end of 2022, we had completed 45,619 tests across all of the phones.

Test Numbers + Geographic Distribution

Our team began reviewing the results by mapping each test from the FCC Speed Test App in ArcGIS, a geospatial analysis platform. To make for easier comparisons across parts of the city, we summarized our results in small geographic areas: 0.15-square-mile hexagons covering South Bend. By taking an average of averages of the data within the hexagons, we could also produce citywide figures that wouldn’t be skewed by a large number of tests in one part of the city. This skew-resistance, it turned out, would prove critical given how our tests were distributed.
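To make that skew-resistance concrete, here is a toy example in Python (the speeds and hexagons are invented) of how pooling every test differs from averaging the per-hexagon averages:

```python
# Toy example: hexagon A has many tests on a fast corridor, hexagon B has few.
# Pooling every test lets A dominate; averaging the per-hexagon means does not.
tests_by_hexagon = {
    "hex_A": [95, 102, 98, 101, 97, 99, 103, 96],  # 8 tests, ~99 Mbps
    "hex_B": [12, 15],                              # 2 tests, ~13.5 Mbps
}

all_tests = [speed for speeds in tests_by_hexagon.values() for speed in speeds]
pooled_mean = sum(all_tests) / len(all_tests)  # ~81.8 Mbps, dominated by hex_A

hexagon_means = [sum(s) / len(s) for s in tests_by_hexagon.values()]
mean_of_means = sum(hexagon_means) / len(hexagon_means)  # ~56.2 Mbps, each area weighted equally

print(f"Pooled mean: {pooled_mean:.1f} Mbps; average of hexagon averages: {mean_of_means:.1f} Mbps")
```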

Small hexagonal areas we generated for localized analysis

Reviewing the initial data, we soon realized that the number of tests was not consistent across carriers or across town. By the end of 2022, we had conducted the following number of tests for each carrier:

  • AT&T — 10,311
  • Verizon — 17,149
  • T-Mobile — 18,159

We also noticed that some areas of town had few or no tests, while some blocks had more than a thousand tests. Our team has a few hypotheses to explain these differences:

  1. Trash routes did not cover the entire city. This was, in part, because some areas simply have more trash pickups than others — industrial areas, for instance, saw very few tests compared to residential areas.
  2. Phones on slower networks may not have completed as many tests. The number of tests performed in a day of speed testing depends on the strength of the network being tested. The FCC’s Speed Test App works by measuring how long it takes the device to download or upload a set amount of data. If the entire amount cannot be downloaded or uploaded within an 8-second period, the test stops. Other steps in the testing process have a similar time-out setup. This means that tests take longer on slower networks, and those networks see fewer completed tests.
  3. Phones on weaker networks may have failed to upload, or even start, their tests. Phones operating on slower connections may have been unable to properly begin the testing process or successfully upload their results at the end of the testing day. Our team was able to manually upload tests on several phones that failed to upload automatically, but these failures may point to broader testing difficulties during our deployment period. Carriers with faster test results citywide saw a wider geographic testing distribution and a greater number of tests on average, suggesting that network strength may have been a determinant of the total number of tests.

Regardless of what caused these differences, our team wanted to cover as many areas of town as possible with meaningful test results across all carriers. Therefore, we conducted a series of additional tests in targeted neighborhoods and industrial areas.

Additional Drive Tests

To gather additional data in areas of town that saw few or no tests, our team configured a standard car with a test setup similar to the one in our solid waste trucks. This setup, however, included 6 phones and a 300W power inverter for additional charging capability. The power inverter allowed us to charge all 6 phones at once while increasing the testing frequency to one test every 30 seconds. We also uploaded the tests manually over a strong connection to ensure there were no upload failures.

After conducting approximately 3,500 additional tests over the course of a month across all carriers, our team had covered nearly every 0.15-square-mile area of South Bend. Importantly, though, the areas we covered with these additional drive tests still saw relatively few tests compared to other parts of the city. Results from these areas should be interpreted with more caution than results from areas with a large number of tests.

Results

Following these drive tests, we again analyzed our results in ArcGIS, summarizing the results by carrier in each hexagonal area. We focused our analysis on three key variables: upload speed, download speed, and 5G availability. We also provided a count of tests conducted in each area for context.

Importantly, our 5G availability numbers should be viewed with particular caution. We calculated these results based on the network technology reported at the start of a speed test in the FCC app. However, preliminary data from SigCap has shown that the technology reported in the speed test does not capture the full picture of the technology being used. For instance, if a test is reported as taking place over 5G, the phone may actually be combining signals from a single 5G radio with multiple LTE radios simultaneously. Our partners at ND Wireless plan to explore these results further, but we encourage viewers to treat our findings as a helpful, though limited, starting point.
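For context, the 5G availability figure for an area is simply the share of its tests whose reported starting technology was 5G. A minimal sketch of that calculation follows; the field name and the technology labels are assumptions for illustration, not the FCC Speed Test App's actual export schema.

```python
def five_g_share(tests):
    """Share of an area's tests whose reported starting technology was 5G.

    The "technology" key and the "NR"/"5G" labels are assumptions for
    illustration, not the FCC Speed Test App's actual export schema.
    """
    if not tests:
        return None  # areas with no tests are excluded from the summary
    flagged = sum(
        1 for t in tests
        if "NR" in t["technology"].upper() or "5G" in t["technology"].upper()
    )
    return flagged / len(tests)

# Example with made-up records: 1 of 3 tests started on 5G -> ~0.33
sample = [{"technology": "NR"}, {"technology": "LTE"}, {"technology": "LTE"}]
print(five_g_share(sample))
```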

The full results can be explored in this interactive app, and we share key citywide estimates of our results below.

A screenshot from the interactive app where you can explore our findings

We produced these figures by averaging the mean download speed, upload speed, and 5G availability percentage across all 377 hexagonal areas in South Bend for each carrier. To summarize the number of tests per area, we used medians due to a strong rightward skew (some areas had many tests, while the majority had fewer than 25 per carrier). We excluded from these calculations any areas where a carrier had no test results.
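The sketch below shows how such a summary can be produced from a per-hexagon table like the one we exported from ArcGIS. The column names and sample values are illustrative, not our actual data.

```python
import pandas as pd

# Illustrative per-hexagon, per-carrier summary; column names are assumptions.
hex_summary = pd.DataFrame({
    "hex_id":         [1, 1, 2, 2, 3],
    "carrier":        ["T-Mobile", "Verizon", "T-Mobile", "AT&T", "Verizon"],
    "mean_down_mbps": [92.0, 30.5, 88.1, 22.7, 33.6],
    "mean_up_mbps":   [13.4, 9.8, 12.7, 8.9, 9.4],
    "pct_5g":         [0.25, 0.05, 0.18, 0.16, 0.07],
    "test_count":     [31, 14, 22, 12, 16],
})

# Citywide figure per carrier: an unweighted average of the per-hexagon means,
# so a hexagon with 1,000 tests counts the same as one with 10.
citywide = hex_summary.groupby("carrier")[["mean_down_mbps", "mean_up_mbps", "pct_5g"]].mean()

# Median tests per area per carrier; hexagons with no tests for a carrier have
# no row in the table, so they are excluded automatically.
median_tests = hex_summary.groupby("carrier")["test_count"].median()

print(citywide)
print(median_tests)
```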

Download speed (Mbps, Megabits per second):

  • T-Mobile: 89.98
  • Verizon: 32.05
  • AT&T: 23.13
  • All carriers: 56.65

Upload speed (Mbps):

  • T-Mobile: 13.11
  • Verizon: 9.59
  • AT&T: 8.91
  • All carriers: 11.20

5G availability (average percent of tests conducted over 5G):

  • T-Mobile: 21.48%
  • AT&T: 15.59%
  • Verizon: 6.25%
  • All carriers: 14.41%

Median number of tests per area (excluding areas with no tests):

  • T-Mobile: 24
  • AT&T: 12
  • Verizon: 14
  • All carriers: 57

Next Steps and Lessons Learned

The data we collected in this project provide a strong foundation for understanding the capabilities of our city’s mobile networks. For the 12.4% of residents who rely exclusively on mobile data for internet access, these data give a clear view of the performance they are experiencing. Depending on the carrier, in some areas this performance falls below the FCC’s definition of broadband: 25 Mbps download and 3 Mbps upload.

These results show the potential for continued enhancements in local carrier service offerings, including more 5G buildout, while also showing the need for more public and private broadband connectivity options. In addition to the connections provided by mobile networks, our team is working to expand public internet access, high speed consumer broadband availability, and uptake of the Affordable Connectivity Program. We are eagerly seeking partners in this work, and will continue to gather and refine our understanding of residents’ experiences and needs through data collection efforts like this project.

We also encourage other organizations and municipalities to explore similar testing initiatives in their communities, leveraging assets like trash trucks, buses, delivery vehicles, or street sweepers for constant testing. We hope, too, that this model can be applied to other forms of testing, such as environmental sensing. At a high level, we share four recommendations for any organization interested in conducting a similar study:

1. Ensure you have sufficient capacity to troubleshoot devices.

Regardless of your testing setup, you will have issues that need diagnosis, troubleshooting, and reconfiguration. For our small team, this became a challenging burden. However, a team of student researchers or interns would have boosted our capacity to keep tests constantly running, making for a more effective and efficient study.

2. Check on your testing distribution early and often.

While our testing distribution (both across carriers and across town) was a challenge we weren’t able to address fully in our study, an earlier intervention would have allowed us to course correct. By waiting to review our distribution until the end of our testing period, we missed the opportunity to troubleshoot the cause(s) of our inconsistency and implement potential solutions.

3. Consider using a power inverter.

Limited charging power was a critical challenge in our setup. For our study, we decided against using a power inverter, which can pull more power from a standard accessory port, because of worries about draining a truck’s battery and leaving the vehicle unable to start. However, power inverters can be configured to mitigate this concern, and a knowledgeable team can likely use them to avoid phone outages.

4. Get in touch.

Our team has learned quite a bit during this process, and we’re eager to share our experiences with organizations looking to do similar studies, or with anyone interested in our methodology. Please don’t hesitate to reach out to us at connectivity@southbendin.gov. We would be happy to discuss our process and any questions you may have.

We believe that this mapping is critical to ensuring all members of our communities have the connectivity resources they need to thrive, and we would be thrilled to work together to pursue this goal.
