The city in 3D: Using new sensing technologies to improve quality and accessibility of city streets
Well-maintained, walkable streets are essential for cities to thrive: they are associated with lively retail scenes, healthy communities and higher property prices. They’re also essential if you happen to use a wheelchair. But until recently, the quality of a city’s sidewalks was a data black hole. That is now beginning to change thanks to the ubiquity of cellphone cameras, and to emerging 3D sensing technologies.
In this post, we will walk through the steps to build a simple condition monitoring system for city sidewalks. Using cellphones and ArcGIS, we reproduced a section of the Broadway Curb-Cut Survey — a major New York City data collection exercise — to generate maps of sidewalk quality. But as anyone who unlocks their iPhone with Face ID knows, our phones are now going 3D. We trial the use of 3D scanners to extract physical measures of sidewalk accessibility — read on to see how these methods can help drive progress towards thriving and accessible city streets.
“Non-ADA-compliant curb cuts present enormous challenges to pedestrians like me who are blind or visually impaired. The bumps of the warning surfaces, part of an ADA-compliant curb cut, give me vital information about a street crossing. When the curb cut is flat, I have no clue about where the street begins, which is dangerous.”
- Audrey Schading, Advocate for Blind and Visually Impaired People
Give my Regards to Broadway
Running 13 miles across Manhattan from Bowling Green to Inwood, Broadway is an iconic street and a birthright of all New Yorkers. But for the city’s 100,000 wheelchair users, few barriers are more critical than the lack of a usable ramp between sidewalk and street crossing. Indeed, wheelchair user associations have turned to litigation to hold cities like New York to the standards set out in the 1990 Americans with Disabilities Act (ADA). However, data gaps are a fundamental problem: without data on the state of its sidewalks, how can the Department of Transportation allocate its budget or monitor whether performance is meeting agreed standards?
To size up this challenge, Manhattan Borough President Gale Brewer organized the Broadway Curb-Cut Survey. Published in 2015, it provides the most detailed snapshot of accessibility along a major urban thoroughfare. Despite some $243 million invested over a 15-year period to improve sidewalk accessibility, more than a quarter of intersections were too steep and nearly 9 in 10 lacked necessary safety features. The survey took some 40 volunteers the better part of a year to complete; could a smartphone app and crowd-sourcing generate this data in a more sustainable way?
The Easy Part: Building a Survey App
Creating a cellphone-based field survey has become an easy task. We used ESRI’s Survey123 tool; its interface should be familiar to anyone who has built a survey with Google Forms. We defined the minimal data requirements considered necessary for the city to understand the state of its sidewalks. Our twelve-question survey instrument (see it here on ArcGIS online) allows multiple field agents to enter measurements and condition grades, snap a photo, and hit ‘geo-locate’ to record latitude and longitude via their phone GPS.
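Under the hood, Survey123 forms are authored in the XLSForm standard: a spreadsheet with a row per question and columns for the question type, field name and label. A minimal sketch of how fields like ours might be defined (field names and the choice list are illustrative, not our published form):

```
type                  name          label
geopoint              curb_loc      Geo-locate the curb
decimal               curb_height   Curb height (cm)
decimal               lip_height    Lip/bump height (cm)
select_one condition  condition     Overall condition grade
image                 photo         Photo of the curb
```

A `select_one` question draws its answer options from a separate choices worksheet, which is where our ADA-violation-based grading scale would be listed.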
Like any field survey, ours required tweaking after the first trial runs:
- Firstly, team members required standardized definitions for the measurements. An instruction sheet helped standardize measurements of curb height and other features.
- Secondly, we added a classification scheme for overall curb condition. Physical measurements by themselves were not intuitive, so we developed a scale based on how many ADA violations were present.
- Thirdly, iPhone GPS precision unfortunately falls short, especially in New York’s urban canyons. Curb records ended up 15–50 meters from their true location. GPS receiver units from firms like Trimble or Bad Elf remedy this and interface perfectly with Survey123; but that’s no solution when crowd-sourcing city data. Instead, we added new fields to manually locate curbs by their orientation and cross-street. With a little help from the OpenStreetMap API and Python scripts, the records can be snapped to their true locations.
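The snapping step above can be sketched in a few lines of Python. This is a minimal illustration, not our production script: it assumes the intersection coordinates have already been fetched from OpenStreetMap (the coordinates and the drift threshold below are illustrative), and it simply moves each record to the nearest known intersection when that intersection lies within plausible GPS drift.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # Earth radius in meters
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

def snap_to_intersection(record, intersections, max_drift_m=60):
    """Replace a record's noisy GPS fix with the nearest known
    intersection, provided it lies within expected GPS drift."""
    lat, lon = record["lat"], record["lon"]
    name, (ilat, ilon) = min(
        intersections.items(),
        key=lambda kv: haversine_m(lat, lon, kv[1][0], kv[1][1]),
    )
    if haversine_m(lat, lon, ilat, ilon) <= max_drift_m:
        return {**record, "lat": ilat, "lon": ilon, "intersection": name}
    return record  # too far from any known intersection; leave as-is

# Intersection coordinates as they might come back from an OSM query
intersections = {
    "Broadway & E 23rd St": (40.74096, -73.98920),
    "Broadway & E 26th St": (40.74278, -73.98879),
}
noisy = {"lat": 40.74110, "lon": -73.98895}  # roughly 25 m off the corner
snapped = snap_to_intersection(noisy, intersections)
```

In practice the manually recorded cross-street and orientation fields act as a sanity check, so a record is never snapped to the wrong corner of a busy block.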
Having refined the tool, we surveyed Broadway between Union Square and 26th St.
Let’s Take This Into 3D
Architects and civil engineers increasingly use 3D imagery of cities when planning infrastructure projects — indeed NYU’s Debra Laefer (who advised us on this work) has pioneered high-resolution LiDAR imagery to help Dublin build out its subway system without jeopardizing historic buildings. LiDAR sensors send out laser pulses to measure distance, generating point clouds that (once cleaned up) give a 3D model of all objects in range.
The 3D sensor mounted in the iPhone X works on a similar principle: it projects 35,000 infrared dots to create a 3D model of the user’s face. Note that the 3D modeling community has built increasingly strong platforms to share 3D objects, such as SketchFab. Retailers like Walmart and Ikea are sinking funds into augmented reality apps premised on users being able to scan their living room and check which 3D rendered table or sofa fits best — so we confidently expect longer-range 3D scan capabilities to inhabit your phone within 2–3 years.
How could the miniaturization of LiDAR style technologies help city agencies? To find out, we compare two technologies: a hand-held 3D scanner and an industry-standard terrestrial LiDAR scanner.
3D Scanners: Handheld Versus Ground-Mounted
Occipital, a tech firm in San Francisco, offers a mass-market 3D device called the Structure Sensor for $379. It is among the most accessible 3D scanners on the market; we considered it a good approximation of how mobile phone sensors may evolve.
We attached a Structure Sensor to a trusty iPad Mini and used it to measure street safety features. Several software options are available depending on the 3D capture requirements — whether for room interiors (favored by interior designers) or close-range objects (game designers modeling figurines). Occipital’s Capture package worked well for our task: capturing point clouds of street features within the 1–4 meter range.
To provide a high-end technical benchmark for our handheld scans, we used a Leica P40 terrestrial LiDAR scanner. The Leica P40 is serious kit: retailing at around $100,000, it’s a hefty ground-mounted item producing 3D models with a 3-millimeter margin of error at a 50-meter range. Uses include producing Building Information Models, validating newly constructed industrial facilities against their architectural plans, or reconstructing crime scenes. More challenging for outdoor spaces is the fact that a single 360-degree scan takes long enough that pedestrian traffic disrupts lines of sight, requiring extra clean-up.
Both methods give us LAS file outputs: this is the standard format for exchange of LiDAR data, containing x,y,z coordinates for each recorded point. Cleaning point clouds involves shaving off errant sets of points that might result from a pedestrian or a bird passing in front of the sensor; we used CloudCompare (an open source package) for this purpose. Click the video for our Leica 3D scan taken at 26th and Broadway.
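The clean-up step is conceptually simple: a stray point’s nearest neighbours are far away, while a genuine surface point sits in a dense cluster. CloudCompare offers a statistical outlier removal filter along these lines; the sketch below is a minimal pure-Python illustration of the idea (not CloudCompare’s implementation), scoring each point by its mean distance to its k nearest neighbours and dropping the statistical outliers.

```python
from math import dist
from statistics import mean, stdev

def remove_outliers(points, k=3, n_sigma=2.0):
    """Statistical outlier removal: score each point by the mean
    distance to its k nearest neighbours, then drop points scoring
    more than n_sigma standard deviations above the average."""
    scores = []
    for p in points:
        nearest = sorted(dist(p, q) for q in points if q is not p)[:k]
        scores.append(mean(nearest))
    cutoff = mean(scores) + n_sigma * stdev(scores)
    return [p for p, s in zip(points, scores) if s <= cutoff]

# A tight grid of curb-surface points plus one errant return,
# as might be left by a pedestrian or bird crossing the beam
cloud = [(x * 0.1, y * 0.1, 0.0) for x in range(5) for y in range(5)]
cloud.append((10.0, 10.0, 3.0))  # stray point far from the curb
cleaned = remove_outliers(cloud)
```

Real point clouds hold millions of points, so production tools use spatial indexes rather than this quadratic scan, but the filtering logic is the same.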
With our streamlined survey instrument, mapping Broadway from Union Square to 26th Street could be completed in an afternoon, producing a web map that draws quick attention to trouble-spots. Users can pull up their choice of query, for example, all curbs with multiple ADA violations north of 23rd St.
The stretch has some excellent pedestrian crossings — of the type associated with inclusive streets and thriving retail — but other stretches where access is hindered by crumbling concrete, potholes and puddles. The output allows for quick custom queries: for example, show location and photos of all curbs where lip/bump height exceeds 15 centimeters.
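Because every record carries coordinates, measurements and a violation count, such queries reduce to simple filters. A hypothetical sketch (the records, field names and latitude below are illustrative, not our survey data):

```python
# Hypothetical survey records with fields like those our form captures
records = [
    {"id": 1, "lat": 40.7396, "violations": 3, "lip_cm": 18.0},
    {"id": 2, "lat": 40.7421, "violations": 0, "lip_cm": 0.5},
    {"id": 3, "lat": 40.7428, "violations": 2, "lip_cm": 16.5},
]

LAT_23RD_ST = 40.7410  # approximate latitude of Broadway & 23rd St

def multiple_violations_north_of(records, latitude, min_violations=2):
    """All curbs with multiple ADA violations north of a latitude."""
    return [r for r in records
            if r["violations"] >= min_violations and r["lat"] > latitude]

flagged = multiple_violations_north_of(records, LAT_23RD_ST)
```

In ArcGIS Online the same filter is expressed through the web map’s query interface rather than code, but the underlying attribute logic is identical.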
Of most interest, the Structure Sensor scans were accurate enough to extract physical measurements back at base. Utility companies increasingly use LiDAR-based models to take physical measurements — for example, to check clearance under a bridge or power lines — without sending field staff to the location. In CloudCompare, we manually extracted distances and slopes using the tape measure function. Key measurements for accessibility of our streets — such as ramp width, slope and lip/bump height — can indeed be generated using handheld devices. This is interesting in part because we found the process of measuring curbs to be repetitive and, frankly, slightly embarrassing. Shoppers jostled us and residents raised eyebrows as we crouched by the Flatiron Building with tape measures. Snapping a 3D picture from a future iPhone, by contrast, raises few such qualms — a fact that could help agency staff or volunteers verify whether accessibility features are in good shape.
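Once two points have been picked off a scan, deriving a running slope is elementary geometry: vertical rise over horizontal run. A minimal sketch (the coordinates are illustrative, not from our scans; the 1:12 limit is the ADA maximum running slope for curb ramps):

```python
from math import hypot

def ramp_slope_pct(bottom, top):
    """Running slope (%) between two 3D points on a ramp centreline:
    vertical rise divided by horizontal run."""
    dx, dy, dz = (t - b for t, b in zip(top, bottom))
    run = hypot(dx, dy)  # horizontal distance in the xy-plane
    return 100.0 * dz / run

# Two points picked off a point cloud, in metres (illustrative values)
bottom = (0.0, 0.0, 0.00)
top = (1.2, 0.0, 0.12)

slope = ramp_slope_pct(bottom, top)       # 12 cm rise over a 1.2 m run
ADA_MAX_SLOPE_PCT = 100.0 / 12.0          # 1:12, i.e. about 8.33%
compliant = slope <= ADA_MAX_SLOPE_PCT
```

Lip height falls out even more directly, as the vertical offset between a point on the ramp edge and one on the road surface beside it.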
The Way Forward: Cities in 3D
Data on the physical form of cities is extremely hard to acquire. It’s a key challenge because city agencies worldwide are struggling to link their annual expenditure cycles to objective data about asset condition: whether that’s the state of public buildings, condition of road surfaces, or quality measures at parks.
We found that a smartphone-based survey could quickly help prioritize interventions across a stretch of Broadway — but the man-hours needed to collect this information are still prohibitive. 3D imaging could fill that gap. Indeed, we’re noticing a small research arms race in this area. University researchers and start-ups are exploring ways to mount LiDAR sensors on vehicles: a method that can be applied in cities globally.
While we extracted measurements from our 3D models by hand, the key challenge for scaling this approach is automated feature extraction using machine learning. This is a tough and computationally intensive challenge but has potential to change large industry sectors — such as insurance claims adjusting — as well as helping cities target maintenance and investments better. New metadata formats to combine LiDAR or streetview imagery with data on vehicle position and camera orientation promise to make exploitation of these sources easier. Our method, meanwhile, could be incorporated by city departments to improve field data collection — or by civic groups for crowd-sourcing campaigns. Just one way in which the future of city data is in 3D.
This blog summarizes work by Nick Jones, Charlie Moffett and Hans He at NYU Center for Urban Science and Progress in spring 2018. We hope you found this post helpful — get in touch if you’d like to continue the conversation!