Making the inspector123 App

Recently I was hired by a Hudson Valley company called Zelacom to make an app for their inspector123 platform. Inspector123 is a searchable nationwide database of field inspectors, who inspect sites such as restaurants before those sites get insured.

These inspectors go to a site, take a ton of photos, and — until now — had to do a complicated dance of email and file transfers to upload their findings. This is truly a perfect use case for an app: now inspectors can take photos, add captions, and sync over LTE or Wi-Fi, all in one simple app.

It was a privilege to get an opportunity to design and build the app from scratch. Here’s how the process went.

(And here’s the finished app for iOS and Android.)


It’s always exciting to start with a blank slate, but the first important step is to narrow everything down! We started by making a detailed spec for version 1.

Then I went through and did a time estimate, broken down by feature.

From there, I moved to pen-and-paper to do some rough sketches of the interface.

Just kidding… I’ll spare you the sketches… let’s move on to Balsamiq wireframes:

It’s interesting to look back and see what changed as a result of iteration, feedback, and testing. I’ll show these same screens later on in the process…

But first, here are the wireframes translated into visual designs in Sketch:

From here it was time to get started building!

This is a React Native app, cross-platform for iOS and Android, with responsive layouts so it works across phones and tablets. I used Redux for state management, redux-persist for caching data across app sessions, and react-navigation for moving between screens.
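For the curious, here's roughly what that wiring looks like. This is a minimal sketch, assuming redux-persist v5 and React Native's built-in AsyncStorage as the storage engine; rootReducer and the 'inspections' slice are hypothetical stand-ins for the app's real state, not its actual code.

```typescript
import { AsyncStorage } from 'react-native';
import { createStore } from 'redux';
import { persistReducer, persistStore } from 'redux-persist';

import rootReducer from './reducers'; // hypothetical combined reducer

// Persist config: which storage engine to use and which slices to keep
const persistConfig = {
  key: 'root',
  storage: AsyncStorage,
  whitelist: ['inspections'], // hypothetical slice: cached inspection data
};

// Wrap the root reducer so persisted state is rehydrated on app launch
const persistedReducer = persistReducer(persistConfig, rootReducer);

export const store = createStore(persistedReducer);
export const persistor = persistStore(store);
```

The nice thing about this setup is that an inspector's in-progress work survives an app restart for free, which matters a lot when you're standing in a restaurant kitchen with spotty LTE.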

I’ve raved about React Native before, but I just want to reiterate how awesome it is, and how active the community is. Throughout the project I was active on GitHub and StackOverflow, asking questions and even submitting my first open source pull request! All of my questions were answered, sometimes within hours.

I ran into a few platform-specific issues, particularly around voice recognition and photos. Native voice recognition is handled differently on iOS and Android; for example, Android cuts off voice input by default after even the slightest pause. That’s not React Native’s fault, though… it’s a platform-wide decision made by Google. I also hit camera issues caused by manufacturer-specific software: HTC and Huawei, for example, customize the default camera behavior on Android, which affects how apps interact with the system camera. Android fragmentation is a real issue! Fortunately, I was able to overcome it thanks to timely updates to react-native-image-picker.
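To give a sense of the capture flow, here's a hedged sketch using the callback-style launchCamera API that react-native-image-picker shipped in its early versions (option and response fields vary across versions, so check the installed release's docs; capturePhoto and onCaptured are illustrative names, not the app's actual code):

```typescript
import ImagePicker from 'react-native-image-picker';

// Launch the system camera and hand back the captured photo's URI.
// onCaptured is a hypothetical callback the app would use to attach
// the photo (plus a caption) to the current inspection.
function capturePhoto(onCaptured: (uri: string) => void) {
  ImagePicker.launchCamera({ mediaType: 'photo', quality: 0.8 }, (response) => {
    if (response.didCancel) {
      return; // inspector backed out of the camera
    }
    if (response.error) {
      // This is where manufacturer-specific camera quirks tend to surface
      console.warn('Camera error:', response.error);
      return;
    }
    onCaptured(response.uri);
  });
}
```

Funneling every photo through one entry point like this meant that when a device-specific bug appeared, there was exactly one place to patch once the library update landed.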

The rest of the build process was a mix of collaboration, testing, and continual feedback cycles. There’s a lot you can only learn from using a real build in your hands, so I used TestFlight and Google Play Alpha to distribute builds every couple of days.

By the end, here’s what it looked like:

This project was a blast to work on, and I’m excited to keep iterating on the app as inspectors start to use it in the field.

It’s available now for iOS and Android.

Any questions, feel free to reach out at info@dangurney.net.