Building Context-Aware Notifications

Why we think location—as well as motion detection—matters when it comes to sending great news notifications, and how we used Apple’s CoreMotion framework to bring them to life.

Ajay Chainani
The Lenfest Local Lab @ The Inquirer
5 min read · Dec 4, 2018

--

Image by Dave 77459 used under CC BY

How relevant or useful a news notification is to someone depends a lot on the content — are you interested in this political outcome, game result or breaking local news? Its usefulness also depends on the action it lets you take — can you read an article, share an infographic, watch a live stream or set a reminder? But beyond the content and utility of an alert, we thought another factor that makes a news notification relevant or useful was being overlooked: the user’s current location.

Why ‘current location’ matters to notifications

One way to think about how someone’s location matters when receiving an alert is to consider these two examples: You’re driving and you get a notification about playing the next level of Fruit Ninja, or you’re walking in the park on a Sunday afternoon and you get a notification about a cheap gas station nearby. These notifications probably aren’t relevant or useful to the person receiving them. We wondered if a well-timed, location-aware news notification could be a better experience, and that’s why we built HERE for Local Journalism, an app that sends local news notifications as people approach the place a story has been written about.
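To make that idea concrete, here is a minimal sketch of the kind of geofence trigger such an app needs. It uses CoreLocation region monitoring with made-up coordinates and identifiers; it is an illustration of the approach, not the actual HERE implementation (which is open-sourced and linked at the end of this post).

import CoreLocation

final class StoryGeofencer: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func monitorStory() {
        manager.delegate = self
        manager.requestAlwaysAuthorization() // region events can arrive in the background

        // Hypothetical 200m region around the place a story was written about
        let region = CLCircularRegion(
            center: CLLocationCoordinate2D(latitude: 39.9526, longitude: -75.1652),
            radius: 200,
            identifier: "story-1234")
        region.notifyOnEntry = true
        region.notifyOnExit = false
        manager.startMonitoring(for: region)
    }

    // Fires as the user approaches the story's location
    func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
        // Decide here whether this story's notification should be sent
    }
}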

Why current location alone wasn’t enough

However, after building our first prototype, we realized that triggering notifications based on location data alone could make for a bad experience. For example, one of our internal beta testers reported receiving “maybe ten” notifications while riding a cab up a city street in Philadelphia. She couldn’t read the stories about the places she was passing, and it wasn’t an ideal experience.

Keying in on that feedback, our UX designer Faye Teng helped us think through other situations someone might be in when they receive a location-aware alert: How would the app work if they were biking or driving? What happens if they are in a rush? What should the app do if someone is on public transportation, or in a city versus a suburb?

These questions forced us to think about whether people should receive alerts while driving, but we still weren’t sure what to do. We considered sending those alerts silently, or not sending them at all, to reduce distraction and increase relevance. Either way, we knew our application would need to be able to tell when someone was moving so it could react accordingly.

Was there a way to solve this problem?

Apple’s CoreMotion framework gives developers access to the data generated by a device’s various sensors: the accelerometer, gyroscope, pedometer, magnetometer, and barometer. This was the data we would need to predict a user’s movement, so we started researching how to implement the framework. What we didn’t know at the time was exactly how much information the framework would give us about someone’s activity, or how tricky it would be to implement in a way that made sense.

Luckily, with the introduction of the extremely energy-efficient M7 motion coprocessor and the CMMotionActivityManager API in 2013, Apple made it considerably easier to predict what an app user is currently doing. Here are the categories provided by the CMMotionActivityManager:

  • Walking
  • Automotive
  • Stationary
  • Cycling
  • Unknown
  • Running

It also provides a confidence rating for each prediction, ranging from low to high, and it can sometimes report a user in two states at once — e.g. “Automotive, high confidence” and “Stationary, low confidence”.
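To illustrate, here is roughly how those category flags and the confidence value surface in code; the describe helper below is ours for this post, not part of CoreMotion.

import CoreMotion

// Illustrative helper: summarize a CMMotionActivity for display or logging
func describe(_ activity: CMMotionActivity) -> String {
    var states: [String] = []
    if activity.walking { states.append("Walking") }
    if activity.running { states.append("Running") }
    if activity.cycling { states.append("Cycling") }
    if activity.automotive { states.append("Automotive") }
    if activity.stationary { states.append("Stationary") }
    if activity.unknown { states.append("Unknown") }

    let confidence: String
    switch activity.confidence {
    case .low: confidence = "low"
    case .medium: confidence = "medium"
    case .high: confidence = "high"
    default: confidence = "unrecognized" // future-proofing for new cases
    }
    // e.g. "Automotive, Stationary (high confidence)"
    return states.joined(separator: ", ") + " (\(confidence) confidence)"
}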

Prototype screenshots showing how we exposed Apple’s motion activity information during testing.

The following is a simplified snippet of our code that predicts an app user’s activity:

import CoreMotion

// `manager` is a CMMotionActivityManager instance; `currentActivity` and
// `stoppedDrivingAt` are stored properties on the enclosing class.
func startTrackingMotionActivity(handler: @escaping (CMMotionActivity) -> Void) {
    manager.startActivityUpdates(to: .main) { activity in
        guard let activity = activity else { return }
        // If the previous prediction was automotive, record when driving stopped
        if let lastActivity = self.currentActivity, lastActivity.automotive {
            self.stoppedDrivingAt = Date()
        }
        self.currentActivity = activity
        handler(activity)
    }
}
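A call site might look something like this, assuming for the sake of the example that the method above lives on a shared MotionManager singleton:

// Hypothetical call site: log each new activity prediction as it arrives
MotionManager.shared.startTrackingMotionActivity { activity in
    print("automotive: \(activity.automotive), unknown: \(activity.unknown)")
}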

A wrinkle we found was that these activity predictions could differ across members of our team: one person’s phone reported the “automotive” state while she rode an elevated train, while another’s reported “unknown” on an underground train.

In the end, and for the purposes of experimentation, we decided to skip sending notifications when someone is in the “automotive” (driving) state, and also — to account for stop lights — to skip alerts if they had been driving at any time within the last two minutes. We also skip notifications when someone’s predicted state is “unknown,” since in our testing that happened most often when someone was on a train, and we don’t want people to receive alerts in that context.

Here is another snippet of the code that decides when to skip notifications:

// These computed properties live alongside startTrackingMotionActivity(handler:) above.

var isDriving: Bool {
    guard let activity = currentActivity else {
        return false
    }
    return activity.automotive
}

// Minutes of grace period after driving ends, to account for stop lights
var drivingThreshold: Int {
    return 2
}

var hasBeenDriving: Bool {
    if isDriving { return true }
    guard let lastDroveAt = self.stoppedDrivingAt else {
        return false
    }
    // True if driving ended within the last `drivingThreshold` minutes
    return lastDroveAt > Date().addingTimeInterval(TimeInterval(-drivingThreshold * 60))
}

var isUnknown: Bool {
    guard let activity = currentActivity else {
        return true
    }
    return activity.unknown
}

var skipNotifications: Bool {
    // If activity data is unavailable on this device, never suppress notifications
    guard CMMotionActivityManager.isActivityAvailable() else {
        return false
    }
    // "Unknown" most often meant a train in our testing, so skip alerts
    if self.isUnknown {
        return true
    }
    return self.hasBeenDriving
}
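For illustration, here is one way the skipNotifications check could gate a local alert. The MotionManager.shared singleton and the UserNotifications wiring are assumptions for this sketch, not necessarily how the app schedules its notifications.

import UserNotifications

// Illustrative: only schedule the story alert if the motion context allows it
func maybeNotify(title: String, body: String) {
    guard !MotionManager.shared.skipNotifications else { return }

    let content = UNMutableNotificationContent()
    content.title = title
    content.body = body

    let request = UNNotificationRequest(
        identifier: UUID().uuidString,
        content: content,
        trigger: nil) // a nil trigger delivers the notification immediately
    UNUserNotificationCenter.current().add(request)
}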

We launched the app recently and are still gathering data on the effectiveness of these context-aware notifications; we’ll be posting updates here on Medium.

In the meantime, we’ve open-sourced all of our code for the app and its CMS here. Please contact us if you have any questions or would like to try launching the app in your own area.

The Lenfest Local Lab is a small, multidisciplinary product and user experience innovation team located in Philadelphia, PA, and supported by The Lenfest Institute for Journalism.

The Lenfest Institute for Journalism is a non-profit organization whose mission is to develop and support sustainable business models for great local journalism. The Institute was founded in 2016 by entrepreneur H.F. (Gerry) Lenfest with the goal of helping transform the news industry in the digital age to ensure high-quality local journalism remains a cornerstone of democracy.
