Another Look at Three iOS 10 Frameworks

One of the reasons I wanted to learn Swift is my interest in the hardware possibilities of a modern smartphone. When you write software for iOS, you get powerful hardware control right out of the box.

This post is a short look at some of iOS 10’s new features that still seem under-utilized to me, particularly in conjunction with recent third-party APIs for artificial intelligence and machine learning.

Haptic Feedback

You’ll need an iPhone 7 or 7 Plus, but the new UIFeedbackGenerator seems like the next step in simplifying interface design. If icons are generally better than words for conveying information quickly, think about the power of subtle touch.
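
The API is tiny: you pick one of three concrete generators, call prepare() to spin up the Taptic Engine, and fire it at the right moment. A minimal sketch:

```swift
import UIKit

// Prepare the Taptic Engine ahead of time to reduce latency.
let generator = UIImpactFeedbackGenerator(style: .medium)
generator.prepare()

// Fire a medium impact tap, e.g. when a draggable view snaps into place.
generator.impactOccurred()

// The other two flavors: selection changes and notification-style feedback.
UISelectionFeedbackGenerator().selectionChanged()
UINotificationFeedbackGenerator().notificationOccurred(.success)
```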

Make a messaging app that uses touch instead of emoji to convey feeling. Build a virtual instrument that subtly vibrates with its sound output. Finally actualize the Facebook ‘poke’ as a standalone app.

Robust Notifications

Aside from letting you customize your own notifications, iOS 10 unlocks new possibilities for how and when your app can notify users. You can trigger a notification when a user enters a location (or performs some other physical activity, like reaching a certain altitude) or hits a date on their calendar, and you can attach custom actions to the notification, as in the sketch below.
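
Here’s roughly what a location-triggered notification with a custom action looks like using the UserNotifications framework. The coordinates, identifiers, and copy are placeholders, and location triggers also require Core Location permission:

```swift
import UserNotifications
import CoreLocation

// Ask for notification permission early in the app lifecycle.
UNUserNotificationCenter.current().requestAuthorization(options: [.alert, .sound]) { granted, _ in
    guard granted else { return }

    // Fire when the user enters a region (placeholder coordinates).
    let center = CLLocationCoordinate2D(latitude: 37.33, longitude: -122.03)
    let region = CLCircularRegion(center: center, radius: 500, identifier: "office")
    region.notifyOnEntry = true
    region.notifyOnExit = false
    let trigger = UNLocationNotificationTrigger(region: region, repeats: false)

    // Attach a custom action the user can tap right from the banner.
    let checkIn = UNNotificationAction(identifier: "checkIn", title: "Check In", options: [])
    let category = UNNotificationCategory(identifier: "arrival", actions: [checkIn],
                                          intentIdentifiers: [], options: [])
    UNUserNotificationCenter.current().setNotificationCategories([category])

    let content = UNMutableNotificationContent()
    content.title = "Welcome back"
    content.body = "You’ve arrived."
    content.categoryIdentifier = "arrival"

    let request = UNNotificationRequest(identifier: "officeArrival", content: content, trigger: trigger)
    UNUserNotificationCenter.current().add(request)
}
```

Swap in UNCalendarNotificationTrigger or UNTimeIntervalNotificationTrigger for date- and timer-based delivery.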

This is powerful when you think of an app not as a contained user experience, but as a set of triggers embedded in the user’s environment.

Continuous Transcription

SFSpeechRecognizer allows continuous transcription of both real-time and recorded audio. The results are surprisingly accurate.
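
A minimal sketch of transcribing a recorded file (the bundled “memo.m4a” is a placeholder; live audio uses SFSpeechAudioBufferRecognitionRequest fed from an audio engine instead):

```swift
import Speech

// Speech recognition requires explicit user permission
// (and an NSSpeechRecognitionUsageDescription in Info.plist).
SFSpeechRecognizer.requestAuthorization { status in
    guard status == .authorized else { return }

    // "memo.m4a" is a placeholder file bundled with the app.
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          let url = Bundle.main.url(forResource: "memo", withExtension: "m4a") else { return }

    let request = SFSpeechURLRecognitionRequest(url: url)
    recognizer.recognitionTask(with: request) { result, _ in
        guard let result = result else { return }
        // Partial transcriptions stream in as the audio is processed;
        // isFinal marks the last, most refined one.
        print(result.bestTranscription.formattedString)
    }
}
```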

Fundamentally, combining this with new speech and language AIs is really exciting. IBM Watson’s AlchemyLanguage and Google Cloud Natural Language are the obvious players. Watson also offers translation and speech synthesis services, which could be your best friend in a foreign country.
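
As a rough sketch, once you have a transcript you could hand it to one of these services over plain HTTPS. The endpoint and JSON shape below follow Google’s Natural Language v1 REST docs as I understand them; verify both (and bring your own API key) before relying on this:

```swift
import Foundation

// Hypothetical helper: send a finished transcript to Google's Natural
// Language sentiment endpoint. URL and body shape per the v1 REST docs.
func analyzeSentiment(of transcript: String, apiKey: String) {
    let url = URL(string: "https://language.googleapis.com/v1/documents:analyzeSentiment?key=\(apiKey)")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let body: [String: Any] = [
        "document": ["type": "PLAIN_TEXT", "content": transcript]
    ]
    request.httpBody = try? JSONSerialization.data(withJSONObject: body)

    URLSession.shared.dataTask(with: request) { data, _, _ in
        guard let data = data,
              let json = try? JSONSerialization.jsonObject(with: data) else { return }
        print(json)  // documentSentiment.score ranges from -1.0 to 1.0
    }.resume()
}
```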
