The City That Listens Before It Talks
IMAGINE IF EVERYTHING IN A CITY WERE AN INPUT, ALLOWING CITIZENS TO CHOOSE THEIR DESIRED OUTPUT.
IOT IN SMART CITIES — A FUTURE SCENARIO
Imagine this scenario:
You are standing outside your home, listening to the sounds of the city. You hear kids playing in the park on the other side of the street, birds chirping above you, and a car honking; you bet it is a taxi driver. But you cannot see it; you can only guess. Your ears are your primary sense, because you have been blind your entire life.
You are on your way to a coffee shop to meet up with an old friend. You take your phone out of your pocket and ask Siri to open Smart City Mode. The app is already set to Blind Mode, so all you need to do is add the destination. “Walk straight ahead,” the app tells you. Wait a second, why not also ask the app to give you a culture tour on the way?
With the ability to choose different kinds of city modes, the Smart City app would be multi-functional, useful not only for blind people but for anybody.
WHAT IS YOUR NEED?
+ Traffic information for the blind, the deaf, kids, parking, etc.
+ Best food/coffee spots nearby
+ Pollution alerts
+ Culture tours
+ Tap your data into the city (preferences, history)
+ And many more …
With data collection, your phone can learn your patterns and offer suggestions across different categories. Every physical object in the city collects data that is made available through APIs, so anyone can build services for a smarter city.
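To make the idea of mode-based suggestions concrete, here is a minimal sketch of how an app might filter an open city data feed by the user's chosen mode. The feed schema, the mode names, and the event categories are all hypothetical illustrations, not a real smart-city API.

```python
# Hypothetical payload, shaped like what an open city data API might return.
CITY_FEED = [
    {"type": "traffic", "location": "Main St", "detail": "crosswalk signal active"},
    {"type": "pollution", "location": "Main St", "detail": "air quality moderate"},
    {"type": "culture", "location": "Old Town", "detail": "street-art walking tour"},
]

# Each city mode maps to the event categories that mode's user cares about
# (made-up mapping for illustration).
MODE_FILTERS = {
    "blind": {"traffic"},
    "tourist": {"culture"},
    "health": {"pollution"},
}

def suggestions(feed, mode):
    """Return the feed entries relevant to the chosen city mode."""
    wanted = MODE_FILTERS.get(mode, set())
    return [event for event in feed if event["type"] in wanted]

for event in suggestions(CITY_FEED, "blind"):
    print(f'{event["location"]}: {event["detail"]}')
```

In practice the feed would come from a live API and the suggestions would be spoken aloud or shown on screen, but the core pattern is the same: the city publishes everything, and each mode selects what matters to that user.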
How can this ecosystem in cities be used for security? For protecting the environment? The opportunities are endless.
This is a series of IoT posts — read the introduction here