DATA STORIES | GEOLOC INFO | KNIME ANALYTICS PLATFORM

iOS Shortcuts and KNIME to analyze geoloc info

Using no-code tools to explore your data

Diego Romero
Low Code for Data Science

--

As first published on Solvitur Ambulando

I like to disconnect as much as possible on vacation, but I also like to learn or try something new. On this occasion, I set out to review how many places I have visited over the last few years. Thanks to geotagged photos, this information is available, but how can I extract it in the simplest way possible? iOS Shortcuts and KNIME, as #nocode tools, are my choice to analyze the geoloc info.

Photos in Boston with geolocation information, shown in the native iOS application.

This year has been difficult, it is true, but luckily all my family and friends are doing well. During the Christmas holidays I always allow myself a bit of hindsight, and indeed I am lucky, even more so if I look further back. Even this year, when we have restricted our social contacts and our travel as much as possible, we have still been able to stretch our legs a bit when circumstances allowed.

If you use an iPhone, viewing this information is very simple: just go to the «Places» view to display the photos grouped by location. The problem begins when you want to use that information for any other kind of analysis, or show it by year or country in a visualization service like OpenStreetMap. There is no way to export that information.

Wait… but what if I use iOS Shortcuts?

I’ve been spending time on truly #nocode tools for a while. The premise is that you do not need any specific programming language, so there is no installing IDEs, compiling, testing, etc. Although opinions about these tools differ, the truth is that they are really easy to use if you have ever programmed. However, you still need some data literacy; otherwise you will hardly scratch the surface of the possibilities they offer.

Let’s define the problem: extract the geolocation information from the photos on my phone for a given period of time and export it in a standard format readable by other analysis tools.

If we used brute force, we could export all the photos with their metadata and start from there, but what would be the point if we already have access from the device itself? Let’s test whether the native iOS tool, Shortcuts, has the components needed to solve the extraction problem. This way we also avoid needing a non-negligible amount of space to store the exported photos.

Functionalities needed from iOS Shortcuts to solve my extraction challenge.

#nocode workflow: geoloc info

So… let’s check it out. I will first test with the photos from the last month to make sure everything works and to get a feel for the processing times.

iOS Shortcut workflow (device language = Spanish)
iOS Shortcuts workflow to extract latitude and longitude information from our photos

The workflow is in Spanish, but you can download it and run it on your own device with your own photos. Here are some notes so you can modify it at your convenience:

  • The third block asks for the name of the file where the extracted information will be saved in a standard CSV format (comma-separated values, although in this case the separator is a semicolon). You also have to include the «.csv» extension.
  • You can experiment with the different filters available in the photo access block: whether the photo belongs to a specific album, whether it is a screenshot, its orientation, whether it is marked as a favorite, …
  • I have included a small check to verify that the photo does indeed have location information, to avoid errors later on (a rough desktop equivalent of the whole extraction is sketched right after this list).
  • In my case, I use iCloud as a remote storage service. When you import the Shortcut on your device, you can choose whichever service suits you best.
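For readers without an iOS device, here is a minimal Python sketch of what the Shortcut does, assuming the photos have already been exported as JPEG files to a local folder. The folder name, column headers, and output file name are my own placeholders, not part of the Shortcut.

```python
# Rough desktop equivalent of the Shortcut: read GPS EXIF data from local
# JPEGs and write a semicolon-separated CSV. Names are placeholders.
import csv
from pathlib import Path

from PIL import Image  # pip install Pillow

GPS_IFD = 0x8825  # EXIF pointer to the GPS info block

def to_decimal(dms, ref):
    """Convert EXIF (degrees, minutes, seconds) plus N/S/E/W ref to decimal degrees."""
    deg = float(dms[0]) + float(dms[1]) / 60 + float(dms[2]) / 3600
    return -deg if ref in ("S", "W") else deg

rows = []
for photo in Path("photos").glob("*.jpg"):
    gps = Image.open(photo).getexif().get_ifd(GPS_IFD)
    # Same check as in the Shortcut: skip photos without location info.
    if not gps or 2 not in gps or 4 not in gps:
        continue
    lat = to_decimal(gps[2], gps.get(1, "N"))
    lon = to_decimal(gps[4], gps.get(3, "E"))
    rows.append({"file": photo.name, "latitude": lat, "longitude": lon})

# Semicolon-separated, like the file produced by the Shortcut.
with open("geoloc.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["file", "latitude", "longitude"], delimiter=";")
    writer.writeheader()
    writer.writerows(rows)
```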

In my case, my library has 8,140 photos with location information. The workflow processed all of them correctly in 9 minutes 20 seconds, that is, a little more than 14 photos per second. I used an 11″ 256 GB iPad Pro (2020) for this.

Next step: Basic Data Analytics with KNIME

So far so good. We already have the info in a standard CSV file. As you may notice, I’ve also included all the metadata from the photos in the last column, just in case. The next part of the challenge is to upload and view that information in an open service like OpenStreetMap. For this I will use a data processing and analysis tool, KNIME Analytics Platform, which appears in the Gartner Magic Quadrant and has a fully Open Source version (you can download it here).

My intention is not to explain in detail how to use this tool, which is why I have made a version of the workflow available on the KNIME Hub so that you can use it together with the file generated in the previous step. The platform has an amazing level of documentation and learning material, including blogs for beginners, advanced books such as Codeless Deep Learning, and a YouTube channel with tutorials.

The platform offers enormous possibilities as a visual programming tool, maybe not exactly #nocode but #codeless. But do you need to program to extract the information an API returns in JSON format? Or to learn and implement the intricacies of Google authentication in Python to update a spreadsheet? Not at all, as you will see now.

KNIME workflow to visualize geoloc info from iOS Photos.

And this is only the beginning

This is the workflow generated with the desktop version of KNIME Analytics Platform. It has a single input: the CSV file generated with iOS Shortcuts. From there, we ingest each column, run a small check on the latitude and longitude ranges to ensure the points will be plotted correctly on the map, and extract the year from each photo so the points can be segmented by that variable.
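To make the logic of those nodes concrete, here is roughly what they do expressed with pandas. The column names ("latitude", "longitude", "date") are assumptions about the CSV headers, so adjust them to whatever your Shortcut wrote.

```python
# A pandas sketch of the KNIME steps: range check on coordinates, then
# year extraction for segmentation. Column names are assumptions.
import pandas as pd

df = pd.read_csv("geoloc.csv", sep=";")

# Keep only rows whose coordinates are plottable on a map.
valid = df[df["latitude"].between(-90, 90) & df["longitude"].between(-180, 180)]

# Derive the year from the photo's date so points can be grouped by it.
valid = valid.assign(year=pd.to_datetime(valid["date"]).dt.year)
print(valid.groupby("year").size())
```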

So far, not a single line of code. The main thing is to identify the components (ETL, row filtering, ranges, …) we need to process the information and solve the challenge we set ourselves. Is it helpful to know something about RegEx? Of course. Necessary? Absolutely not.

If you download the workflow, you can see how to link the information with Google Sheets and use them as a bridge to Google Data Studio. I’ve also included a branch that retrieves the region of each location using the Nominatim API (reverse geocoding).
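For comparison, this is roughly what that branch does when written by hand: a call to Nominatim’s public reverse-geocoding endpoint with the requests library. The User-Agent string is a placeholder; Nominatim’s usage policy asks you to identify your application and stay under one request per second.

```python
# Reverse geocoding a coordinate pair against the public Nominatim API.
import time
import requests

def reverse_geocode(lat, lon):
    resp = requests.get(
        "https://nominatim.openstreetmap.org/reverse",
        params={"lat": lat, "lon": lon, "format": "jsonv2"},
        headers={"User-Agent": "geoloc-photos-demo"},  # placeholder; identify your app
        timeout=10,
    )
    resp.raise_for_status()
    address = resp.json().get("address", {})
    return address.get("state") or address.get("country", "unknown")

print(reverse_geocode(42.3601, -71.0589))  # Boston → "Massachusetts"
time.sleep(1)  # Nominatim's policy: at most one request per second
```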

This is the result in OpenStreetMap.

Only a sample focused on the Iberian peninsula.

Conclusions

In the end, each part of the challenge was solved using just two #nocode tools, connected by a standard data exchange file. On top of that information, I also included access to an external API without a single line of code.

This same workflow can be extended with further branches of analysis, including AI capabilities for prediction or categorization. For example, local tweets from the days the photos were taken could be retrieved.

A small database like SQLite could also be included to manage the data better. The database connectors cover almost every possibility, whether plain SQL, Databricks, or databases hosted on Azure or AWS infrastructure.
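As a sketch of that idea, the extracted CSV could be loaded into a local SQLite file with Python’s standard library; the table and column names here are the same placeholders used in the earlier sketches.

```python
# Loading the extracted CSV into a local SQLite database for later querying.
import csv
import sqlite3

conn = sqlite3.connect("geoloc.db")
conn.execute("CREATE TABLE IF NOT EXISTS photos (file TEXT, latitude REAL, longitude REAL)")

with open("geoloc.csv", newline="") as f:
    rows = [(r["file"], float(r["latitude"]), float(r["longitude"]))
            for r in csv.DictReader(f, delimiter=";")]

conn.executemany("INSERT INTO photos VALUES (?, ?, ?)", rows)
conn.commit()
print(conn.execute("SELECT COUNT(*) FROM photos").fetchone()[0], "rows loaded")
```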

The automation alone already makes this type of tool worth using.

--

Diego Romero
Low Code for Data Science

Engineer and lover of the Camino. Passionate about Generative AI. Imagining emerging tech solutions. Always learning…