Maps with Docker

MapTiler · MapTiler Blog · 5 min read · Apr 18, 2018

Working with a huge amount of geographic data is not an easy task. By using Docker, you can easily host maps of the entire world in 10 minutes — or process petabytes of satellite and aerial data on a computer cluster.

Our Docker-based solution made it to the DockerCon conference, so you can learn a bit of our know-how and get tips and best practices for running maps on your own infrastructure.

Self-hosted world maps

When a company is considering using a map, the first option is usually one of the big free map-as-a-service providers. What most people consider free has, in fact, its cost: the visitors' personal data are shared with the map provider, and as soon as the map is used inside a product or on a popular website, there is a large bill to pay. The price increases significantly if the number of visitors jumps higher. Offline maps and world maps behind a firewall are impossible, and there are restrictions on asset tracking and on customizing the look & feel of the map. All of those limitations can be avoided with self-hosted maps.

The OpenMapTiles project provides you with world maps based on OpenStreetMap. This collaborative open-source mapping project includes all the important data for a base map: streets, houses, roads, land use, points of interest, and much more. On top of this, we also offer a digital elevation model and contour lines for building outdoor maps, and even satellite imagery of the whole world. You can add your own geodata and use one of the free, beautiful map styles to customize the look of the maps for your business and brand.

OpenMapTiles Server, which enables you to run all of this from your own infrastructure, uses Docker container technology. The hardware requirements are so low that you can run it even on your laptop. There is also no need to be connected to the internet, so the maps can be used offline or behind a firewall. The OpenMapTiles project is open-source, and the community-driven repository can be found on GitHub.

Petr Pridal and Martin Mikita from the MapTiler team presenting on hosting maps from your own infrastructure at DockerCon 2017

World map in 10 minutes using Docker

The OpenMapTiles Server is available on Docker Hub. If you already have Docker installed, you can launch the container from Kitematic or run this command:

docker run --rm -it -v $(pwd):/data -p 8080:80 klokantech/openmaptiles-server
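The command above stores the downloaded data in your current directory. If you would rather keep it in a named Docker volume so it persists independently of where you run the command, a minimal variation using standard Docker commands (the volume name is our own choice) looks like this:

docker volume create openmaptiles-data
docker run --rm -it -v openmaptiles-data:/data -p 8080:80 klokantech/openmaptiles-server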

Then visit http://localhost:8080/ in your browser and a short wizard will guide you through the setup: whether you want to serve the whole planet, a country, or a city; which of the prepared map styles you want to use, or whether you want to supply your own; the default language for visitors; and what kind of services you want to run (raster tiles, vector tiles, WMTS, WMS, or static maps).
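Once the wizard finishes, you can verify from the command line that the server is responding. This is a plain HTTP check and assumes nothing specific about OpenMapTiles Server:

curl -I http://localhost:8080/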

As said before, the hardware requirements are really low: pre-generated vector tiles of the whole world take up only about 50 GB, so the minimal requirement is set to a 60 GB hard drive and 8 GB RAM. If you want to serve raster tiles too, a slightly more powerful configuration is recommended, as rasterization happens on demand from the vector tiles. To also provide contour lines, hillshading, or a satellite layer, you will need more disk space. Serving pre-generated tiles has lower hardware requirements than serving maps on demand from a live Postgres database, and the maps are also much faster. The configuration is significantly easier as well, and setting up the server is a task for a person with basic IT knowledge.

A map of the entire world can run on a computer with a 60 GB hard drive and 8 GB RAM.

The solution based on OpenMapTiles Server is fully scalable, both horizontally and vertically, and the scaling can be done dynamically based on the workload. A scaled deployment needs just the OpenMapTiles Server and Memcached software plus an attached data volume, so you can simply multiply the machines. You can achieve the same result using Docker Swarm or Kubernetes.
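As an illustration, here is a minimal Docker Swarm sketch of that setup. The service names, replica count, and volume name are our own assumptions, and how OpenMapTiles Server is pointed at Memcached depends on its configuration, which is not spelled out here:

docker swarm init
docker service create --name memcached memcached
docker service create --name tileserver --replicas 4 \
  --mount type=volume,source=tiles,target=/data \
  -p 8080:80 klokantech/openmaptiles-server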

Short graphical wizard where you select which services you want to run

Processing geospatial Big Data on a Docker cluster

Our technique for processing big geodata on a cluster also uses Docker container technology.

We start by dividing the whole work into smaller jobs and sending them to separate machines. Each machine constantly reports metrics and logs to the master server. Once a job is marked as done, the finished work is sent to the output storage, the master server sends another job, and the whole process starts again. The input and output storage can be anything from your own server to cloud services like Amazon S3 or Google Cloud Storage.
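In shell-script terms, each worker runs a loop along these lines. This is a hypothetical sketch, not MapTiler's actual tooling: the master URL, its endpoints, and the process_job command are illustrative assumptions.

# hypothetical worker loop: fetch a job, process it, upload the result, repeat
while true; do
  job=$(curl -s http://master:9000/next-job)        # ask the master for work (hypothetical endpoint)
  [ -z "$job" ] && break                            # stop when no jobs remain
  process_job "$job" -o /tmp/out                    # placeholder for the real rendering step
  aws s3 cp /tmp/out "s3://output-bucket/$job"      # push finished work to the output storage
  curl -s -X POST "http://master:9000/done/$job"    # report completion (hypothetical endpoint)
done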

With the process described above, we were able to render the whole-world OpenStreetMap data with the OpenMapTiles project (126 million tiles) in one day using 32 machines, each equipped with 4 cores; the job would otherwise take about 128 days of CPU time. The same technique was used for rendering raster data with our MapTiler Cluster. By using a cluster of computers, we were able to convert 60 TB of USA aerial imagery, create OMT Satellite, and process petabytes of satellite data for clients.

If you are interested in applying this technology, contact us and we will be glad to help with deployment.

MapTiler Engine web GUI

Want to learn more?

If you want to learn more, watch the video of our presentation, which is available on the Docker page or on YouTube. And do not forget to check out the slides on SlideShare.
