Search Location with Golang and Elasticsearch Geo Location

Agung Yudha Berliantara
Ralali Tech Stories
6 min read · Mar 13, 2020

Hi everyone,

In this article, I'll share my research on how to build a simple search engine with Golang and the Elasticsearch geo location feature. This search engine is limited to finding the nearest or farthest malls from our location; as an example, I will use mall locations in Jogjakarta. We will create a simple API with no special setup, using the default settings of Golang, Gin, and Elasticsearch.

Golang and Elasticsearch Geo Location
In Golang, We trust!

Before we start our game, please note that the specifications that I use are:

  1. Golang v1.13.3
  2. Elasticsearch 6.8.4
  3. OpenJDK 1.8.0_232

Everything works fine on my machine without extras such as Docker. The settings may differ if you use other versions or a different setup.

Ok, let’s start…

I assume that you have installed all the requirements (points 1–3) above and understand the fundamentals of Golang and Elasticsearch that we will use. I will not discuss how to install Golang and Elasticsearch.

Data Mapping

First, we need data to store in Elasticsearch. For the mapping, we use trial_geo as the index name and stores as the type name. For more details, see the mapping below:

{
  "mappings": {
    "stores": {
      "properties": {
        "store_name": {
          "type": "keyword"
        },
        "location": {
          "type": "geo_point"
        }
      }
    }
  }
}
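As a side note, this mapping is applied when the index is created. In the Kibana console the request would look something like this (on Elasticsearch 6.x a single custom type such as stores is still allowed):

PUT trial_geo
{
  "mappings": {
    "stores": {
      "properties": {
        "store_name": { "type": "keyword" },
        "location": { "type": "geo_point" }
      }
    }
  }
}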

Please note that for the location field we use the geo_point type, which can store latitude/longitude data. Geo_point data can be stored in several formats; you can read about them in the Elasticsearch documentation. In this tutorial we save the geo_point data as a string. For example:

PUT my_index/_doc/2
{
  "text": "Geo-point as a string",
  "location": "41.12,-71.34"
}

Latitude and longitude in the location field are separated by a comma (lat,lon). The following example shows how the data is stored, with trial_geo as the index name and jcm as the ID.

Geo location data stored in Elasticsearch
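For reference, the document in the screenshot can be recreated with a console request like this, using the values from the sample data set (on Elasticsearch 6.x the stores type goes in the URL):

PUT trial_geo/stores/jcm
{
  "store_name": "Jogja City Mall",
  "location": "-7.753336,110.3598087"
}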

I have prepared some data for example and for your experiment.

List of malls in Jogjakarta

After all the data has been prepared in Elasticsearch, it will look roughly like this:

Example data stored in Elasticsearch

Now that the environment and data have been prepared, it is time to implement the API in Golang. The next step is to prepare the packages we will use:

  1. go-elasticsearch https://github.com/elastic/go-elasticsearch, needed to connect Go with Elasticsearch,
  2. Gin https://github.com/gin-gonic/gin, an HTTP web framework written in Go (Golang),
  3. Refresh https://github.com/markbates/refresh, so the project rebuilds automatically after we save our work, without having to run go build or go run main.go repeatedly.

Use go get to get all the packages.
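For example (with Go modules you may need a version suffix such as /v6 on go-elasticsearch to match Elasticsearch 6.x):

go get github.com/elastic/go-elasticsearch
go get github.com/gin-gonic/gin
go get github.com/markbates/refresh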

Create a Golang project with the directory and file structure below:

elasticsearch-geolocation
├── main.go
├── Handlers
│ ├── HomeHandler.go
│ └── SearchHandler.go
└── refresh.yml

Let me explain main.go first. This file handles all requests from users and turns the responses from Elasticsearch into the API output.

main.go

main.go

There is nothing special in this source file. We only need the 2 packages required to build this search engine: github.com/elastic/go-elasticsearch, which lets our application talk to Elasticsearch, and github.com/gin-gonic/gin as the HTTP framework. In lines 15–16 we use the default settings from both packages, and then in lines 19–59 we create the routes under a group named /api.

For the home route of the API, we call HomeHandler.go (lines 21–23) and pass the variables es and ctx as arguments to HomeHandler(), which we will discuss later. In lines 26–58 we read and validate the URL query parameters, returning an error response if the latlon parameter is not filled in, and then call Handlers.SearchHandler(order, unit, limit, latLon, es, ctx) in SearchHandler.go to run the search based on the given latitude and longitude.
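The main.go gist is not reproduced in this text version, so here is a minimal sketch of what it can look like. The module path elasticsearch-geolocation is an assumption, and the routes follow the description above rather than the exact original code:

package main

import (
    "log"
    "net/http"

    "elasticsearch-geolocation/Handlers" // hypothetical module path; match it to your go.mod

    elasticsearch "github.com/elastic/go-elasticsearch" // with Go modules you may need the /v6 suffix
    "github.com/gin-gonic/gin"
)

func main() {
    // Default client: talks to http://localhost:9200 unless ELASTICSEARCH_URL says otherwise.
    es, err := elasticsearch.NewDefaultClient()
    if err != nil {
        log.Fatalf("error creating the Elasticsearch client: %s", err)
    }

    router := gin.Default()
    api := router.Group("/api")

    // Home route: returns the Elasticsearch version via HomeHandler.
    api.GET("/", func(ctx *gin.Context) {
        Handlers.HomeHandler(es, ctx)
    })

    // Search route, e.g. /api/search?latlon=-7.75,110.36&order=asc&unit=km&limit=5
    api.GET("/search", func(ctx *gin.Context) {
        latLon := ctx.Query("latlon")
        if latLon == "" {
            ctx.AbortWithStatusJSON(http.StatusBadRequest, gin.H{
                "error": "query parameter latlon is required, e.g. latlon=-7.75,110.36",
            })
            return
        }
        order := ctx.DefaultQuery("order", "asc")
        unit := ctx.DefaultQuery("unit", "km")
        limit := ctx.DefaultQuery("limit", "10")

        Handlers.SearchHandler(order, unit, limit, latLon, es, ctx)
    })

    log.Fatal(router.Run(":8080"))
}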

We move to the Handlers directory…

In the Handlers directory there are 2 files: HomeHandler.go, the index of our API, which only returns the Elasticsearch version, and SearchHandler.go, which contains the logic that interacts with Elasticsearch.

HomeHandler.go

HomeHandler.go

In HomeHandler.go we import github.com/elastic/go-elasticsearch as the search engine library and github.com/gin-gonic/gin as the HTTP framework. The parameters needed by HomeHandler() are es *elasticsearch.Client and context *gin.Context. es carries the client configuration from main.go and is used to call the Elasticsearch functions.

In line 15 we call es.Info(), which retrieves information from the Elasticsearch engine, and in line 20 we decode the response from es.Info() and store it in the variable r.
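The HomeHandler.go gist is also only embedded on Medium, so here is a minimal sketch along those lines. The shape of the JSON returned at the end is my own choice, not necessarily the original:

package Handlers

import (
    "encoding/json"
    "net/http"

    elasticsearch "github.com/elastic/go-elasticsearch"
    "github.com/gin-gonic/gin"
)

// HomeHandler asks Elasticsearch for its cluster info and returns the version number.
func HomeHandler(es *elasticsearch.Client, context *gin.Context) {
    res, err := es.Info() // line 15 in the original gist
    if err != nil {
        context.AbortWithStatusJSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
        return
    }
    defer res.Body.Close()

    // Decode the Info response into a generic map (line 20 in the original gist).
    var r map[string]interface{}
    if err := json.NewDecoder(res.Body).Decode(&r); err != nil {
        context.AbortWithStatusJSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
        return
    }

    // r["version"]["number"] holds the Elasticsearch server version, e.g. "6.8.4".
    context.JSON(http.StatusOK, gin.H{
        "elasticsearch_version": r["version"].(map[string]interface{})["number"],
    })
}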

SearchHandler.go

And this is our playground. Let's rock!!!

SearchHandler.go

We need to import github.com/elastic/go-elasticsearch, which provides the functions for working with data in Elasticsearch. We define a function SearchHandler(order, unit, limit, latlon string, es *elasticsearch.Client, context *gin.Context). The function has several parameters: order defines the sort direction, asc or desc; unit defines the distance unit, such as km (kilometers), m (meters), or miles; limit limits the number of results displayed; and latlon holds the latitude and longitude. All of these are strings, and only limit is converted to an integer.

In line 21 we split latlon, which is really 2 values separated by a comma. For example, 41.12,-71.34 is split into 2 array indexes: index 0 is the latitude point (41.12) and index 1 is the longitude point (-71.34). The sort variable is then converted into JSON, producing a query like the following:

{
  "sort": {
    "_geo_distance": {
      "location": {
        "lat": splitLatLon[0],
        "lon": splitLatLon[1]
      },
      "order": __order__,
      "unit": __unit__
    }
  }
}

To search by geo location we need _geo_distance, filling the lat field with our latitude point and the lon field with our longitude point. In this case we sort on location, the field we mapped as a geo_point earlier. You can scroll up to review the data mapping.

We pass the point as an object with lat and lon fields, which makes it easy to see which value is the latitude and which is the longitude. If you prefer a string, geohash, or array format instead, see the Elasticsearch guide: https://www.elastic.co/guide/en/elasticsearch/reference/current/geo-point.html.

In the order field we define the sort direction, asc or desc, and in unit we define the distance unit: m (meters), km (kilometers), or miles. Both are optional.
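The SearchHandler.go gist itself is not shown here, so below is a rough sketch of how that body can be built in Go before it is sent to Elasticsearch, assuming bytes, encoding/json, and strings are imported. The helper name buildGeoSortQuery is illustrative, not from the original code:

// buildGeoSortQuery builds the search body with a _geo_distance sort,
// mirroring the JSON shown above. latlon is a "lat,lon" string.
func buildGeoSortQuery(latlon, order, unit string) (bytes.Buffer, error) {
    // Index 0 = latitude, index 1 = longitude.
    splitLatLon := strings.Split(latlon, ",")

    query := map[string]interface{}{
        "sort": map[string]interface{}{
            "_geo_distance": map[string]interface{}{
                "location": map[string]interface{}{
                    // Elasticsearch parses numeric strings for lat/lon,
                    // so the split values can be passed through as-is.
                    "lat": splitLatLon[0],
                    "lon": splitLatLon[1],
                },
                "order": order, // asc or desc
                "unit":  unit,  // m, km, miles, ...
            },
        },
    }

    // Encode the map into a buffer that es.Search can read as the request body.
    var buf bytes.Buffer
    err := json.NewEncoder(&buf).Encode(query)
    return buf, err
}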

We have now built the search query and converted it into JSON. To run the search we call es.Search(), where we can pass several arguments to shape the request.

search, err := es.Search(
    es.Search.WithSize(limitInt),
    es.Search.WithIndex("trial_geo"),
    es.Search.WithBody(&buf),
    es.Search.WithPretty(),
)

The arguments are easy to understand because each name describes its purpose. es.Search.WithSize(limitInt) limits the number of results to display, and es.Search.WithIndex("trial_geo") points the search at the trial_geo index.

Before returning the data to the client, we decode it so we can process it. This is optional, but the raw response from Elasticsearch is quite long, so we only keep the parts we need. The raw response looks like this:

{
  "took": 10,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "skipped": 0,
    "failed": 0
  },
  "hits": {
    "total": 7,
    "max_score": null,
    "hits": [
      {
        "_index": "trial_geo",
        "_type": "stores",
        "_id": "jcm",
        "_score": null,
        "_source": {
          "store_name": "Jogja City Mall",
          "location": "-7.753336,110.3598087"
        },
        "sort": [
          5.49279121597844
        ]
      },
      ......... // more data
    ]
  }
}

Because the data we need is only inside the nested hits field, we can extract it as in line 81.
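Line 81 lives in the gist rather than in this text, so here is a sketch of what that extraction can look like, continuing from the search response returned by es.Search() above. The "data" key in the output is my own naming, not necessarily the original:

defer search.Body.Close()

// Decode the raw response into a generic map.
var r map[string]interface{}
if err := json.NewDecoder(search.Body).Decode(&r); err != nil {
    context.AbortWithStatusJSON(http.StatusInternalServerError, gin.H{"error": err.Error()})
    return
}

// Keep only hits.hits: each entry carries _source (store_name, location)
// plus the computed sort distance in the chosen unit.
hits := r["hits"].(map[string]interface{})["hits"]

context.JSON(http.StatusOK, gin.H{"data": hits})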

refresh.yml

OK, our game is almost over. Let me explain refresh.yml. This file configures the automatic rebuild that runs whenever we save a file, so that we don't need to run go run main.go or go build repeatedly.

refresh.yml
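The refresh.yml gist is not reproduced here either. Refresh can generate this file for you with refresh init, and the generated config looks roughly like this; binary_name and the ignored folders below are just illustrative defaults, adjust them as needed:

app_root: .
ignored_folders:
- vendor
- log
- tmp
included_extensions:
- .go
build_path: tmp
build_delay: 200ns
binary_name: elasticsearch-geolocation-build
command_flags: []
command_env: []
enable_colors: true
log_name: refresh

With this file in place, running refresh run in the project root watches the .go files and rebuilds the binary every time we save.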

Almost done. There is one thing we haven't covered yet: how to actually run it.

If you already know how to run this application, that’s great. But if you want to find out more I will explain in my next article. Can’t wait? See you~
