Running Elasticsearch on Docker

Burak KILINC
Nov 2, 2022

Elasticsearch is the distributed search and analytics engine at the heart of the Elastic Stack. Logstash and Beats facilitate collecting, aggregating, and enriching your data and storing it in Elasticsearch. Kibana enables you to interactively explore, visualize, and share insights into your data and manage and monitor the stack. Elasticsearch is where the indexing, search, and analysis magic happens.

Elasticsearch provides near real-time search and analytics for all types of data. Whether you have structured or unstructured text, numerical data, or geospatial data, Elasticsearch can efficiently store and index it in a way that supports fast searches. You can go far beyond simple data retrieval and aggregate information to discover trends and patterns in your data. And as your data and query volume grow, the distributed nature of Elasticsearch enables your deployment to grow seamlessly right along with it.

While not every problem is a search problem, Elasticsearch offers speed and flexibility to handle data in a wide variety of use cases:

- Add a search box to an app or website
- Store and analyze logs, metrics, and security event data
- Use machine learning to automatically model the behavior of your data in real time
- Automate business workflows using Elasticsearch as a storage engine
- Manage, integrate, and analyze spatial information using Elasticsearch as a geographic information system (GIS)
- Store and process genetic data using Elasticsearch as a bioinformatics research tool

We're continually amazed by the novel ways people use search. But whether your use case is similar to one of these, or you're using Elasticsearch to tackle a new problem, the way you work with your data, documents, and indices in Elasticsearch is the same.

You can follow along in the official Elasticsearch documentation.

Pulling the image
Obtaining Elasticsearch for Docker is as simple as issuing a docker pull command against the Elastic Docker registry.

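For example, the command below pulls the image from the Elastic registry. The version tag is only illustrative here; substitute whichever release you want to run:

    docker pull docker.elastic.co/elasticsearch/elasticsearch:7.17.7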
1. Starting a single-node cluster with Docker

To start a single-node Elasticsearch cluster for development or testing, specify single-node discovery to bypass the bootstrap checks:
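A minimal sketch of that command, assuming the same 7.17.7 tag pulled above; the published ports and the discovery.type=single-node setting are the important parts:

    docker run -p 9200:9200 -p 9300:9300 -e "discovery.type=single-node" docker.elastic.co/elasticsearch/elasticsearch:7.17.7

Port 9200 is the HTTP API and port 9300 is the transport port used for node-to-node communication.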

2. Starting a multi-node cluster with Docker Compose

To get a three-node Elasticsearch cluster up and running in Docker, you can use Docker Compose:

Create a docker-compose.yml file:

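Below is a sketch of such a file, modeled on the three-node example in the Elasticsearch docs. The node names, cluster name, heap sizes, and version tag are illustrative and can be adjusted to your setup:

    version: "2.2"
    services:
      es01:
        image: docker.elastic.co/elasticsearch/elasticsearch:7.17.7
        environment:
          - node.name=es01
          - cluster.name=es-docker-cluster
          - discovery.seed_hosts=es02,es03
          - cluster.initial_master_nodes=es01,es02,es03
          - bootstrap.memory_lock=true
          - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
        ulimits:
          memlock:
            soft: -1
            hard: -1
        volumes:
          - data01:/usr/share/elasticsearch/data
        ports:
          - "9200:9200"   # only es01 exposes the HTTP API to the host
        networks:
          - elastic
      es02:
        image: docker.elastic.co/elasticsearch/elasticsearch:7.17.7
        environment:
          - node.name=es02
          - cluster.name=es-docker-cluster
          - discovery.seed_hosts=es01,es03
          - cluster.initial_master_nodes=es01,es02,es03
          - bootstrap.memory_lock=true
          - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
        ulimits:
          memlock:
            soft: -1
            hard: -1
        volumes:
          - data02:/usr/share/elasticsearch/data
        networks:
          - elastic
      es03:
        image: docker.elastic.co/elasticsearch/elasticsearch:7.17.7
        environment:
          - node.name=es03
          - cluster.name=es-docker-cluster
          - discovery.seed_hosts=es01,es02
          - cluster.initial_master_nodes=es01,es02,es03
          - bootstrap.memory_lock=true
          - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
        ulimits:
          memlock:
            soft: -1
            hard: -1
        volumes:
          - data03:/usr/share/elasticsearch/data
        networks:
          - elastic
    volumes:
      data01:
      data02:
      data03:
    networks:
      elastic:
        driver: bridge

Then start the cluster from the directory that contains the file with docker-compose up (add -d to run it in the background), and stop it again with docker-compose down.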

Once Elasticsearch is up and running, you can verify it with the request below in Postman or Insomnia.

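For instance, a simple GET against the root endpoint confirms the node is responding (shown here with curl, but the same URL works in Postman or Insomnia; localhost:9200 assumes the port mapping from the examples above):

    curl -X GET "http://localhost:9200/"

    # For the three-node Compose setup, this lists the nodes that joined the cluster:
    curl -X GET "http://localhost:9200/_cat/nodes?v"

A healthy node answers the first request with a small JSON document containing the node name, cluster name, version details, and the "You Know, for Search" tagline.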

Thanks for reading…
