The Elastic stack, or ELK for short, is a collection of tools that will turn you into a data wizard in no time! Whether you need to store, search, or analyze your data, ELK has you covered.
First up, we have Elasticsearch. This bad boy is a distributed search and analytics engine based on Apache Lucene, and it'll let you index, search, and analyze large volumes of data like a boss.
Next, we have Logstash. This tool is a data collection and processing pipeline that can ingest data from a variety of sources, transform it, and send it to Elasticsearch (or wherever you'd like). It's like a data vacuum cleaner, sucking up all your data and making it nice and tidy. Logstash has a variety of input plugins that allow it to collect data from different sources. In this tutorial, we are using the tcp and http input plugins.
Last but not least, we have Kibana. This visualization tool lets you explore and analyze data stored in Elasticsearch. Think of it as a crystal ball for your data - just ask it a question and Kibana will provide the answer.
So, why would you want to use ELK, you might ask? Well, it can help you solve a variety of data problems, such as centralizing and storing log data, analyzing and visualizing data in real-time, searching through large volumes of data, and monitoring and alerting on specific patterns or trends.
In this tutorial, we'll be setting up ELK using Docker and Docker Compose. Docker is a containerization platform that makes it easy to package, deploy, and run applications, and Docker Compose is a tool for defining and running multi-container Docker applications.
Just follow these simple steps to get your own ELK stack up and running:
1. Make sure you have Docker and Docker Compose installed on your system.
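Not sure whether they're already installed? A quick version check from the terminal will tell you (any reasonably recent versions will do, as long as Compose understands version '3.7' files):

```bash
# Both commands should print a version string; if either fails, install that tool first.
docker --version
docker-compose --version
```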
2. Create a new directory for your ELK stack and navigate to it.
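For example (elk-stack is just a placeholder name, call the directory whatever you like):

```bash
mkdir elk-stack
cd elk-stack
```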
3. Create a file named docker-compose.yml in your ELK directory and copy the configuration below into it.
```yaml
version: '3.7'

services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.8
    ports:
      - '9200:9200'
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
      discovery.type: single-node
      xpack.license.self_generated.type: basic
      xpack.security.enabled: "true"
      ELASTIC_PASSWORD: changeme
    ulimits:
      memlock:
        soft: -1
        hard: -1
    networks:
      - elk

  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.8
    ports:
      - '5601:5601'
    environment:
      SERVER_NAME: kibana
      ELASTICSEARCH_HOSTS: http://elasticsearch:9200
      ELASTICSEARCH_USERNAME: elastic
      ELASTICSEARCH_PASSWORD: changeme
    networks:
      - elk
    depends_on:
      - elasticsearch

  logstash:
    image: docker.elastic.co/logstash/logstash:7.17.8
    ports:
      - '5000:5000'
      - '8092:8091'
    environment:
      XPACK_MONITORING_ELASTICSEARCH_HOSTS: http://elasticsearch:9200
      XPACK_MONITORING_ENABLED: "true"
      XPACK_MONITORING_ELASTICSEARCH_USERNAME: elastic
      XPACK_MONITORING_ELASTICSEARCH_PASSWORD: changeme
    volumes:
      - type: bind
        source: ./logstash_pipeline/
        target: /usr/share/logstash/pipeline
        read_only: true
    networks:
      - elk
    depends_on:
      - elasticsearch

networks:
  elk:
    driver: bridge
```
4. Replace the password changeme with your desired password for the Elasticsearch elastic user in the elasticsearch, kibana, and logstash service sections of the configuration file. You'll use the same password again in logstash.conf in the next step.
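If you prefer the command line over a text editor, a one-liner like this works for the Compose file (MyNewPassword is just a placeholder; on macOS, use sed -i '' instead of sed -i):

```bash
# Swap every occurrence of the default password in the Compose file.
sed -i 's/changeme/MyNewPassword/g' docker-compose.yml
```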
5. Create a new directory named logstash_pipeline and create a file named logstash.conf inside it. Copy the Logstash configuration below into logstash.conf.
```conf
input {
  # Plain TCP input - line-oriented events on port 5000
  tcp {
    port => 5000
  }
  # HTTP input - accepts events via POST on port 8091 (published as 8092 on the host)
  http {
    port => 8091
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "first-logstash"
    ssl_certificate_verification => false
    user => "elastic"
    password => "changeme"
  }
}
```
6. Run the command below to start the ELK stack.
```bash
docker-compose up -d
```
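Give the containers a minute or two to start. Then you can sanity-check the stack from the terminal (swap changeme for the password you set in step 4):

```bash
# All three services should show up as running
docker-compose ps

# Elasticsearch should answer with its cluster info as JSON
curl -u elastic:changeme http://localhost:9200
```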
7. To access Kibana, open a web browser and navigate to http://localhost:5601. You will be prompted to enter the username and password for the Elasticsearch elastic user. Use the password you set in step 4.
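As a final smoke test, you can push a couple of events through the two Logstash inputs from the pipeline above and check that they land in the first-logstash index. The HTTP input listens on port 8091 inside the container, which the Compose file publishes on host port 8092, and depending on your netcat flavor you may need the -q 0 or -N flag so it closes the connection after sending:

```bash
# One event through the tcp input (host port 5000)
echo 'hello from the tcp input' | nc localhost 5000

# One event through the http input (host port 8092 -> container port 8091)
curl -X POST http://localhost:8092 \
  -H 'Content-Type: application/json' \
  -d '{"message": "hello from the http input"}'

# The documents should now show up in the first-logstash index (use your own password)
curl -u elastic:changeme 'http://localhost:9200/first-logstash/_search?pretty'
```

You can also create an index pattern for first-logstash in Kibana and browse the events in Discover.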
And that's it! You now have a shiny new ELK stack, ready to help you make sense of your data. Go forth and conquer!