Logging plays a very important role in software development. It helps with monitoring, troubleshooting, debugging, event tracing, request tracing, and security, and its output can also be used by Business Intelligence (BI) for reporting purposes.
Logging has numerous benefits, but in this blog post we will focus on building a real-world example: storing application logs in Elasticsearch from a Node.js app using the winston npm package, and visualizing them with Kibana. Additionally, we will Dockerize the entire application.
Prerequisites
The following tools need to be installed on your machine:
- Docker (https://www.docker.com/products/docker-desktop/)
- Node (https://nodejs.org/en/download/prebuilt-installer)
- npm (Node Package Manager): Included with Node.js installation
Create a Node.js App and install required packages
Node.js is an asynchronous, event-driven JavaScript runtime designed to build scalable network applications.
You may already know how to create a Node.js app, but I will guide you through bootstrapping a basic one.
Here is a screenshot of the project structure that I have set up locally:
Here are the steps to create a Node.js app.
Create a new folder and run the command below to initialize the Node.js app.
```sh
npm init
```
Running npm init will generate a package.json file in your Node.js project folder.
During initialization, npm asks you a number of questions to collect information about your project. The questions and their default answers are listed below:
- Package name: You can add your own project name instead of the default suggestion.
- Description: You can give a brief summary of your project.
- Entry point: The default entry point is index.js, but you can change it to app.js. This is the main file that will be run when your project is launched.
- Git repository: You can leave the default or use the URL of the repository.
- Author: You can enter the name of the project's author.
Once all of the above steps are completed, you will see a package.json file in your project folder. If you run npm install, it will also generate the package-lock.json file.
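For reference, the generated package.json will look roughly like this (a minimal sketch; the name, description, and author will reflect the answers you gave):

```json
{
  "name": "implementing-logging-using-nodejs-elasticsearch-kibana-and-docker",
  "version": "1.0.0",
  "description": "Store Node.js application logs in Elasticsearch and visualize them in Kibana",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "author": "",
  "license": "ISC"
}
```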
Create an index.js file in the root of the project and add the code below to the file.
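A minimal sketch, assuming a plain Node.js http server (the route handling and greeting text are placeholders):

```js
// index.js — a minimal HTTP server listening on port 8000
const http = require('http');

const PORT = 8000;

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from the Node.js logging demo!');
});

server.listen(PORT, () => {
  console.log(`App listening on port ${PORT}`);
});
```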
Now you can browse the app at http://localhost:8000/. Later on, the port number will be different once we Dockerize the app.
Dockerize the App
Docker is an open-source tool that combines your application with all the necessary dependencies and libraries into one portable package (a Docker image). That package can be shared with anyone and run by anyone without worrying much about the operating system. In this post we are using Docker to run the Node app, Elasticsearch, and Kibana together.
In this post, I am not going to explain how Docker works. I assume that you are familiar with Docker and will jump directly into the implementation. To Dockerize the application, the first step is to create a Dockerfile with the definition below. I am placing all the files in the root of the project.
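A minimal sketch of the Dockerfile (the base image tag is an assumption; any recent Node.js image will work):

```dockerfile
# Use a small official Node.js base image
FROM node:18-alpine

WORKDIR /usr/src/app

# Install dependencies first so Docker can cache this layer
COPY package*.json ./
RUN npm install

# Copy the rest of the application source
COPY . .

# The app listens on port 8000 inside the container
EXPOSE 8000

CMD ["node", "index.js"]
```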
The second step is to create a docker-compose.yml file with the definition below.
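A sketch of the initial compose file with just the Node app service (the service and container names are assumptions; the port mapping follows the post):

```yaml
version: "3.8"

services:
  app:
    build: .
    container_name: node-app
    ports:
      - "3002:8000" # host port 3002 -> container port 8000
```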
Now you can run the command below to start the app, or you can run it later once everything is ready.
```sh
docker compose up
```
The above command will take a few minutes to build the Docker image and run the container. Once the Docker container is up and running, the app can be browsed at port 3002 (http://localhost:3002/), because the internal Docker port 8000 is bound to port 3002 on the host machine.
Setting up Elasticsearch
Elasticsearch is a distributed, RESTful search and analytics engine that can be used for many purposes. It centrally stores all your data for lightning-fast search, fine-tuned relevancy, and powerful analytics that scale with ease. In this post we are using Elasticsearch to store the application's logs.
Add the configuration below to the docker-compose.yml file so that when you run the docker compose up command, it pulls the image from Docker Hub and builds the Elasticsearch container on your machine, within the same network where the Node app container is running.
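A sketch of the Elasticsearch service, to be added under the existing services: key (the image version is an assumption; single-node mode keeps the demo simple):

```yaml
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.9
    container_name: elasticsearch
    environment:
      - discovery.type=single-node # no cluster needed for a local demo
    ports:
      - "9300:9200" # host port 9300 -> container port 9200
```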
The Elasticsearch container will be running at port 9300 of the host machine. You can browse it at http://localhost:9300/.
Setting up Kibana
Kibana lets you run data analytics at speed and scale for observability, security, and search. It offers powerful analysis on any data from any source, from threat intelligence to search analytics, logs to application monitoring, and much more. In this post we are using it to visualize and search the application's logs.
Add the following configuration to the docker-compose.yml file to set up Kibana in the project, so that when you run the docker compose up command, it pulls the image from Docker Hub and builds the Kibana container on your machine, within the same network where the Node app and Elasticsearch containers are running.
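A sketch of the Kibana service, also under the services: key (the image version is an assumption; ELASTICSEARCH_HOSTS points at the Elasticsearch service over the internal Docker network):

```yaml
  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.9
    container_name: kibana
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5701:5601" # host port 5701 -> container port 5601
    depends_on:
      - elasticsearch
```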
The Kibana container will be running at port 5701 of the host machine. You can browse it at http://localhost:5701/.
Once everything is set up, your full docker-compose.yml file will look like this:
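Combining the pieces above, a full sketch (service names, image versions, and environment values are the same assumptions as before):

```yaml
version: "3.8"

services:
  app:
    build: .
    container_name: node-app
    ports:
      - "3002:8000"
    depends_on:
      - elasticsearch

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.17.9
    container_name: elasticsearch
    environment:
      - discovery.type=single-node
    ports:
      - "9300:9200"

  kibana:
    image: docker.elastic.co/kibana/kibana:7.17.9
    container_name: kibana
    environment:
      - ELASTICSEARCH_HOSTS=http://elasticsearch:9200
    ports:
      - "5701:5601"
    depends_on:
      - elasticsearch
```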
Now we have the configuration for the Node app, Kibana, and Elasticsearch in the docker-compose.yml file. The next step is to connect the Node app to Elasticsearch to store the logs so that we can view and monitor them from Kibana. For this, we need to install the following Node packages in the app and add some logic to send the logs to Elasticsearch.
winston is designed to be a simple and universal logging library with support for multiple transports. A transport is essentially a storage device for your logs. Each winston logger can have multiple transports.
winston-elasticsearch is an Elasticsearch transport for the winston logging toolkit.
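Both packages can be installed from the project root with npm:

```sh
npm install winston winston-elasticsearch
```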
After that, you need to create a new logger object. You can create the logger object in the index.js file, or you can create a new file and import it into index.js. I am creating a new logger.js file with the configuration below.
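A minimal sketch of logger.js (the index name app-logs is an assumption; the node URL must match the Elasticsearch service name in docker-compose.yml):

```js
// logger.js — winston logger with an Elasticsearch transport
const winston = require('winston');
const { ElasticsearchTransport } = require('winston-elasticsearch');

// Transport that ships each log entry to Elasticsearch
const esTransport = new ElasticsearchTransport({
  level: 'info',
  index: 'app-logs', // index where the logs will be stored
  clientOpts: {
    // Internal Docker network URL (the compose service name), not the host URL
    node: 'http://elasticsearch:9200',
  },
});

const logger = winston.createLogger({
  transports: [
    new winston.transports.Console(), // also print logs to stdout
    esTransport,
  ],
});

module.exports = logger;
```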
You need to keep a few things in mind while creating the logger object:
- index: the index name under which all the logs sent from the Node app will be stored.
- node: the Elasticsearch container URL. For this project, the URL is defined in the docker-compose.yml file. Make sure you define the internal URL (the compose service name), not the external one.
You need to import the logger.js file into the index.js file so that the Node app can send logs to Elasticsearch.
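An updated sketch of index.js that logs each incoming request (the message text is an assumption):

```js
// index.js — HTTP server that ships a log entry for every request
const http = require('http');
const logger = require('./logger');

const PORT = 8000;

const server = http.createServer((req, res) => {
  // winston forwards this entry to both the console and Elasticsearch
  logger.info(`Request received: ${req.method} ${req.url}`);
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('Hello from the Node.js logging demo!');
});

server.listen(PORT, () => {
  logger.info(`App listening on port ${PORT}`);
});
```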
Now, finally, you can run the command below to start all the containers.
```sh
docker compose up
```
The above command will take a few minutes to build the Docker images and run the containers. Once the containers are up and running, the following endpoints should be accessible via the browser:
- Node App: http://localhost:3002/
- Elasticsearch: http://localhost:9300/
- Kibana: http://localhost:5701/
Visualize the logs in Kibana
You can do many things in Kibana, but here we will only be visualizing the logs sent from the Node app.
Open the Kibana URL http://localhost:5701/ and go to Discover in the left menu. You may first need to create an index pattern (data view) matching the log index, e.g. app-logs*. Here you can visualize the logs that were sent from the Node app and stored in Elasticsearch.
In the above image, we can see the log messages sent from the Node.js app highlighted inside the red circle.
In this post, I have sent logs directly from the Node application to Elasticsearch. In the real world, however, we can ingest logs from a Node.js application and deliver them securely into Elasticsearch using Filebeat.
This blog post provides a practical guide to integrating Elasticsearch and Kibana with a Node.js app using the winston library for effective logging and monitoring.
It covers setting up the development environment, creating a Node.js app, Dockerizing it, and integrating Elasticsearch and Kibana using docker-compose.
The post serves as a concise resource for developers aiming to implement robust logging solutions in their Node.js applications.
You can access the full working example at the GitHub repository: https://github.com/dev-scripts/implementing-logging-using-nodejs-elasticsearch-kibana-and-docker