ELK logger and alert system in Node.js

15 June 2021

We can ensure the performance and availability of an application by analyzing the data it generates. This data, whether event logs or metrics, enables us to monitor the application and to identify and resolve issues, and this is where centralized log management and analytics solutions such as the ELK Stack come into the picture. The ELK Stack comprises three open-source products: Elasticsearch, Logstash, and Kibana. Logstash collects logging data from different input sources, transforms it, and sends it to a stash such as Elasticsearch. Kibana is a web interface that reads the logging data from Elasticsearch and visualizes it.

Elasticsearch: At its core, Elasticsearch is a NoSQL database specialized in search, so it acts as the main storage for log events, making them easy to search and retrieve later on. Elasticsearch provides near real-time search and analytics for different types of data: it can efficiently store and index structured, unstructured, numerical or geospatial data in a way that supports fast queries. The steps to set up and run Elasticsearch are:

  • Download the latest version of Elasticsearch.
  • Run elasticsearch.bat using the command prompt. Elasticsearch can then be accessed at localhost:9200.
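
To quickly check from Node.js that Elasticsearch is up, a minimal sketch using only the built-in http module (assuming the default localhost:9200 endpoint from the step above) could look like this:

    const http = require('http');

    // The Elasticsearch root endpoint returns basic cluster info as JSON
    http.get('http://localhost:9200', (res) => {
      let body = '';
      res.on('data', (chunk) => (body += chunk));
      res.on('end', () => console.log('Elasticsearch responded:', body));
    }).on('error', (err) => console.error('Elasticsearch is not reachable:', err.message));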

Logstash: Logstash is a log aggregator that collects data from various input sources, applies different transformations, and ships the data to a variety of supported output destinations. The steps to set up and run Logstash are:

  • Download and unzip Logstash
  • Prepare a logstash.conf config file. Logstash configuration files use their own JSON-like syntax and, in this setup, reside in ./logstash-7.9.3/bin. A Logstash configuration involves three sections: inputs, filters, and outputs. Let’s create a configuration file called logstash.conf and set up a TCP input:
    input {
      tcp {
        # Listen for incoming log events on TCP port 5000
        port => 5000
        type => "document_type"
      }
    }

    Now add filters for incoming messages. Logstash provides a large number of filter plugins to transform the logs. The most commonly used filter plugins are:

    grok – It gives structure to unstructured logs by matching them against patterns.

    filter {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} \[%{DATA:class}\]:%{GREEDYDATA:message}" }
      }
    }
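
    For example, a hypothetical log line like the following would be split into timestamp, log-level, class and message fields:

      2021-06-15T10:23:45,231 ERROR [PaymentService]:Connection refused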

    mutate – We can use this filter to change fields, as well as join and rename them.

    filter {
      mutate {
        lowercase => [ "log-level" ]
      }
    }

    date – It can be used to pull a time and date from a log message and define it as the timestamp field (@timestamp) for the log.

    filter {
      date {
        match => [ "logdate", "MM-dd-yyyy HH:mm:ss" ]
      }
    }
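
    With this pattern, a hypothetical logdate value such as 06-15-2021 10:23:45 would be parsed and written to @timestamp.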

    json – The Logstash json filter plugin parses JSON embedded in an incoming message and preserves that structure within the event.

    filter {
      json {
        source => "message"
        target => "log"
      }
    }
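
    For instance, a message field containing the hypothetical JSON string {"level":"info","user":"42"} would be parsed into a log object with log.level and log.user fields.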

    kv – The Logstash kv filter plugin automatically parses key-value pairs in messages or in specific fields.

    filter {
      kv {
        source => "metadata"
        # Strip surrounding double quotes from parsed values
        trim_value => "\""
        include_keys => [ "level", "service" ]
        target => "kv"
      }
    }
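
    For example, a hypothetical metadata field such as level=info service="api" user="42" would produce kv.level and kv.service; user is ignored because it is not listed in include_keys.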

    Output is the last stage in the Logstash pipeline; it sends the filtered data to the specified destination. Logstash supports multiple output plugins that can stash the filtered logs in different stores, such as a file, Elasticsearch, stdout, AWS CloudWatch, etc. An Elasticsearch output can be configured as:

    output {
      elasticsearch {
        hosts => ["127.0.0.1:9200"]
        # Elasticsearch index names must be lowercase
        index => "elk_test_logs"
      }
    }
  • Run bin/logstash -f logstash.conf

Now Logstash is configured to listen for input on port 5000; whenever it receives input, it filters the events and indexes them into Elasticsearch. To transfer event logs from Node.js to Logstash, we can use logstash-client, a general-purpose logging library with multiple transports, or one of the Logstash transports available for winston, bunyan, etc.; a bare-bones alternative is sketched below.
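
As a minimal, dependency-free sketch of that idea (assuming the TCP input on port 5000 configured above; shipLog is a hypothetical helper name), events can be written to Logstash as newline-delimited JSON over a plain socket:

    const net = require('net');

    // Connect to the Logstash TCP input configured earlier
    const socket = net.createConnection({ host: '127.0.0.1', port: 5000 });

    // Hypothetical helper: ship one event as a newline-delimited JSON line
    function shipLog(level, message) {
      socket.write(JSON.stringify({
        '@timestamp': new Date().toISOString(),
        level,
        message,
      }) + '\n');
    }

    shipLog('info', 'application started');

A production setup would also need reconnection and backpressure handling, which is exactly what the winston and bunyan transports provide.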

Kibana: Kibana is an open-source visualization and exploration tool that works on top of Elasticsearch, giving users the ability to analyze and visualize the data. The steps to set up and run Kibana are:

  • Download the latest version of Kibana.
  • Run kibana.bat using the command prompt. The Kibana UI can then be accessed at localhost:5601.

Kibana alerts can be created from the Management UI by setting up a watcher that monitors the data and sends an email or posts to Slack when a matching event occurs; alerts can also be created in a variety of apps, including Metrics, Security, Uptime and APM. If an anomaly is detected in our Elasticsearch data, we can use a wide range of connectors to send alerting notifications: create messages on Slack, send out an email, open a Jira issue, trigger a PagerDuty event, write data back to Elasticsearch, or post to a webhook, as sketched below.
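
To illustrate the webhook connector, a minimal Node.js receiver might look like the sketch below; the endpoint path /elk-alert and the payload shape are assumptions, since the actual body depends on how the connector and its message are configured in Kibana:

    const http = require('http');

    // Hypothetical endpoint for Kibana's webhook connector to POST alerts to
    http.createServer((req, res) => {
      if (req.method === 'POST' && req.url === '/elk-alert') {
        let body = '';
        req.on('data', (chunk) => (body += chunk));
        req.on('end', () => {
          console.log('Alert received from Kibana:', body);
          res.writeHead(200);
          res.end('ok');
        });
      } else {
        res.writeHead(404);
        res.end();
      }
    }).listen(3000);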
