Understanding Logstash – Open Source Log Shipper


Our previous blog, Introduction to Log Management, covered various areas of log management — such as log configuration, shipping, and management — along with some of the popular tools for each area. This blog outlines the details of the open source log shipper Logstash through a set of frequently asked questions.

What is Logstash?

Logstash is an open source tool for receiving, processing, and shipping logs. It can also create a powerful pipeline for storing, querying, and analysing all kinds of logs — system logs, web server logs, error logs, application logs, and just about anything else you can throw at it. It can also be seen as a data collection engine with real-time pipelining capabilities.

What are the logs that Logstash can parse, filter and ship?

Logstash provides a plugin-based mechanism to parse, filter, and ship logs from a source. It contains more than 200 plugins for collecting logs from various applications and systems, and it also allows you to write your own plugins. The list of plugins is available at https://github.com/logstash-plugins?utf8=%E2%9C%93&query=
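Plugins can be listed and installed from the command line. In Logstash 5.x the tool is `bin/logstash-plugin` (older releases shipped it as `bin/plugin`); the plugin name below is just an example:

```sh
# List all plugins currently installed in this Logstash instance
bin/logstash-plugin list

# Install an additional plugin from the public plugin repository
bin/logstash-plugin install logstash-filter-dissect
```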

Does Logstash have User interface to view the shipped logs?

Kibana is typically used as the frontend reporting tool for viewing and visualizing the shipped logs.

What are the data destinations to which Logstash can ship data?

Logstash can be configured to send data to Elasticsearch, MongoDB, Redis, etc. as a backend store. Elasticsearch is the most preferred backend. The diagram below shows the key components of Logstash.
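As a sketch, a single pipeline can fan out events to more than one backend at once. The hostnames, database, and collection names below are placeholders, and the `mongodb` output is a separately installed plugin (`logstash-output-mongodb`); the `hosts` option follows the newer Elasticsearch output plugin, while 1.5-era configs used `host`/`protocol` instead:

```conf
output {
  # Primary store for search and analysis
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  # Secondary copy in MongoDB (plugin installed separately)
  mongodb {
    uri        => "mongodb://localhost:27017"
    database   => "logs"
    collection => "events"
  }
}
```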

What are the tools supported by Logstash in various categories?

Analysis: Elasticsearch, MongoDB and Riak

Archiving: HDFS, S3, Google Cloud Storage

Monitoring: Nagios, Ganglia, Zabbix, Graphite, Datadog, CloudWatch

Alerting: Watcher with Elasticsearch, Email, Pagerduty, HipChat, IRC, SNS

Do we need a separate configuration for each application or software to parse logs using Logstash?

Yes. There are plugins for the input, filter, output, and codec stages, and there are plugins for specific applications.

How do we configure logs for Logstash?

Here is a typical configuration file that defines the input, filter, and output sections:

input {
  file {
    path => "/Users/palecur/logstash-1.5.2/logstash-tutorial-dataset"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "clientip"
  }
}

output {
  elasticsearch {
    protocol => "http"
  }
  stdout {}
}
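Assuming the configuration above is saved as `first-pipeline.conf` (the filename is arbitrary), the pipeline can be syntax-checked and started from the Logstash home directory. The flag names below are from Logstash 5.x; 1.5-era releases used `-t`/`--configtest` for validation:

```sh
# Validate the configuration without starting the pipeline
bin/logstash -f first-pipeline.conf --config.test_and_exit

# Start Logstash with this pipeline
bin/logstash -f first-pipeline.conf
```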

Details of the input plugins are available at https://www.elastic.co/guide/en/logstash/5.1/input-plugins.html

Details of the filter plugins are available at https://www.elastic.co/guide/en/logstash/5.1/filter-plugins.html

Details of the output plugins are available at https://www.elastic.co/guide/en/logstash/5.1/output-plugins.html

Do we need an agent in the VM?

Yes. Logstash needs to be installed and configured locally — refer to the deployment diagram below.

What are the deployment architectures supported by Logstash?

The supported deployment architectures are depicted in the figure below.

Does Logstash work in Windows?

Yes. There are plugins for the Windows Event Log.
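For example, a minimal input section using the `eventlog` input plugin (`logstash-input-eventlog`, Windows only). The log names shown are the common defaults; adjust as needed:

```conf
input {
  # Reads events from the Windows Event Log
  eventlog {
    logfile => ["Application", "System", "Security"]
  }
}
```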

Is it possible to trigger some alerts based on the logs?

Yes. Logstash can route events to alerting tools such as Nagios, PagerDuty, and email through its output plugins, and alerting on stored logs can be done with Watcher in Elasticsearch (see the alerting question below).
How do we manage log rotation in Logstash server and in the VM?

Logstash archival policies can be used; AWS S3 is supported for storing archived logs.
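A hedged sketch of an S3 archival output using the `logstash-output-s3` plugin — the bucket name and region are placeholders, and credentials are normally supplied via AWS settings or instance profiles:

```conf
output {
  # Archive raw events to S3 for long-term retention
  s3 {
    bucket    => "my-log-archive"   # placeholder bucket name
    region    => "us-east-1"
    codec     => "json_lines"
    time_file => 15                 # rotate the temporary file every 15 minutes
  }
}
```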

What are the APIs supported by Logstash?

Logstash primarily provides command-line support; most of what we require can be achieved with scripts.



Do we need to restart Logstash if we change the pipe line configurations?

Yes, this is required: every time you make a change to the configuration file, the service has to be restarted. This is one of the features the Logstash team is working on; they are planning to introduce APIs for this, such as Reload, Add filter, Add output, etc.
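Note that newer Logstash releases (2.3 and later) add automatic configuration reloading, so a restart is no longer strictly required there. The pipeline is re-read when the config file changes:

```sh
# Reload the pipeline automatically when the config file changes (Logstash 2.3+)
bin/logstash -f first-pipeline.conf --config.reload.automatic
```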

Does Logstash support multi-tenancy, i.e. managing multiple customers' or multiple environments' logs on the same server with segregation?

Multi-tenancy can be enabled using "Shield". Shield is an additional licensed product; a 30-day free trial is available.


How do we configure alerts?

The alert option is available in Elasticsearch. This can be done using the additional licensed product X-Pack, which includes "Watcher".


How do we automate the deployment and configuration of Logstash?

There are multiple ways to automate deployment and log life cycle operations such as deployment, configuration, log rotation, archival, log purging, etc. Cloudenablers' Corestack provides pre-built templates for handling the entire log life cycle.
