Logstash Grok

The ability to efficiently analyze and query the data being shipped into the ELK Stack depends on the information being readable. This means that as unstructured data is being ingested into the system, it must be translated into structured message lines.

This thankless but critical task is usually left to Logstash (though there are other log shippers available, see our comparison of Fluentd vs. Logstash as one example). Regardless of the data source that you define, pulling the logs and performing some magic to beautify them is necessary to ensure that they are parsed correctly before being output to Elasticsearch.

Data manipulation in Logstash is performed using filter plugins. This article focuses on one of the most popular and useful filter plugins — the Logstash grok filter, which is used to parse unstructured data into structured data.

How does it work?

Put simply, grok is a way to match a line against a regular expression, map specific parts of the line into dedicated fields, and perform actions based on this mapping.

There are many built-in patterns that are supported out-of-the-box by Logstash for filtering items such as words, numbers, and dates (the full list of supported patterns can be found here). If you cannot find the pattern you need, you can write your own custom pattern.

Here is the basic syntax format for a Logstash grok filter:
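In its most basic form, a match pairs the name of a pattern with the name of the field that the matched text should be mapped to (both names below are placeholders):

    %{PATTERN:FieldName}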

This will match the predefined pattern and map it to a specific identifying field. Since grok patterns are, at their core, regular expressions, you can also create your own custom pattern using named-capture syntax. For example:
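    (?<field_name>\d\d-\d\d-\d\d)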

This will match a value such as 22-22-22 (or any other digits in the same format) and map it to the field_name field.

A Logstash grok example

To demonstrate how to get started with grokking, I’m going to use the following application log:
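    2016-07-11T23:56:42.000+00:00 INFO [MySecretApp.com.Transaction.Manager]:Starting transaction for session -464410bf-37bf-475a-afc0-498e0199f008

(A made-up sample; the exact content matters less than the shape of the line: an ISO8601 timestamp, a log level, a class name in brackets, and then free text.)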

The goal I want to accomplish with a grok filter is to break down the log line into the following fields: timestamp, log level, class, and then the rest of the message.

The following grok pattern (a sketch built entirely from the built-in patterns; the field names are my own choice) will do the job:
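    filter {
      grok {
        # ISO8601 timestamp, log level, class in brackets, then the rest of the line
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} \[%{DATA:class}\]:%{GREEDYDATA:message}" }
      }
    }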

This will try to match the incoming log to the given pattern. In case of a match, the log will be broken down into the specified fields, according to the defined patterns in the filter. In case of a mismatch, Logstash will add a tag called _grokparsefailure.

In our case, the filter will match and result in the following output:
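    timestamp: 2016-07-11T23:56:42.000+00:00
    log-level: INFO
    class: MySecretApp.com.Transaction.Manager
    message: Starting transaction for session -464410bf-37bf-475a-afc0-498e0199f008

(Logstash will also attach its own metadata fields, such as @timestamp and @version. Note that until we overwrite it, the message field will additionally retain the original, unparsed line; more on that next.)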

Manipulating the data

Once there is a match, you can define additional Logstash grok configurations to manipulate the data. For example, you can make Logstash add fields, overwrite fields, or remove fields, as in the sketch below.
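    filter {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:log-level} \[%{DATA:class}\]:%{GREEDYDATA:message}" }
        # replace the original line in 'message' with just the captured remainder
        overwrite => [ "message" ]
        # the tag name here is purely illustrative
        add_tag => [ "my_custom_tag" ]
      }
    }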

In this case, we are using the ‘overwrite’ action to overwrite the ‘message’ field. This way, our ‘message’ field will hold only the free-text portion of the log rather than repeating the entire original line alongside the fields we defined (‘timestamp’, ‘log-level’ and ‘class’). Also, we are using the ‘add_tag’ action to add a custom tag to the log.

A full list of available actions you can use to manipulate your logs is available here, together with their input type and default value.

The grok debugger

A great way to get started with building your grok filters is this grok debug tool: https://grokdebug.herokuapp.com/

This tool allows you to paste your log message and gradually build the grok pattern while continuously testing it against the message. As a rule, I recommend starting with the %{GREEDYDATA:message} pattern and slowly adding more and more patterns as you proceed.

In the case of the example above, I would start with:
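    %{GREEDYDATA:message}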

Then, to verify that the first part is working, proceed with:
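    %{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:message}

and so on, peeling one field at a time off the front of the line until the whole message is mapped.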

Common examples

Here are some examples that will help you to familiarize yourself with how to construct a grok filter.

Syslog
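A standard RFC 3164 syslog line can be broken down with the built-in syslog patterns, along the lines of the example in the Logstash documentation:

    %{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}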

Apache access logs
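Apache access logs in the combined format are covered by a dedicated built-in pattern, so a single token does the whole job:

    %{COMBINEDAPACHELOG}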

Elasticsearch
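Elasticsearch's own plaintext logs follow a [timestamp][level][source] message layout, so a sketch along these lines should work (adjust it to the exact format your version emits):

    \[%{TIMESTAMP_ISO8601:timestamp}\]\[%{LOGLEVEL:level}%{SPACE}\]\[%{DATA:source}%{SPACE}\]%{SPACE}%{GREEDYDATA:message}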

Summing it up

Logstash grok is just one type of filter that can be applied to your logs before they are forwarded into Elasticsearch. Because it plays such a crucial part in the logging pipeline, grok is also one of the most commonly used filters.

Here are some useful resources that can help you along the grokking way: the Logstash grok filter documentation, the list of built-in grok patterns, and the grok debugger mentioned above.

Happy grokking!

Logz.io is a predictive, cloud-based log management platform with automated grokking that is built on top of the open-source ELK Stack and can be used for log analysis, application monitoring, business intelligence, and more. Start your free trial today!

Ran Ramati is Customer Success Manager at Logz.io. He is passionate about solving issues with log analytics, skiing, fast cars, beer, and running -- but not all at once.