Kibana Tutorial

Kibana is the visualization layer of the ELK Stack, the most popular log analytics platform, which is built on top of Elasticsearch, Logstash, and Kibana. This tutorial will cover both the simple and advanced features of Kibana over a few parts. This first part explains how to run searches in Kibana using the Lucene query syntax.

Loading Data in Kibana

For the purpose of this tutorial, we will use a sample 24-hour period of Apache access log data (available at the URL in the steps below) that is refreshed daily. (This means that if you import the data into your ELK Stack, it will show data for today; you can then download fresh data from the same URL the following day.)

You can use your own ELK Stack to run this tutorial, but for the sake of simplicity, we will use our Logz.io ELK-as-a-service platform in this example.

To upload your data, take these steps:

  1. If you do not already have a Logz.io account, create one on the Logz.io site first.
  2. Download the sample file from http://logz.io/sample-data
  3. Upload the data using the file upload method found in the Log Shipping tab.* Note that this is a simple cURL command (a concrete example follows this list):
    curl -T <Full path to file> http://listener.logz.io:8021/file_upload/<Token>/apache_access
  4. It should take about a minute for the file to upload and become visible in the Kibana Discover tab. If the data is not visible, try refreshing after a minute.
  5. Open one of the log lines and click the Refresh button to refresh the Kibana mapping.
  6. That’s it! You’re all done, and you will now have some data in Kibana.

* The token can be found on the Settings page, and the file type is apache_access.
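
For example, assuming you saved the sample file as apache-daily-access.log in your home directory (the path here is hypothetical; substitute your own, along with your account token), the upload command would look like this:

    curl -T ~/apache-daily-access.log http://listener.logz.io:8021/file_upload/<Token>/apache_access

curl’s -T flag simply PUTs the named file to the given URL, so no additional shipper needs to be installed.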

Kibana Search Syntax

This section will detail some simple searches that one can perform.

Free-Text Search

Free-text search works within all fields, including the _source field, which contains all of the other fields. If no specific field is indicated, the search will be run against all of the fields that are analyzed.

Try running the following searches in the Discover search field and see what you get (set the time picker at the top right of the dashboard to the last twelve hours to capture more data):

  • category
  • Category
  • categ
  • cat*
  • categ?ry
  • “category”
  • category\/health
  • “category/health”
  • Chrome
  • chorm*

There are a few things to notice here:

  1. Text searches are not case sensitive, so [category] and [CaTeGory] will return the same results. Putting the text within double quotes (“”) searches for an exact match, meaning the exact string must appear as written inside the quotes. This is why [category\/health] and [“category/health”] return different results.
  2. You can use the wildcard symbols [*] and [?] in searches: [*] matches any number of characters, while [?] matches exactly one character. (The sketch below shows how to replay these searches directly against Elasticsearch.)
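
Under the hood, the Kibana search bar hands your query to Elasticsearch as a Lucene query_string query, so you can replay the same free-text searches with curl. A minimal sketch, assuming a local Elasticsearch instance on localhost:9200 that holds the sample data (adjust the host if you run your own stack):

    # Free-text search across all analyzed fields, via the URI search API
    curl -s 'http://localhost:9200/_search?q=category&pretty'

    # Quoted phrase for an exact match (%22 is a URL-encoded double quote)
    curl -s 'http://localhost:9200/_search?q=%22category/health%22&pretty'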

Field-Level Searches

In Kibana, you can search for data inside specific fields. To do that, you need to use the following format:

<fieldname>:search

Run the following searches to see what you get (some will return no results):

  • geoip.country_name:Canada
  • name:chrome
  • name:Chrome
  • name:Chr*
  • response:200
  • bytes:65
  • bytes:[65 TO *]
  • bytes:[65 TO 99]
  • bytes:{65 TO 99}
  • _exists_:name

There are a few things to notice here:

  1. Field-level searches depend on the type of the field. In the Logz.io Kibana setup, fields are not analyzed by default, which means that field-level searches are case sensitive and cannot use wildcards. We keep the fields “not analyzed” to save space in the index, because the data is also duplicated in an analyzed field called _source.
  2. You can search a range within a field. Brackets [] make the bounds inclusive, while braces {} make them exclusive.
  3. Using the _exists_ prefix for a field returns only the documents in which that field exists.
  4. When using a range, you need to follow a strict format and use the capitalized keyword TO to separate the bounds (see the sketch after this list).
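
The same field-level and range syntax can be sent to Elasticsearch directly as a query_string query in a JSON body. Again a sketch, under the assumption of a local instance on localhost:9200 with the sample Apache data indexed:

    curl -s 'http://localhost:9200/_search?pretty' -H 'Content-Type: application/json' -d '
    {
      "query": {
        "query_string": {
          "query": "response:200 AND bytes:[65 TO 99]"
        }
      }
    }'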

Logical Statements

You can use logical statements in searches in these ways:

  • USA AND Firefox
  • USA OR Firefox
  • (USA AND Firefox) OR Windows
  • -USA
  • !USA
  • +USA
  • NOT USA

There are a few things to understand here:

  1. Logical terms such as AND and OR must follow the proper format: they are written in all capital letters.
  2. You can use parentheses to define complex logical statements (demonstrated in the sketch below).
  3. You can use -, !, and NOT to define negative terms.
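
As a quick check, the grouped statement from the list above can be sent straight through the URI search API (spaces are URL-encoded as %20; the localhost:9200 instance is the same assumption as before):

    # (USA AND Firefox) OR Windows
    curl -s 'http://localhost:9200/_search?q=(USA%20AND%20Firefox)%20OR%20Windows&pretty'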

Escaping Special Characters

All special characters need to be properly escaped. The following is a list of all available special characters:

+ - && || ! ( ) { } [ ] ^ " ~ * ? : \
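
One subtlety worth showing: when a query travels inside a JSON body, the backslash escape itself has to be escaped for JSON. A sketch against the same hypothetical local instance:

    # In the Kibana search bar you would type:  category\/health
    # In a JSON body, the backslash must be doubled for the JSON parser:
    curl -s 'http://localhost:9200/_search?pretty' -H 'Content-Type: application/json' -d '
    {
      "query": { "query_string": { "query": "category\\/health" } }
    }'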

Advanced Searches

Proximity Searches

Proximity searches are an advanced Kibana feature that takes advantage of the Lucene query syntax. (Strictly speaking, the tilde operator applied to a single term, as below, is what Lucene calls a fuzzy search; applied to a quoted phrase, it performs a proximity search.)

When using a proximity search, note the following:

  • [categovi~2] means a search for all of the terms that are within two single-character changes (edits) of [categovi]. (This means that category will be matched.)
  • These searches use a lot of system resources and often trigger internal circuit breakers in Elasticsearch. If you try something such as [catefujt~10], it is likely not to return any results due to the amount of memory that is used to perform that specific search. (The sketch below lets you try the syntax directly.)
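
To experiment with the syntax outside Kibana, the tilde operator can be passed through the URI search API as well (the tilde needs no URL encoding; localhost:9200 remains an assumption):

    # Match all terms within two edits of "categovi" (e.g., "category")
    curl -s 'http://localhost:9200/_search?q=categovi~2&pretty'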

Bonus! Build a Kibana Dashboard with One Click

In the next part of our Kibana tutorial, we will talk about how to take these searches to the next level and build visualizations. In the meantime, if you go to our ELK Apps library and search for Apache apps, you will find a pre-made dashboard that will give you all of the information that you need to monitor Apache log data. To use that dashboard, just click on the Install button and then the Open button.

What’s Next?

I hope you found this first part interesting and informational! If you have any suggestions on what else should be included in the first part of this Kibana tutorial, please let me know in the comments below.

Logz.io is a predictive, cloud-based log management platform that is built on top of the open-source ELK Stack and can be used for log analysis, application monitoring, business intelligence, and more. Start your free trial today!

Asaf Yigal is co-founder and VP Product at Logz.io. Prior to Logz.io, Asaf co-founded Currensee, a social-trading platform, which was later acquired by OANDA in 2013. Prior to Currensee, Asaf played executive roles at Akorri in developing an end-to-end performance monitoring platform and at Onaro in developing a storage resource management platform. Both Akorri and Onaro were acquired by NetApp. Prior to Onaro, Asaf headed a research team in the Israeli Navy, taking an artificial intelligence system to military deployment. Asaf holds a B.S. from the Technion and is an Instrument-rated private pilot.