Kibana Visualization How-to’s: Heatmaps

In Kibana you have a full selection of graphical representations for your data. Most of the time a simple line or bar chart does what you need, but every so often you need a different view to get the most out of your data. Heatmaps are a critical component of the Kibana visualization arsenal, and they deserve their own attention.

What are heatmaps?

A heatmap is a type of visualization that uses color to display the magnitude of the data you are trying to represent. They’re used for all sorts of data and come in several different forms. For most of us in the developer world, the one we’ve come across most often is probably the GitHub commit graph.

Kibana Heatmaps

As you can see, the legend tells you how intense a block is: grey means a block has little or no data, while the very dark green represents the most data.

How does this work in Kibana?

Now, a Kibana heatmap is perfect for visualizing when and where (and even implying why) certain events occur. Most of us using Kibana are reviewing, digesting, or trying to interpret log data, so how does this help us? With a heatmap, if our logs have different types or some form of grouping tag, we can easily identify the type of log that is most prevalent.

A Kibana Heatmap Walkthrough

From here we’re going to take a little bit of a practical walkthrough so you can see the data and the basics of how to interpret it.

Data for Visualizations

I’m going to be using simple structured log data, with a log level and message that’s enriched on ingestion. Here are a couple of log examples:

Info Log Example

{
  "_index": "logzioCustomerIndex201001_v2",
  "_type": "doc",
  "_id": "AXTkugA9M8h7DWSdWMmW.account-120084",
  "_version": 1,
  "_score": null,
  "_source": {
    "level": "info",
    "message": "Get GIFs from Giphy Trending API",
    "type": "nodejs",
    "tags": [
      "_logz_http_bulk_json_8070"
    ],
    "@timestamp": "2020-10-01T15:13:49.044Z",
    "_logzio_pattern": 4620484
  },
  "fields": {
    "@timestamp": [
      "2020-10-01T15:13:49.044Z"
    ]
  },
  "highlight": {
    "type": [
      "@kibana-highlighted-field@nodejs@/kibana-highlighted-field@"
    ]
  },
  "sort": [
    1601565229044
  ]
}

Error Log Example

{
  "_index": "logzioCustomerIndex201001_v2",
  "_type": "doc",
  "_id": "AXTkugoZpKbgGHjdTi4F.account-120084",
  "_version": 1,
  "_score": null,
  "_source": {
    "stack": "AccessDenied: Access Denied\n    at Request.extractError (/var/task/node_modules/aws-sdk/lib/services/s3.js:837:35)\n    at Request.callListeners (/var/task/node_modules/aws-sdk/lib/sequential_executor.js:106:20)\n    at Request.emit (/var/task/node_modules/aws-sdk/lib/sequential_executor.js:78:10)\n    at Request.emit (/var/task/node_modules/aws-sdk/lib/request.js:688:14)\n    at Request.transition (/var/task/node_modules/aws-sdk/lib/request.js:22:10)\n    at AcceptorStateMachine.runTo (/var/task/node_modules/aws-sdk/lib/state_machine.js:14:12)\n    at /var/task/node_modules/aws-sdk/lib/state_machine.js:26:10\n    at Request.<anonymous> (/var/task/node_modules/aws-sdk/lib/request.js:38:9)\n    at Request.<anonymous> (/var/task/node_modules/aws-sdk/lib/request.js:690:12)\n    at Request.callListeners (/var/task/node_modules/aws-sdk/lib/sequential_executor.js:116:18)\n    at Request.emit (/var/task/node_modules/aws-sdk/lib/sequential_executor.js:78:10)\n    at Request.emit (/var/task/node_modules/aws-sdk/lib/request.js:688:14)\n    at Request.transition (/var/task/node_modules/aws-sdk/lib/request.js:22:10)\n    at AcceptorStateMachine.runTo (/var/task/node_modules/aws-sdk/lib/state_machine.js:14:12)\n    at /var/task/node_modules/aws-sdk/lib/state_machine.js:26:10\n    at Request.<anonymous> (/var/task/node_modules/aws-sdk/lib/request.js:38:9)\n    at Request.<anonymous> (/var/task/node_modules/aws-sdk/lib/request.js:690:12)\n    at Request.callListeners (/var/task/node_modules/aws-sdk/lib/sequential_executor.js:116:18)\n    at callNextListener (/var/task/node_modules/aws-sdk/lib/sequential_executor.js:96:12)\n    at IncomingMessage.onEnd (/var/task/node_modules/aws-sdk/lib/event_listeners.js:313:13)\n    at IncomingMessage.emit (events.js:327:22)\n    at IncomingMessage.EventEmitter.emit (domain.js:483:12)",
    "level": "error",
    "_logzio_logceptions": [
      "d034b140f728a261d00f042abd033f44"
    ],
    "message": "Access Denied",
    "type": "nodejs",
    "tags": [
      "_logz_http_bulk_json_8070"
    ],
    "@timestamp": "2020-10-01T15:08:52.924Z",
    "_logzio_pattern": 4623730
  },
  "fields": {
    "@timestamp": [
      "2020-10-01T15:08:52.924Z"
    ]
  },
  "highlight": {
    "type": [
      "@kibana-highlighted-field@nodejs@/kibana-highlighted-field@"
    ]
  },
  "sort": [
    1601564932924
  ]
}

You can quickly see how different the two documents are, and because each message is a long string of information, we can’t infer much from the messages without diving into each individual log.

Setting Up the Heatmap Visual

First things first, you need to head to the Visualizations tab:

Visualizations Tab

From there, click ⊕ Create new visualization:

Then select Heat Map.

From here you’ll be asked which source you want the Kibana visualization to be built from, so search until you find the saved search or index pattern you want.

At this point you’ll have a heatmap with a single hot spot, with both the X and Y axes representing every document available in the time span you’ve selected within your given source. So now we need to define our axes, our visualization format, and any filtering.

For me, I’m going to start with filtering so that I’m left with the log items I know have the groupings I want to visualize. To do that, I use the standard Kibana method of + Add filter and define filters based on the given fields.
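Under the hood, a pinned Kibana filter is translated into Elasticsearch query DSL. As a rough sketch (the field and value here are assumptions based on the example logs above), a filter of type: nodejs becomes something like:

```json
{
  "query": {
    "bool": {
      "filter": [
        { "match_phrase": { "type": "nodejs" } }
      ]
    }
  }
}
```

Every filter pill you add ends up as another clause in that filter array, which is why stacking filters narrows the document set the heatmap is built from.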

Next, I want to define my X-axis to break this down into something I can interpret. I find it easiest to use the Date Histogram as the Aggregation type, using the @timestamp field generated at ingestion and leaving the interval at Auto.
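For reference, a Date Histogram bucket maps to an Elasticsearch date_histogram aggregation. A sketch of the request body Kibana builds might look like this (the aggregation name x_axis and the 30-minute interval are illustrative; recent Elasticsearch versions use fixed_interval or calendar_interval, while older ones used a single interval parameter):

```json
{
  "aggs": {
    "x_axis": {
      "date_histogram": {
        "field": "@timestamp",
        "fixed_interval": "30m"
      }
    }
  }
}
```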

To apply the Y-axis, you will need to select the Add sub-buckets button.

As I want to group the data by the log level field, I’ll need to select the Terms sub-aggregation so that we can do string match operations.

Once we’ve selected the field to be level and applied the changes, we get a quick representation of the number of logs of each type coming in, including where no logs of that type exist.
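Putting the two buckets together, the X-axis date histogram with a Terms sub-bucket on level is roughly equivalent to a nested aggregation like this (a sketch; the aggregation names are illustrative, and depending on your mapping the field may need to be level.keyword to aggregate on the keyword form of the string):

```json
{
  "aggs": {
    "x_axis": {
      "date_histogram": {
        "field": "@timestamp",
        "fixed_interval": "30m"
      },
      "aggs": {
        "y_axis": {
          "terms": {
            "field": "level"
          }
        }
      }
    }
  }
}
```

Each date bucket then contains a count per level, and those counts are what the heatmap colors encode.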

However, this view doesn’t grant us much granularity: only four stages of the legend exist, and with the logs so heavily info-oriented, we don’t get to see when during the day the errors are appearing. For this, we need to tweak the chart options.

In this menu, I quickly changed the color scheme from green to red (to make it more interesting). More importantly, I changed the number of colors to 10 (the maximum you can apply). This allows more stages of granularity, so now you can see which 30-minute block had the most info logs or errors.

More Kibana Tutorials

This was just a quick run-through of how to generate and customize a Kibana heatmap visualization for logs. Hopefully, this has been helpful to you, and you’ll share some of your interesting Kibana visualizations and the advanced dashboards you create with them. If you want more info on creating Kibana visuals, check out our tutorial on custom visualizations and summary of plugins for more graph styles in Kibana.
