Monitoring GitLab with the ELK Stack

GitLab is a software development platform that helps organizations manage, develop and deploy git-based code. Growing in usage and popularity, GitLab offers a rich set of development and CI/CD features, such as issue tracking, code reviews, merge handling, and project management.

GitLab allows multiple teams, such as PM, Dev, QA, and Ops, to all collaborate in one application. This leads to multiple users generating activity on a large scale. Issue updates, code commits and merges, and deployment pipeline status — all of this together results in a lot of activity to be monitored.

GitLab comes with some built-in monitoring and visualization capabilities, such as Cycle Analytics and the per-project contributors and repository graphs. Cycle Analytics is especially useful as it enables teams to analyze their efficiency. However, if you want to analyze the data by searching and querying, or if you want to visualize the data yourself, you might find these features somewhat limiting. This article explores a more centralized methodology by integrating with the ELK Stack (Elasticsearch, Logstash and Kibana).

ELK provides powerful log aggregation, analysis, and visualization capabilities that, used in tandem with GitLab’s extensive logging framework, give organizations an accurate and comprehensive bird’s-eye view of the system for monitoring, troubleshooting, and analyzing team activity. Using GitLab’s log data, for example, rich dashboards can be created to monitor not only the system’s general health but also specific team metrics, such as the number of commits, issues opened and closed, and so forth. Users of a hosted ELK service can benefit from a built-in GitLab integration and the additional analysis tools provided by the service, but if you’re using your own ELK deployment you’ll be able to set up the described integration as well.


The steps outlined below presume the following:

  • You have a GitLab Omnibus installation up and running (either the Enterprise or Community edition will do). For installation instructions, see GitLab’s official documentation.
  • You have an ELK Stack up and running (either your own ELK deployment or a hosted account). We will be using Filebeat to ship the logs into Elasticsearch, so Logstash is only required if you want to apply advanced parsing to the data.

GitLab logs

As mentioned above, GitLab has an advanced logging framework that ships a variety of different system logs.

Of course, what log data you want to ship is entirely up to you. You can ship all the log data, or you can be more selective. These logs can be pretty verbose, so depending on storage and retention considerations, it’s good practice to first understand which logs you actually need to monitor.

The Filebeat configurations provided below are designed for shipping the following log files.

production_json.log

This JSON-formatted log records requests sent by GitLab to the Ruby controllers. Here is a sample log:
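The following is an illustrative entry, modeled on GitLab’s documented log format rather than taken from a live system:

```json
{
  "method": "GET",
  "path": "/gitlab/gitlab-foss/issues/1234",
  "format": "html",
  "controller": "Projects::IssuesController",
  "action": "show",
  "status": 200,
  "duration": 229.03,
  "view": 174.07,
  "db": 13.24,
  "time": "2018-08-08T20:15:54.821Z",
  "params": [{ "key": "param_key", "value": "param_value" }],
  "remote_ip": "18.245.0.1"
}
```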

As you can see, the information in the log includes the request method, the controller, the action performed, the request status, duration, remote IP, and more.

The location of the file will vary according to your installation type. In the case of the GitLab Omnibus packages (the recommended installation), the file will reside at:
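```
/var/log/gitlab/gitlab-rails/production_json.log
```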


production.log

This is a plain text log file that contains information about all performed requests. It includes the request URL, type, and origin IP, as well as the parts of the code that serviced the request. The log also details all SQL requests and how long they took. Here is a sample log:
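The entry below is illustrative, reconstructed from the format GitLab documents for this file:

```
Started GET "/gitlabhq/yaml_db/tree/master" for 168.111.56.1 at 2018-02-12 19:34:53 +0200
Processing by Projects::TreeController#show as HTML
  Parameters: {"project_id"=>"gitlabhq/yaml_db", "id"=>"master"}
  Completed 200 OK in 166ms (Views: 117.4ms | ActiveRecord: 27.2ms)
```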

Again, the location of the file varies. In the case of the GitLab Omnibus packages, the file resides at:
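```
/var/log/gitlab/gitlab-rails/production.log
```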


api_json.log

A separate, JSON-formatted file that logs API requests only.
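On a default Omnibus installation, this file resides at /var/log/gitlab/gitlab-rails/api_json.log, and an entry might look something like this (illustrative, based on GitLab’s documented format):

```json
{
  "time": "2018-10-29T12:49:42.123Z",
  "severity": "INFO",
  "duration": 709.08,
  "db": 14.59,
  "view": 694.49,
  "status": 200,
  "method": "GET",
  "path": "/api/v4/projects",
  "params": [{ "key": "action", "value": "git-upload-pack" }],
  "host": "localhost",
  "remote_ip": "127.0.0.1",
  "ua": "curl/7.58.0"
}
```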



application.log

This plain text log file tracks GitLab actions such as adding a new user, creating a new project or group, and so forth. It can act as an audit trail for monitoring user activity.
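On Omnibus installations this file resides at /var/log/gitlab/gitlab-rails/application.log, with entries such as these (illustrative, modeled on GitLab’s documented examples):

```
October 06, 2018 11:56: User "Administrator" (admin@example.com) was created
October 07, 2018 11:25: Project "project133" was removed
```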



In any case, I recommend reading GitLab’s excellent logging documentation to read up on these log files and the information they contain before commencing.

Configuring Filebeat

Filebeat is a log shipper belonging to the Beats family of shippers. Written in Go and extremely lightweight, Filebeat is the easiest and most cost-efficient way of shipping log files into the ELK Stack.

If you haven’t already installed Filebeat, here are some instructions (for Debian):
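For example, to install the Debian package directly (version 5.6.3 shown here as an example; substitute the Filebeat release matching your Elasticsearch version):

```shell
# Download the Filebeat Debian package from Elastic's artifact repository
curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-5.6.3-amd64.deb

# Install it
sudo dpkg -i filebeat-5.6.3-amd64.deb
```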

Next, open the Filebeat configuration file at /etc/filebeat/filebeat.yml.

The following configuration defines the different GitLab files to track and ship into ELK. I’ve defined a prospector for each log type so I can add custom fields to each. Alternatively, I could have defined one prospector for all of the files.
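A sketch of such a configuration, using the Filebeat 5.x prospector syntax and the default Omnibus log paths (the values of the custom log field are arbitrary names I chose so the types can be told apart later in Kibana):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/gitlab/gitlab-rails/production_json.log
  fields:
    log: gitlab-production-json
  fields_under_root: true
- input_type: log
  paths:
    - /var/log/gitlab/gitlab-rails/production.log
  fields:
    log: gitlab-production
  fields_under_root: true
  # production.log entries span multiple lines; treat any line that
  # does not begin with "Started" as a continuation of the previous event
  multiline:
    pattern: '^Started'
    negate: true
    match: after
- input_type: log
  paths:
    - /var/log/gitlab/gitlab-rails/api_json.log
  fields:
    log: gitlab-api-json
  fields_under_root: true
- input_type: log
  paths:
    - /var/log/gitlab/gitlab-rails/application.log
  fields:
    log: gitlab-application
  fields_under_root: true

# Assumes a local Elasticsearch instance; adjust the host as needed
output.elasticsearch:
  hosts: ["localhost:9200"]
```

Note that on much newer Filebeat versions the field name log collides with the ECS log namespace, so pick a different field name there.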

Start Filebeat with:
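```shell
sudo systemctl start filebeat

# or, on init.d-based systems:
sudo service filebeat start
```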

After a while, a new index will be created and you can define a new index pattern (filebeat-*) in Kibana to begin analyzing the data.


Shipping to a hosted ELK service

If you are using a hosted ELK service, a few small modifications need to be applied to establish the logging pipeline.

First, you will need to download an SSL certificate to use encryption:
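The exact certificate URL depends on your account, so the commands below are only a sketch with a placeholder URL; use the location given in your provider’s documentation:

```shell
# Create a directory for the certificate
sudo mkdir -p /etc/filebeat/certs

# Placeholder URL -- replace with the CA certificate your provider documents
sudo curl -o /etc/filebeat/certs/ca.crt https://<your-provider>/certs/ca.crt
```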

You can now edit the Filebeat configuration file. If you like, you can use the Filebeat wizard to generate the Filebeat YAML file automatically (available in the Filebeat section, under Log Shipping, in the UI).

Either way, the configurations should look something like this:
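A sketch of such a configuration (the token, listener host, and port are placeholders; check your account’s documentation for the exact values):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/gitlab/gitlab-rails/production_json.log
  fields:
    type: gitlab-production-json
    token: <yourToken>
  fields_under_root: true
- input_type: log
  paths:
    - /var/log/gitlab/gitlab-rails/production.log
  fields:
    type: gitlab-production
    token: <yourToken>
  fields_under_root: true

# Ship to the service's listener over SSL, using the downloaded CA certificate
output.logstash:
  hosts: ["<yourListenerHost>:<port>"]
  ssl:
    certificate_authorities: ["/etc/filebeat/certs/ca.crt"]
```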

The main differences are:

  1. Specific fields added to each prospector. Replace <yourToken> with your account token (it can be found in the UI, under Settings).
  2. The output section defines the listener and the SSL certificate to use.

Once you start (or restart) Filebeat, the GitLab logs will begin to show up in your account.



Analyzing the GitLab logs

Now that your logging pipeline is up and running, it’s time to look into the data with some simple analysis operations in Kibana.

Some of the fields can be used to gain visibility into the logs. Adding the ‘type’ field, for example (the ‘log’ field if you are using your own ELK deployment), helps give the logs some context.

We can use Kibana queries to search for specific log data. Say, for example, that you want to take a look at failed logins to the system. To do this, you could use a combination of a field-level and free-text search:
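Assuming production.log was shipped with a custom log field set to gitlab-production, the query might look like this (the message text is illustrative; the exact wording of failed-login entries varies by GitLab version):

```
log:gitlab-production AND "Failed Login"
```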


Another example could be querying Elasticsearch for error responses for GitLab requests:
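Assuming the JSON production log was shipped with a log field of gitlab-production-json, a Lucene range query on the status field does the job:

```
log:gitlab-production-json AND status:[500 TO 599]
```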


Using Kibana’s visualization capabilities, you can create a series of simple charts and metric visualizations that give you a nice overview of your GitLab environment. Here are a few examples.

Visualizing commits

What organization does not want to monitor its team’s productivity? A simple metric visualization will give you a counter of how many commits your team performed:


Likewise, we can create a line chart visualization that gives us an overview of commits over time, per user:


Visualizing issues

In a similar fashion, you can use Kibana to keep track of opened and closed issues. A simple data table visualization gives us a breakdown of the issues opened:


A line chart can give us a depiction of how many issues were opened over time:


The list goes on. You can monitor projects created, merges, user activity, CI/CD processes, and more. The logs generated by GitLab include a wealth of information that can be tapped into for monitoring, and adding these visualizations into one Kibana dashboard gives you a nice overview of your environment.



The ELK Stack offers built-in storage, search, and visualization features that complement GitLab’s rich logging capabilities. Using Filebeat, building a logging pipeline for shipping data into ELK is simple. If you want to further process the logs, you might want to consider adding Logstash to your pipeline setup. A hosted ELK service may also provide tools to help you hit the ground running: easy integration steps, as well as a monitoring dashboard like the one above. To install such a dashboard, simply search for ‘GitLab’ in ELK Apps and hit the install button.