10 Tips for Using the Logz.io ELK Stack for Log Analysis


Logz.io provides the ELK Stack as an end-to-end service in the cloud. As part of this service, we are constantly building and introducing new ways of making logging less of a headache than it has traditionally been.

The list below includes tips for making your log analysis and visualization easier by making the most out of the features we have added to the ELK Stack.   

#1 – Filebeat Wizard 

Filebeat is today one of the most reliable ways to ship data into the ELK Stack. Its configuration, on the other hand, is not as dependable. The log shipper is configured via a YAML file comprised of various sections, so defining your prospectors and output destination can go south fairly quickly if the syntax is off.

Logz.io’s Log Shipping page includes a Filebeat wizard that can be used to easily produce a ready-made Filebeat configuration file. All you have to do is enter the path of the file you want to track and the log type. The result – a ready-made and working YAML file for shipping into ELK. 
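To give a sense of what the wizard saves you from writing by hand, here is a minimal sketch of the kind of Filebeat configuration it produces. The file path, log type, and listener address below are illustrative assumptions, not output copied from the wizard:

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/apache2/access.log   # the file you want to track (illustrative path)
  fields:
    log_type: apache_access         # the log type selected in the wizard

output.logstash:
  hosts: ["listener.example.com:5015"]  # shipping destination (placeholder host)
```

A single mis-indented line in a file like this is enough to stop Filebeat from shipping, which is exactly the class of error the wizard helps you avoid.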


#2 – Proximity Events 

Kibana displays messages ingested from multiple data sources, so it is crucial to be able to understand whether or not there is a relationship between the different messages being logged chronologically.

Take, for example, an Apache 500 response that takes place right after a bad MySQL transaction. These are related log messages that point to the exact root cause but may go unnoticed because they are recorded in two different data streams.  

In Logz.io, you can easily see logs recorded before and after a specific log message, giving you the ability to see the big picture. 
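As a rough illustration of what this surfacing does for you, the sketch below (plain Python, with invented timestamps and messages) picks out log entries from one stream that were recorded within a few seconds of an event in another:

```python
from datetime import datetime, timedelta

def nearby(event_time, logs, window_seconds=5):
    """Return log entries recorded within window_seconds of event_time."""
    window = timedelta(seconds=window_seconds)
    return [entry for entry in logs if abs(entry["ts"] - event_time) <= window]

# An Apache 500 response and some MySQL log entries (all data invented)
apache_500 = datetime(2017, 8, 1, 12, 0, 10)
mysql_logs = [
    {"ts": datetime(2017, 8, 1, 12, 0, 8), "msg": "MySQL transaction aborted"},
    {"ts": datetime(2017, 8, 1, 11, 50, 0), "msg": "MySQL connection opened"},
]

# Only the aborted transaction falls inside the 5-second window
print(nearby(apache_500, mysql_logs))
```

The point is that the correlation is purely chronological: neither log line references the other, so only looking at them side by side reveals the root cause.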

#3 – ELK Apps 

Kibana is known for its beautiful visualizations and dashboards, but the fine print that many often miss is that constructing these objects is not always easy. Configuring accurate aggregations for the X and Y axes of a specific chart can be extremely time-consuming.

Logz.io includes a library of pre-made Kibana searches, visualizations, and dashboards for different log types called ELK Apps (you can learn more about the collection here) which can be easily installed.  

If you have a visualization or dashboard of your own that you would like to share, you can contribute it via the UI. In addition, Logz.io has announced a Kibana Dashboard contest in which the winner gets a free ride to AWS re:Invent 2017. Read here for more details.


#4 – Filtering Dashboards 

We are used to treating dashboards as a static monitoring tool, but a nifty trick found in Logz.io can make them more dynamic in nature. Dashboards can be filtered through the fields displayed in them. 

Say for example you are monitoring ELB access logs and are looking at a list of the most time-consuming requests. You can drill down into a specific request by hovering over the URL field and clicking the filter icon. 


The dashboard will be filtered accordingly. Of course, you can also filter the dashboard by entering a query in the search field above.
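For example, instead of clicking the filter icon you could type a Lucene-style query like the one below into the search field. The field names and values are illustrative, assuming ELB access logs parsed into fields such as `request` and `backend_processing_time`:

```
request: "/checkout" AND backend_processing_time: [1 TO *]
```

A query like this narrows the whole dashboard down to slow requests for a single URL.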

#5 – Sub Accounts 

Logz.io’s pricing model is based upon the volume of data you ship and the required retention period. But since companies consist of different teams with different logging requirements, Logz.io introduced a feature called Sub-Accounts, which enables users to run sub-accounts under a single main account for more efficient account control and management.

You can define different data volumes and retention periods for different environments under one main account. For example, under the main account, a manager could define one sub-account for development, one for staging, and another for production. 


#6 – User Tokens 

Sharing dashboards and visualizations is one of the most popular features in Kibana, but there is no established mechanism to make sure that shared Kibana data is safe. That’s why Logz.io has added a method for securing this information with access tokens.  

Using tokens — as opposed to using the regular share URL function in Kibana — will enable you to safely share visualizations and dashboards with people who are not even Logz.io users. 


#7 – Token Filters 

What if you do not want to share all of the data, though? There are a number of reasons you might want to filter it, such as maintaining strict security or not drowning teammates in unnecessary log noise.

Logz.io enables you to narrow the access granted by a token to a specific field type and value.  

This is done on the Settings page, in the User Tokens section. 


#8 – Live Tailing Logs 

Querying data indexed in Elasticsearch from Kibana is great, but sometimes the need arises to ‘tail -f’ your logs to see them as they are being written by the relevant process. For example, say you deployed new code into staging or even production, or you want to reproduce a specific error message.

Logz.io has added a Live Tail feature that allows you to see new messages in real-time as they are being logged without having to leave Kibana, open a terminal, and manually tail all the relevant files.
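For comparison, the manual workflow Live Tail replaces looks something like the snippet below. The file path and log lines are invented, and in a real session the tail would run until interrupted; here it is backgrounded briefly so the example is self-contained:

```shell
# seed a log file to stand in for a real application log
echo "2017-08-01 12:00:01 INFO worker started" > /tmp/app.log

# follow the file in the background, as you would on a server
tail -f /tmp/app.log > /tmp/tailed.txt &
TAIL_PID=$!

# a new line arrives while we are tailing
sleep 1
echo "2017-08-01 12:00:02 ERROR worker crashed" >> /tmp/app.log
sleep 1

kill $TAIL_PID
grep ERROR /tmp/tailed.txt
```

Multiply this by every file on every host involved in an incident and the appeal of tailing from inside Kibana becomes obvious.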


#9 – Exporting Logs 

Not everyone is aware of this option, but on the Discover page in Kibana, you can easily share the log data you are currently analyzing by exporting it as a .CSV file. This option is not Logz.io-specific, but it is worth mentioning all the same.
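Once exported, the file is plain CSV, so it is easy to post-process. The sketch below, using a made-up two-row export, shows loading it with Python's standard csv module:

```python
import csv
import io

# A hypothetical sample of an exported CSV: a header row of field
# names followed by one row per log document.
exported = """@timestamp,level,message
2017-08-01T12:00:01Z,ERROR,connection refused
2017-08-01T12:00:05Z,INFO,retrying
"""

# Parse each row into a dict keyed by the header fields
rows = list(csv.DictReader(io.StringIO(exported)))
errors = [row for row in rows if row["level"] == "ERROR"]
print(len(rows), len(errors))  # → 2 1
```

From here the data can go anywhere a CSV can: a spreadsheet, a report, or a quick one-off script like this one.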


#10 – Custom Data Parsing 

Logz.io provides automatic parsing for the most common log types, but sometimes your logs may not comply with the standard structure. For example, they may include additional fields or fields that vary in format, and application logs in particular will almost certainly be unique in format and structure.

If these examples apply to your log data, you can use our Data Parsing feature to define your own custom parsing method for your logs using an easy-to-use, dedicated wizard. 

The wizard is accessed from the Log Shipping page (Log Shipping → Data Parsing) in Logz.io and includes three steps: defining sample logs for testing the new parsing, selecting and configuring a parsing method, and customizing specific field settings.
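Conceptually, custom parsing boils down to extracting named fields from lines the built-in parsers do not recognize. The sketch below shows the idea in plain Python with a regular expression over an invented application log format; it is illustrative only, not how Logz.io implements parsing:

```python
import re

# A made-up application log format: "date time [LEVEL] component - message"
pattern = re.compile(
    r"(?P<timestamp>\S+ \S+) \[(?P<level>\w+)\] (?P<component>[\w.]+) - (?P<message>.*)"
)

line = "2017-08-01 12:00:01 [WARN] billing.worker - retry limit reached"

# Extract the named fields into a dict, as a parser would
fields = pattern.match(line).groupdict()
print(fields["level"], fields["component"])  # → WARN billing.worker
```

The wizard lets you define and test an equivalent mapping against your own sample logs before it is applied to everything you ship.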


Bonus – Chat Support! 

Last but not least, as a bonus tip: if you have any question whatsoever, Logz.io gives you the option to engage with our amazing Support team by using the chat button in the bottom right corner of the page.

Don’t be shy! 


We are constantly learning from our users about new ways they are using our platform. If you have a tip, best practice or even an idea for a new feature, please share either in the comments below or on the Logz.io community forum.
