Introducing On-Demand Logging with Logz.io Drop Filters


Logs need to be stored, in some cases for a long period of time. Whether you run your own infrastructure or use a cloud-based solution, this means that at some stage you'll get a worried email from your CFO or CPO asking you to take a close look at your logging architecture. That, in turn, will push you to limit some data pipelines and maybe even shut off others entirely. Maybe we don't need those debug logs after all, right?

Wrong. 

Logs, just like any other commodity, change in value. Sure, some logs might not be important 100% of the time. But the last thing you need when troubleshooting a production issue is to discover that the single message holding a critical piece of the puzzle is unavailable. Compromising on what data to log because of cost puts a dent in your system's observability.

That’s why we’re happy to announce the availability of a new feature called Drop Filters that allows you to ship all the logs you want in a cost-efficient manner. 

We call this On-Demand Logging.

Ingesting ≠ indexing

As the name implies, Drop Filters allows you to define which logs to "drop". This means you can decide which specific logs you don't want Logz.io to store and index.

You can still keep the log shipping pipelines up and running. The logs simply won't be stored, which means they won't count against your overall logging quota and you won't be charged for them.

If you've got archiving set up, the logs will continue to be stored in your Amazon S3 bucket and can be ingested into Logz.io when necessary, so you're not compromising on your system's observability.
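For example, if your archive lives in S3, you can quickly confirm that dropped logs are still being written to the bucket. Here's a minimal sketch using boto3; the bucket name and prefix are hypothetical and depend on how you configured archiving:

```python
import boto3

# Hypothetical bucket and prefix; use whatever you configured for Logz.io archiving.
BUCKET = "my-logzio-archive"
PREFIX = "logzio/2019/"

s3 = boto3.client("s3")

# List a few archived log objects to confirm they are still being written,
# even for log lines that a drop filter keeps out of the Logz.io index.
response = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX, MaxKeys=10)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```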

Ship it all but don’t pay for it all!

Granular & dynamic filtering

You can decide exactly which logs to drop using a new page in the UI. Open it by clicking the cogwheel in the top-right corner and selecting Tools → Drop filter.

[Screenshot: the Drop Filters page]

To begin, simply click the + Add drop filter button:

[Screenshot: adding a drop filter and selecting fields]

As a first step, choose whether to filter a specific log type or all of your logs. Then, select the field and the corresponding value to filter by.

In the example below, I’m asking Logz.io to drop Apache access logs with a 200 response:

[Screenshot: a drop filter for Apache access logs with a 200 response]
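To make the matching concrete, here's a rough model of the rule's logic. This is purely an illustration, not Logz.io's implementation: a filter on the apache_access log type with the field response and the value 200 would match (and drop) logs like the first one below, while everything else is indexed as usual:

```python
# Illustrative only: a rough model of how a drop filter rule matches logs.
# This is not Logz.io's implementation.
drop_filter = {"log_type": "apache_access", "field": "response", "value": 200}

def is_dropped(log: dict) -> bool:
    """Return True if the log matches the drop filter and would not be indexed."""
    return (
        log.get("type") == drop_filter["log_type"]
        and log.get(drop_filter["field"]) == drop_filter["value"]
    )

# A successful request: matches the filter, so it is dropped (not indexed).
ok = {"type": "apache_access", "response": 200, "request": "/index.html"}
# A server error: does not match, so it is indexed and searchable as usual.
err = {"type": "apache_access", "response": 500, "request": "/checkout"}

print(is_dropped(ok))   # True
print(is_dropped(err))  # False
```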

That’s all there is to it. Select the confirmation checkbox and hit the Apply the filter button to create the filter: 

[Screenshot: confirming and applying the filter]

Logz.io will immediately stop indexing logs that match the filtering rule you set up. You can toggle the rule on and off as required (this is the on-demand part of On-Demand Logging) using the control button displayed on the rule, or delete it completely. You can create up to 10 drop filters.
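If you prefer automation over the UI, drop filters can also be managed programmatically via the Logz.io API. The sketch below uses Python's requests library; note that the exact endpoint paths, payload shape, and response fields shown here are assumptions on my part, so verify them against the Logz.io API documentation before relying on them:

```python
import requests

API_TOKEN = "<your-logzio-api-token>"             # account API token
BASE_URL = "https://api.logz.io/v1/drop-filters"  # assumed endpoint; verify in the docs
HEADERS = {"X-API-TOKEN": API_TOKEN, "Content-Type": "application/json"}

# Create a filter that drops Apache access logs with a 200 response.
# The payload shape (logType / fieldConditions) is an assumption; verify it.
payload = {
    "logType": "apache_access",
    "fieldConditions": [{"fieldName": "response", "value": 200}],
}
created = requests.post(BASE_URL, headers=HEADERS, json=payload).json()
filter_id = created["id"]  # assumed response field

# Toggle the rule off and back on: On-Demand Logging from a script.
requests.post(f"{BASE_URL}/{filter_id}/deactivate", headers=HEADERS)
requests.post(f"{BASE_URL}/{filter_id}/activate", headers=HEADERS)
```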

Dropping logs with Drop Filters does not change your log shipping pipelines. However, keep in mind that since dropped logs are not stored by Logz.io, they cannot be searched or used to trigger alerts.

Log without limits

A lot of pain in the world of log management stems from the ever-increasing amount of operational noise created by logs. This noise poses an analytics challenge — how does one sift through millions of log messages a day — but also a very real cost challenge. Data storage can cost organizations millions a year. 

Logz.io invests a lot of time and resources into helping our users overcome these two challenges. Insights™ was developed to surface issues hiding within the data and cut troubleshooting time. In addition, a series of cost optimization features, such as Data Optimizer™ and Volume Analysis, were developed to help build cost-efficient and optimized data pipelines. Drop Filters complements these features by allowing you to log without limits.

Drop Filters is available in both our Pro and Enterprise plans.