Thanks to Logstash’s flexible plugin architecture, you can send a copy of all the traffic that Logstash is processing to Honeycomb. This topic explains how to use Logstash plugins to convert incoming log data into events and then send them to Honeycomb.
Data Format Requirements
To turn the log data coming into Logstash into Honeycomb events, you can use Logstash filter plugins. These filter plugins transform the data into top-level keys based on the original source of the data. We have found these to be especially useful:

- `grok` matches regular expressions and has configs for many common patterns (such as the apache, nginx, or haproxy log formats).
- `json` parses JSON-encoded strings and breaks them up into individual fields.
- `kv` matches `key=value` patterns and breaks them out into individual fields.
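For instance, a minimal filter sketch using the `json` plugin, assuming the JSON-encoded string arrives in the `message` field:

```
filter {
  json {
    # Parse the JSON string in `message`; each top-level JSON key
    # becomes a top-level field on the event.
    source => "message"
  }
}
```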
Example: Using Logstash Filter Plugins to Process Haproxy Logs for Honeycomb Ingestion
Let us say you are sending haproxy logs (in HTTP mode) to Logstash. A log line describing an individual request looks something like this (borrowed from the haproxy config manual):

Logstash delivers the raw log line in the `message` field, so in the `filter` parameter of the logstash.yaml config fragment below, we use the grok filter plugin and tell it to parse the message and make all the content available in top-level fields.
And, since we do not need it anymore, we tell grok to remove the message field.
The mutate filter plugin takes the numeric fields extracted from the haproxy log line and turns them into integers so that Honeycomb can do math on them later.
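Putting those two steps together, a filter sketch might look like the following. This assumes Logstash's built-in `HAPROXYHTTP` grok pattern; the field names passed to `mutate` depend on the pattern's capture names, so adjust them to match your grok output.

```
filter {
  # Parse the HTTP-mode haproxy log line into top-level fields,
  # then drop the now-redundant raw message.
  grok {
    match        => { "message" => "%{HAPROXYHTTP}" }
    remove_field => ["message"]
  }
  # Convert numeric fields to integers so Honeycomb can do math on them.
  mutate {
    convert => {
      "time_duration" => "integer"
      "bytes_read"    => "integer"
    }
  }
}
```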
Sending Data to Honeycomb
Now that all the fields in the `message` field are nicely extracted into events, send them on to Honeycomb!
To send events, configure an output plugin.
You can use Logstash’s HTTP output plugin to craft HTTP requests to the Honeycomb API.
This configuration example sends the data to a dataset called “logstash.”
- Use `filter` to nest the Logstash JSON fields under a `data` element in the JSON payload to Honeycomb. This filter is required for Honeycomb to ingest your Logstash logs. Learn more about `filter` in the Elastic documentation.
- Specify a URL (`url`) to send the data to:
  - for our US instance: `https://api.honeycomb.io/1/batch/<dataset_name>`
  - for our EU instance: `https://api.eu1.honeycomb.io/1/batch/<dataset_name>`
- Add your Honeycomb API key to `"X-Honeycomb-Team"` so that Logstash is authorized to send data to Honeycomb.
- Specify the output format as JSON batch (`json_batch`).
- Specify the use of HTTP compression (`http_compression => true`).
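Combining the settings above, a configuration sketch might look like the following. The dataset name (`logstash`), the `ruby` filter used to nest fields under `data`, and `YOUR_API_KEY` are placeholders for this example; substitute your own values.

```
filter {
  # Nest all event fields under a top-level `data` element,
  # as the Honeycomb batch endpoint expects (one possible approach).
  ruby {
    code => "event.set('data', event.to_hash)"
  }
}

output {
  http {
    # US instance; use api.eu1.honeycomb.io for the EU instance.
    url              => "https://api.honeycomb.io/1/batch/logstash"
    http_method      => "post"
    headers          => { "X-Honeycomb-Team" => "YOUR_API_KEY" }
    format           => "json_batch"
    http_compression => true
  }
}
```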
Set Event Timestamps
In Logstash, each event has a special `@timestamp` field.
In general, use the date filter plugin to extract the time attribute from log lines.
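As a sketch, assuming timestamps arrive in a hypothetical `time` field in ISO8601 format, a date filter could look like:

```
filter {
  date {
    # Parse `time` and use it to set the event's @timestamp,
    # then drop the original field.
    match        => ["time", "ISO8601"]
    remove_field => ["time"]
  }
}
```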
For example, if you have a JSON log line containing timestamps in the format: