Our connector pulls your PostgreSQL logs into Honeycomb for analysis, so you can finally get a quick handle on the database queries triggered by your application logic. It surfaces attributes like the normalized query and its duration.
Honeycomb is unique in its ability to calculate metrics and statistics on the fly, while retaining the full-resolution log lines (and the original query that started it all!).
Note: This document is for folks running PostgreSQL directly. If you’re running PostgreSQL on RDS, check out our RDS connector page to set up your RDS instance instead.
The agent you’ll use to translate logs to events and send them to Honeycomb is called honeytail.
Before running honeytail, you’ll want to turn slow query logging on for all queries if possible. To turn on slow query logging, edit your postgresql.conf and set:
log_min_duration_statement = 0
log_statement='none'
Note: log_statement indicates which types of queries are logged, but it is superseded when log_min_duration_statement is set to 0, as this effectively logs all queries. Setting log_statement to any other value will change the format of the query logs in a way that isn’t currently supported by the Honeycomb PostgreSQL parser.
Alternatively, you can set this from the psql shell by running:
ALTER SYSTEM SET log_min_duration_statement=0;
ALTER SYSTEM SET log_statement='none';
SELECT pg_reload_conf();
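To confirm the new values took effect after pg_reload_conf(), you can check them from the same psql shell using standard PostgreSQL commands:
SHOW log_min_duration_statement;  -- expect 0
SHOW log_statement;               -- expect none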
Finally, take note of the value of the log_line_prefix config line. It’ll look something like this:
log_line_prefix = '%t [%p-%l] %q%u@%d '
On your PostgreSQL host, download and install the latest honeytail by running:
# Download and install the AMD64 debian package
wget -q https://honeycomb.io/download/honeytail/v1.3.0/honeytail_1.3.0_amd64.deb && \
echo '7962e1d5c751e8cb3d465d90a0336582edbc14cb19ce6afc869dde349e4cbbfd honeytail_1.3.0_amd64.deb' | sha256sum -c && \
sudo dpkg -i honeytail_1.3.0_amd64.deb
# Download and install the ARM64 debian package
wget -q https://honeycomb.io/download/honeytail/v1.3.0/honeytail_1.3.0_arm64.deb && \
echo '481c1b385c2df2e3e23e224776b1585fcca5afbe604d0e6ca4d1b68b2bcc973e honeytail_1.3.0_arm64.deb' | sha256sum -c && \
sudo dpkg -i honeytail_1.3.0_arm64.deb
# Download and install the rpm package
wget -q https://honeycomb.io/download/honeytail/v1.3.0/honeytail-1.3.0-1.x86_64.rpm && \
echo 'da6eb828654fefdd2d3059df243adc270546d94fac7823b497eaa89c3bbd4106 honeytail-1.3.0-1.x86_64.rpm' | sha256sum -c && \
sudo rpm -i honeytail-1.3.0-1.x86_64.rpm
# Download the Linux amd64 binary
wget -q -O honeytail https://honeycomb.io/download/honeytail/v1.3.0/honeytail-linux-amd64 && \
echo '96e043fbc6350923f3c6db3e978e8471020f27c44a8bfad0cb0f6ac8cfb0d877 honeytail' | sha256sum -c && \
chmod 755 ./honeytail
# Download the Linux arm64 binary
wget -q -O honeytail https://honeycomb.io/download/honeytail/v1.3.0/honeytail-linux-arm64 && \
echo 'f56b9d77a997c1b891c6bd4a1213fb203ffefc8319cca4460644cc0ae2983728 honeytail' | sha256sum -c && \
chmod 755 ./honeytail
# Download the macOS (darwin) amd64 binary
wget -q -O honeytail https://honeycomb.io/download/honeytail/v1.3.0/honeytail-darwin-amd64 && \
echo '30a4f34939122d5b685261a79c18a3b2b3dc50b8cbe431acc523fa43eccadfd1 honeytail' | shasum -a 256 -c && \
chmod 755 ./honeytail
# Build from latest source after setting up go
git clone https://github.com/honeycombio/honeytail
cd honeytail; go install
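If you build from source with a standard Go toolchain, go install places the resulting binary in your Go bin directory; a quick way to confirm where it landed (exact path depends on your GOPATH/GOBIN setup):
# Locate the freshly built binary
ls "$(go env GOPATH)/bin/honeytail"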
The packages install honeytail, its config file /etc/honeytail/honeytail.conf, and some start scripts. Build honeytail from source if you need it in an unpackaged form or for ad-hoc use.
Make sure you’ve enabled query logging before running honeytail.
To consume the current slow query log from the beginning, run:
honeytail \
--writekey=YOUR_API_KEY \
--dataset=postgres-queries --parser=postgresql \
--postgresql.log_line_prefix=YOUR_LOG_LINE_PREFIX \
--file=/var/log/postgresql/postgresql-9.5-main.log \
--tail.read_from=beginning
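For example, with the log_line_prefix shown earlier, the invocation might look like the sketch below; note the single quotes around the prefix (it contains spaces and shell-sensitive characters), and substitute your own API key and log path:
honeytail \
--writekey=YOUR_API_KEY \
--dataset=postgres-queries --parser=postgresql \
--postgresql.log_line_prefix='%t [%p-%l] %q%u@%d ' \
--file=/var/log/postgresql/postgresql-9.5-main.log \
--tail.read_from=beginning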
First, check out honeytail Troubleshooting for general debugging tips.
No data is being sent, and --debug doesn’t seem to show anything useful
Take a look at the --file being handed to honeytail and make sure it contains PostgreSQL query statements. An example excerpt from a PostgreSQL log file might look like:
2017-11-10 23:24:01 UTC [1998-1] LOG: autovacuum launcher started
2017-11-10 23:24:01 UTC [2000-1] [unknown]@[unknown] LOG: incomplete startup packet
2017-11-10 23:24:02 UTC [2003-1] postgres@postgres LOG: duration: 4.356 ms statement: SELECT d.datname as "Name",
pg_catalog.pg_get_userbyid(d.datdba) as "Owner",
pg_catalog.pg_encoding_to_char(d.encoding) as "Encoding",
d.datcollate as "Collate",
d.datctype as "Ctype",
pg_catalog.array_to_string(d.datacl, E'\n') AS "Access privileges"
FROM pg_catalog.pg_database d
ORDER BY 1;
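A quick sanity check is to confirm that the file you pass to --file actually contains statement lines like the ones above (the path shown is the one from the earlier example; adjust it to match yours):
# Count logged statements; 0 suggests slow query logging is not enabled
grep -c 'duration:' /var/log/postgresql/postgresql-9.5-main.log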
Also check that the value you’re passing in the --postgresql.log_line_prefix flag matches PostgreSQL’s configured value, which you can find by running SHOW log_line_prefix at a psql prompt:
# SHOW log_line_prefix;
log_line_prefix
---------------------
%t [%p-%l] %q%u@%d
If your log file looks like a normal PostgreSQL output log but honeytail is still failing to send events to Honeycomb, let us know! We’re available to help anytime via email or chat.
To run honeytail continuously as a daemon process, first modify the config file /etc/honeytail/honeytail.conf, uncommenting and setting the options below (a sketch of an edited config follows this list):
- ParserName to postgresql
- WriteKey to your API key, available from the account page
- LogFiles to the path for your PostgreSQL log file
- Dataset to the name of the dataset you wish to create with this log file
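As a rough sketch (the exact layout, comments, and section headers in your installed /etc/honeytail/honeytail.conf may differ), the uncommented lines might end up looking like this, reusing the example log path and dataset name from above:
ParserName = postgresql
WriteKey = YOUR_API_KEY
LogFiles = /var/log/postgresql/postgresql-9.5-main.log
Dataset = postgres-queries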
Then start honeytail using upstart or systemd, or run it directly against the config file:
# With upstart:
$ sudo initctl start honeytail
# With systemd:
$ sudo systemctl start honeytail
# Or run honeytail directly against the config file:
$ honeytail -c /etc/honeytail/honeytail.conf
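On a systemd host, you can verify the service started and follow its output with the usual systemd tooling (assuming the packaged unit logs to the journal):
sudo systemctl status honeytail
sudo journalctl -u honeytail -f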
You may have archived logs that you’d like to import into Honeycomb. If you have a log file located at /var/log/postgresql/postgresql-main.log, you can backfill using this command:
honeytail \
--writekey=YOUR_API_KEY \
--dataset=PostgreSQL \
--parser=postgresql \
--file=/var/log/postgresql/postgresql-main.log \
--postgresql.log_line_prefix=YOUR_CONFIGURED_LOG_LINE_PREFIX \
--backfill
This command can be used at any point to backfill from archived log files. You can read more about honeytail’s backfill behavior here.
Note: honeytail does not unzip log files, so you’ll need to decompress them yourself before backfilling.
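For example, if your rotated logs are gzip-compressed (a common logrotate setup, and an assumption here; the filename is illustrative), you might decompress and backfill one like this:
# Decompress the rotated log, then backfill it
gunzip postgresql-main.log.1.gz
honeytail \
--writekey=YOUR_API_KEY \
--dataset=PostgreSQL \
--parser=postgresql \
--file=./postgresql-main.log.1 \
--postgresql.log_line_prefix=YOUR_CONFIGURED_LOG_LINE_PREFIX \
--backfill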
Once you’ve finished backfilling your old logs, we recommend transitioning to the default streaming behavior to stream live logs to Honeycomb.
While we believe strongly in the value of being able to track down the precise query causing a problem, we understand the concerns of exporting log data which may contain sensitive user information.
With that in mind, we recommend using honeytail’s PostgreSQL parser, but adding a --scrub_field=query flag to hash the concrete query value. The normalized_query attribute will still be representative of the shape of the query, and identifying patterns (including specific queries) will still be possible, but the sensitive information will be completely obscured before leaving your servers.
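Concretely, that means adding the flag to whichever invocation you use. For the streaming example above, it might look like:
honeytail \
--writekey=YOUR_API_KEY \
--dataset=postgres-queries --parser=postgresql \
--postgresql.log_line_prefix=YOUR_LOG_LINE_PREFIX \
--file=/var/log/postgresql/postgresql-9.5-main.log \
--scrub_field=query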
More information about dropping or scrubbing sensitive fields can be found here.
Honeytail and our installers are all open source, Apache 2.0 licensed. Their source can be found on GitHub: