Installation
Links
Initialization
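A minimal initialization sketch follows; the API key and dataset name are placeholders, not real values:

```python
import libhoney

# Placeholder credentials: substitute your real Team API key and dataset name.
libhoney.init(writekey="YOUR_API_KEY", dataset="example-python")
```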
Initialize the library by passing in your Team API key and the name of the default dataset to which it should send events.

Working With Proxies
Using a proxy requires overriding the default Transmission implementation when initializing libhoney. The format of the proxies map passed in is documented in the requests documentation.

Note that if you override transmission_impl and you also use non-default values for options such as max_concurrent_batches and max_batch_size in libhoney.init, those values must be specified on the new Transmission object, since it replaces the default one.
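A proxy override might look like the following sketch; the proxy URL is a placeholder, and the batching options shown simply repeat the defaults explicitly:

```python
import libhoney
from libhoney.transmission import Transmission

# The proxies dict follows the requests library's format.
# The proxy URL, key, and dataset name below are placeholders.
libhoney.init(
    writekey="YOUR_API_KEY",
    dataset="example-python",
    transmission_impl=Transmission(
        proxies={"https": "https://proxy.example.com:8080"},
        max_concurrent_batches=10,  # restate any non-default options here
        max_batch_size=100,
    ),
)
```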
Further configuration options can be found in the init.py file.
Building And Sending Events
Once initialized, libhoney is ready to send events.
Events go through three phases:
- Creation: ev = libhoney.new_event()
- Adding fields: ev.add_field("key", "value")
- Transmission: upon .send(), the event is dispatched to be sent to Honeycomb.
Events are queued to be transmitted asynchronously, allowing your application to continue without delay.
The queue is limited in size, however, and when creating events faster than they can be sent, overflowed events will be dropped instead of backing up and slowing down your application.
This behavior can be configured at initialization by adjusting block_on_send, as described here.
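The drop-on-overflow behavior can be sketched with a bounded standard-library queue; this is a simulation of the pattern, not libhoney's internals:

```python
import queue

pending = queue.Queue(maxsize=2)  # small bound to force an overflow
dropped = 0

for event in ["ev1", "ev2", "ev3", "ev4"]:
    try:
        pending.put_nowait(event)  # non-blocking, like block_on_send=False
    except queue.Full:
        dropped += 1  # overflowing events are dropped, not backed up

print(pending.qsize(), dropped)  # 2 2
```

With block_on_send enabled instead, the put would block until the queue drains, trading dropped events for application latency.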
In its simplest form, you can add a single field to an event with the libhoney.add_field(k, v) method.
If you add the same key multiple times, only the last value added will be kept.
More complex structures (dicts and objects—things that can be serialized into a JSON object)
can be added to an event with the .add(data) method.
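Putting these together, building and sending an event might look like the following sketch; the field names and values are illustrative, not prescribed:

```python
import libhoney

# Placeholder credentials.
libhoney.init(writekey="YOUR_API_KEY", dataset="example-python")

ev = libhoney.new_event()
ev.add_field("hostname", "app-01")  # single field; the last value wins on repeats
ev.add({"duration_ms": 153.12, "method": "GET"})  # merge a dict of fields
ev.send()
```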
Events can have metadata associated with them that is not sent to Honeycomb.
This metadata is used to identify the event when processing the response.
More detail about metadata is below in the Response section.
Handling Responses
Sending an event is an asynchronous action and avoids blocking by default: .send() enqueues the event to be batched and sent as soon as possible (thus, the return value does not indicate that the event was successfully sent).
You can monitor the queue returned by .responses() to check whether events were successfully received by Honeycomb’s servers.
The responses queue will receive responses for each batch of events sent to the Honeycomb API.
Before sending an event, you have the option to attach metadata to that event.
This metadata is not sent to Honeycomb; instead, it is used to help you match up individual responses with sent events.
When sending an event, libhoney will take the metadata from the event and attach it to the response object for you to consume.
Add metadata by calling .add_metadata({"key": "value"}) on an event.
Responses are represented as dicts with the following keys:
- metadata: the metadata you attached to the event to which this response corresponds
- status_code: the HTTP status code returned by Honeycomb when trying to send the event. 2xx indicates success.
- duration: the number of milliseconds it took to send the event.
- body: the body of the HTTP response from Honeycomb. On failures, this body contains more information about the failure.
- error: populated when the event never results in an HTTP attempt, for example when it is sampled out or dropped because of a queue overflow.
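Matching responses back to events via metadata can be sketched with dicts shaped like the list above; this simulates the queue's contents, and the request IDs and error body are illustrative:

```python
import queue

responses = queue.Queue()

# Response dicts shaped like those described above; values are illustrative.
responses.put({"metadata": {"request_id": "abc-123"}, "status_code": 202,
               "duration": 12.5, "body": "", "error": None})
responses.put({"metadata": {"request_id": "abc-124"}, "status_code": 401,
               "duration": 9.1, "body": "unknown API key", "error": None})
responses.put(None)  # sentinel marking the end of this simulation

accepted, failed = [], []
while True:
    resp = responses.get()
    if resp is None:
        break
    # Match the response back to the original event via its metadata.
    request_id = resp["metadata"]["request_id"]
    if resp["error"] or resp["status_code"] >= 300:
        failed.append((request_id, resp["body"] or resp["error"]))
    else:
        accepted.append(request_id)

print(accepted, failed)  # ['abc-123'] [('abc-124', 'unknown API key')]
```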
What to Send?
Honeycomb events are composed of fields of four basic types: string, int, float, and bool.
You should send any data that will help you provide context to an event in your application, such as user ids, timers, measurements, system info, build ID, and metadata.
For more guidance about instrumenting your application, check our instrumentation introduction guide.
Examples
Simple: Send a Blob of Data
Intermediate: Override Some Attributes
Middleware Examples: Django
Django is a widely-used, high-level Python Web framework. Each inbound HTTP request as received by a framework like Django maps nicely to Honeycomb events, representing “a single thing of interest that happened” in a given system. Django middlewares are simply classes that have access to the request object (and optionally the response object) in the application’s request-response chain. As such, you can define a simple honey_middleware.py file as in the following:
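A sketch of what such a middleware might look like, using Django's current class-based middleware style; the field names, dataset name, and API key are illustrative, not prescribed:

```python
# honey_middleware.py
import time

import libhoney


class HoneyMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response
        # Placeholder credentials; init runs once at startup.
        libhoney.init(writekey="YOUR_API_KEY", dataset="django-requests")

    def __call__(self, request):
        start = time.time()
        response = self.get_response(request)
        # One event per request, capturing basic request/response context.
        ev = libhoney.new_event()
        ev.add({
            "method": request.method,
            "path": request.path,
            "status": response.status_code,
            "duration_ms": (time.time() - start) * 1000,
        })
        ev.send()
        return response
```

Add the class to MIDDLEWARE in your Django settings to activate it.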
See the examples/ directory on GitHub for more sample code demonstrating how to use events, builders, fields, and dynamic fields, specifically in the context of Django middleware.
Advanced Usage: Global Fields
Some fields are interesting to have on every event: the build ID of the application, the application server’s hostname, or any fields you would like. Rather than remembering to create these for each event you create, you can add them globally at any time in your application.

Advanced Usage: Utilizing Builders
Global fields can be useful, but what if you want to propagate common fields that are only relevant to one component of your application? Builders are objects that generate new events and propagate any fields added to them into the events they create. You can also clone builders; the cloned builder will have a copy of all the fields and dynamic fields in the original.

As your application forks down into more and more specific functionality, you can create more detailed builders. The final event creation in the leaves of your application’s tree will have all the data you have added along the way, in addition to the specifics of that event. The global scope is essentially a specialized builder for capturing attributes that are likely useful to all events (for example, hostname or environment). Adding this kind of peripheral and normally unavailable information to every event gives you enormous power to identify patterns that would otherwise be invisible in the context of a single request.

Advanced Usage: Dynamic Fields
The top-level libhoney module and Builders support .add_dynamic_field(func).
Adding a dynamic field to a Builder or top-level libhoney ensures that each time an event is created, the provided function is executed and the returned value is added to the event.
The key is the __name__ attribute of the function.
This may be useful for including dynamic process information such as memory used, number of threads, concurrent requests, and so on to each event.
Adding this kind of dynamic data to an event makes it easy to understand the application’s context when looking at an individual event or error condition.
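The keying behavior can be sketched without the SDK; the following simulates the mechanism (it is not libhoney's actual implementation):

```python
import threading

dynamic_fields = []


def add_dynamic_field(func):
    dynamic_fields.append(func)


def new_event_fields():
    # Re-evaluate every dynamic field at event-creation time,
    # keyed by each function's __name__.
    return {func.__name__: func() for func in dynamic_fields}


def num_threads():
    return threading.active_count()


add_dynamic_field(num_threads)
fields = new_event_fields()
print(fields)  # e.g. {'num_threads': 1}
```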
Advanced Usage: Responses Queue
If you would like to view the API response for events you have sent, or need to verify receipt of an event, you can use the responses queue to do so. Set block_on_response=True when initializing the SDK with libhoney.init(): this prevents dropped responses, but will also prevent new batches from being sent if the queue is not being read from.
To continuously process the responses queue, you can run your processing function in another thread.
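The pattern can be sketched with the standard library; here a plain queue.Queue stands in for the queue returned by .responses(), and None is used as a stop sentinel (an assumption of this simulation):

```python
import queue
import threading

# Stand-in for the queue returned by libhoney.responses().
responses = queue.Queue()
seen = []


def read_responses():
    while True:
        resp = responses.get()
        if resp is None:  # sentinel: stop processing
            break
        seen.append(resp["status_code"])


# Daemon thread so the reader never blocks application shutdown.
worker = threading.Thread(target=read_responses, daemon=True)
worker.start()

responses.put({"status_code": 202, "metadata": None})
responses.put(None)
worker.join()
print(seen)  # [202]
```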
Customizing Event Transmission
By default, events are sent to the Honeycomb API. It is possible to override the default transmission implementation by specifying transmission_impl to init.
A couple of alternative implementations ship with the SDK.
- FileTransmission: writes events to a file handle, defaulting to stderr
- TornadoTransmission: sends events using Tornado’s AsyncHTTPClient rather than using a thread pool
Flushing Events
The Python SDK has no mechanism for sending an individual event immediately; events are enqueued and sent asynchronously in batches. You can, however, trigger a flush of the send queue. There are two ways to do this. .flush() instructs the transmission to send all of its events and blocks until all events in the queue have been sent (or an attempt has been made; see the responses queue if you need to verify success).
.close() flushes all events, but prevents transmission of new events.
This is best called as part of your application’s shutdown logic to ensure that all events are sent before the program terminates.
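Assuming the module-level flush() and close() helpers act on the default client, shutdown handling might look like this sketch (credentials are placeholders):

```python
import atexit

import libhoney

libhoney.init(writekey="YOUR_API_KEY", dataset="example-python")

# ... application code that builds and sends events ...

libhoney.flush()                  # block until queued events are sent (or attempted)
atexit.register(libhoney.close)   # flush and shut down transmission on exit
```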
Troubleshooting
Refer to Common Issues with Sending Data in Honeycomb.

Contributions
Features, bug fixes, and other changes to libhoney are gladly accepted.
Please open issues or a pull request with your change via GitHub.
Remember to add your name to the CONTRIBUTORS file!
All contributions will be released under the Apache License 2.0.