Honeycomb AWS Integrations can be set up with HashiCorp Terraform as your preferred Infrastructure-as-Code (IaC) and deployment method.
Honeycomb AWS Integrations use AWS CloudWatch, Amazon Kinesis, and AWS Lambda to send data to Honeycomb.
Refer to How AWS Integrations Work to see which AWS services use which methods.
Note that standard AWS charges apply.
Refer to Amazon for specifics on associated egress costs.
Honeycomb provides a Terraform module that automates configuring various AWS services to send data to Honeycomb.
Each integration is an independently deployable submodule; the top-level module deploys them all together, with the ability to turn individual integrations on and off as needed.
Supported Terraform Integrations:
- CloudWatch Logs
- CloudWatch Metrics
- RDS Logs
- Amazon S3 Bucket Logs
Implement all of the Terraform submodules at once, or choose among the available Terraform submodules.
To configure for all supported Terraform integrations:

1. Add the minimal Terraform configuration, which includes the required fields for all supported Terraform integrations:

   ```hcl
   module "honeycomb-aws-integrations" {
     source = "honeycombio/integrations/aws"

     # aws cloudwatch logs integration
     cloudwatch_log_groups = [module.log_group.cloudwatch_log_group_name] // CloudWatch Log Group names to stream to Honeycomb.

     # aws rds logs integration
     enable_rds_logs  = true
     rds_db_name      = var.db_name
     rds_db_engine    = "mysql"
     rds_db_log_types = ["slowquery"] // valid types include general, slowquery, error, and audit (audit will be unstructured)

     # aws metrics integration - pro/enterprise Honeycomb teams only
     # enable_cloudwatch_metrics = true

     # s3 logfile - alb access logs
     s3_bucket_arn  = var.s3_bucket_arn
     s3_parser_type = "alb" // valid types are alb, elb, cloudfront, vpc-flow-log, s3-access, json, and keyval

     # honeycomb
     honeycomb_api_key = var.honeycomb_api_key // Honeycomb API key.
     honeycomb_dataset = "terraform-aws-integrations-test" // Your Honeycomb dataset name that will receive the logs.

     # Users generally do not need to set this, but it may be necessary when working with a proxy like Honeycomb's Refinery.
     honeycomb_api_host = var.honeycomb_api_host
   }
   ```
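The module block above references several root-level input variables (`var.honeycomb_api_key`, `var.db_name`, and so on) that the integration module does not define for you. A minimal, hypothetical `variables.tf` to accompany the example might look like:

```hcl
variable "honeycomb_api_key" {
  type      = string
  sensitive = true // keeps the key out of plan output
}

variable "honeycomb_api_host" {
  type    = string
  default = "https://api.honeycomb.io" // override when sending through a proxy such as Refinery
}

variable "db_name" {
  type        = string
  description = "Name of the RDS database whose logs will be streamed."
}

variable "s3_bucket_arn" {
  type        = string
  description = "Full ARN of the S3 bucket holding the access logs."
}
```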
2. Set the `TF_VAR_honeycomb_api_key` environment variable to your team's Honeycomb API key. Note that `TF_VAR_` environment variable names are case-sensitive and must match the Terraform variable name:

   ```shell
   export TF_VAR_honeycomb_api_key=$HONEYCOMB_API_KEY
   ```
3. Set the environment variables with your AWS credentials:

   ```shell
   export AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID
   export AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY
   export AWS_DEFAULT_REGION=$AWS_DEFAULT_REGION
   ```

   For more details and options, visit the Terraform documentation.
4. Run `terraform plan`, and then `terraform apply`.
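Put together, the steps above follow the standard Terraform workflow (assuming the Terraform CLI is installed and your AWS credentials are in place):

```shell
terraform init   # download the honeycombio/integrations/aws module and providers
terraform plan   # preview the AWS resources that will be created
terraform apply  # create the resources
```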
For more configuration options, refer to USAGE.md.

To configure for CloudWatch Logs:
1. Add the minimal Terraform configuration, which includes the required fields:

   ```hcl
   module "honeycomb-aws-cloudwatch-logs-integration" {
     source = "honeycombio/integrations/aws//modules/cloudwatch-logs"

     name = var.cloudwatch_logs_integration_name // A name for the integration.

     # aws cloudwatch integration
     cloudwatch_log_groups = ["/aws/lambda/S3LambdaHandler-test"] // CloudWatch Log Group names to stream to Honeycomb.
     s3_failure_bucket_arn = var.s3_bucket_arn // S3 bucket ARN that will store any logs that failed to be sent to Honeycomb.

     # honeycomb
     honeycomb_api_key      = var.HONEYCOMB_API_KEY // Honeycomb API key.
     honeycomb_dataset_name = "cloudwatch-logs" // Your Honeycomb dataset name that will receive the logs.
   }
   ```
2. Set the `TF_VAR_HONEYCOMB_API_KEY` environment variable to your team's Honeycomb API key:

   ```shell
   export TF_VAR_HONEYCOMB_API_KEY=$HONEYCOMB_API_KEY
   ```

3. Set the environment variables with your AWS credentials:

   ```shell
   export AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID
   export AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY
   export AWS_DEFAULT_REGION=$AWS_DEFAULT_REGION
   ```

   For more details and options, visit the Terraform documentation.

4. Run `terraform plan`, and then `terraform apply`.

For more configuration options, refer to USAGE.md.

To configure for CloudWatch Metrics:
1. Add the minimal Terraform configuration, which includes the required fields:

   ```hcl
   module "honeycomb-aws-cloudwatch-metrics-integration" {
     source = "honeycombio/integrations/aws//modules/cloudwatch-metrics"

     name = var.cloudwatch_metrics_integration_name // A name for the integration.

     honeycomb_api_key      = var.HONEYCOMB_API_KEY // Honeycomb API key.
     honeycomb_dataset_name = "cloudwatch-metrics" // Your Honeycomb dataset name that will receive the metrics.
     s3_failure_bucket_arn  = var.s3_bucket_arn // An S3 bucket that will store any metrics that failed to be sent to Honeycomb.
   }
   ```
2. Set the `TF_VAR_HONEYCOMB_API_KEY` environment variable to your team's Honeycomb API key:

   ```shell
   export TF_VAR_HONEYCOMB_API_KEY=$HONEYCOMB_API_KEY
   ```

3. Set the environment variables with your AWS credentials:

   ```shell
   export AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID
   export AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY
   export AWS_DEFAULT_REGION=$AWS_DEFAULT_REGION
   ```

   For more details and options, visit the Terraform documentation.

4. Run `terraform plan`, and then `terraform apply`.

For more configuration options, refer to USAGE.md.

To configure for RDS Logs:
1. Add the minimal Terraform configuration, which includes the required fields:

   ```hcl
   module "honeycomb-aws-rds-logs-integration" {
     source = "honeycombio/integrations/aws//modules/rds-logs"

     name         = "rds-logs-integration"
     db_engine    = "mysql"
     db_name      = "mysql-db-name"
     db_log_types = ["slowquery"]

     honeycomb_api_key      = var.honeycomb_api_key // Your Honeycomb team's API key.
     honeycomb_dataset_name = "rds-mysql-logs"

     s3_failure_bucket_arn = var.s3_bucket_arn // The full ARN of the bucket storing Kinesis Firehose failure logs.
   }
   ```
2. Set the `TF_VAR_honeycomb_api_key` environment variable to your team's Honeycomb API key:

   ```shell
   export TF_VAR_honeycomb_api_key=$HONEYCOMB_API_KEY
   ```

3. Set the environment variables with your AWS credentials:

   ```shell
   export AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID
   export AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY
   export AWS_DEFAULT_REGION=$AWS_DEFAULT_REGION
   ```

   For more details and options, visit the Terraform documentation.

4. Run `terraform plan`, and then `terraform apply`.

For more configuration options, refer to USAGE.md.

To configure for Amazon S3 Logs from a bucket:
1. Add the minimal Terraform configuration, which includes the required fields:

   ```hcl
   module "logs_from_a_bucket_integrations" {
     source = "honeycombio/integrations/aws//modules/s3-logfile"

     name        = var.logs_integration_name
     parser_type = var.parser_type // valid types are alb, elb, cloudfront, vpc-flow-log, s3-access, json, and keyval

     s3_bucket_arn = var.s3_bucket_arn // The full ARN of the bucket storing the logs.

     honeycomb_api_key      = var.honeycomb_api_key // Your Honeycomb team's API key.
     honeycomb_dataset_name = "alb-logs" // Your Honeycomb dataset name that will receive the logs.
   }
   ```
2. Set the `TF_VAR_honeycomb_api_key` environment variable to your team's Honeycomb API key:

   ```shell
   export TF_VAR_honeycomb_api_key=$HONEYCOMB_API_KEY
   ```

3. Set the environment variables with your AWS credentials:

   ```shell
   export AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID
   export AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY
   export AWS_DEFAULT_REGION=$AWS_DEFAULT_REGION
   ```

   For more details and options, visit the Terraform documentation.

4. Run `terraform plan`, and then `terraform apply`.

For more configuration options, refer to USAGE.md.
Troubleshooting
Kinesis Firehose
Kinesis Firehose sends failed data to the configured S3 bucket.
Depending on the stage of the flow when the error occurred, the error log ends up in a different subdirectory.
For example, if Honeycomb's HTTP endpoint returns a non-2xx status code, the error log will be located in the `http-endpoint-failed` directory, along with the exact status code and error message.
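To inspect these failure records, you can browse the configured bucket with the AWS CLI. A sketch, using a placeholder bucket name:

```shell
# List failure subdirectories (such as http-endpoint-failed/) and their objects
aws s3 ls s3://my-firehose-failure-bucket/ --recursive
```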
RDS
A prerequisite for using both the RDS Logs Terraform module and the RDS Logs CloudFormation stack is enabling RDS log exports to CloudWatch.
Refer to Publishing database logs to Amazon CloudWatch for instructions.
Once enabled, the module can collect logs from those CloudWatch log groups and begin streaming them to Honeycomb.
When enabling log exports for RDS PostgreSQL, ensure the `log_statement` parameter in the parameter group is not set to `all`.
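Log exports can be enabled from the RDS console or with the AWS CLI. A sketch for a MySQL instance, using a placeholder instance identifier:

```shell
# Start exporting slow query logs to CloudWatch Logs
aws rds modify-db-instance \
  --db-instance-identifier my-db-instance \
  --cloudwatch-logs-export-configuration '{"EnableLogTypes":["slowquery"]}'
```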
Currently, structured logs are supported for the following:
- MySQL general logs
- MySQL slow query logs
- MySQL error logs
- PostgreSQL slow query logs
For any RDS log types not listed above, the integration will still deliver them to Honeycomb, but as unstructured logs.