How to Migrate | Honeycomb



A self-service migration from Classic Datasets to Environments is available to all teams. When you migrate, data sent to Honeycomb changes destination from your Classic dataset(s) to the new Environment(s).

For Enterprise teams, migration assistance is available. Contact your Honeycomb Success Representative for more details.

Self-Service Migration Steps 

Requirements 

Before proceeding, complete the Migration Preparation checklist to ensure an easier migration experience.

All Secure Tenancy users default to the Honeycomb Classic experience, and Secure Tenancy is not available for Environments and Services.

Overview 

The migration process includes the following tasks:

  1. Create a New Environment
  2. Update instrumentation to Environment and Services-compatible versions
  3. Migrate Refinery (if applicable)
  4. Send your Honeycomb data to an Environment
  5. Recreate your Honeycomb Feature Configurations in your Environment

Post-Migration:

  1. Clean up Configurations
  2. Conclude Classic Usage

Create a New Environment 

First, create a new Environment in Honeycomb.

You must be a team owner to create an Environment. Creating an Environment generates a new API Key by default; afterwards, you can create additional API Keys according to best practices. In a later step, you will use this API Key to update your instrumentation and to tell Honeycomb to send data to this new Environment.

To create a new Environment in Honeycomb:

  1. Select the label below the Honeycomb logo in the left navigation menu to reveal the Environments list. When working within Honeycomb Classic, a Classic label with a gray background appears.

    The environment selector selected and showing the list of Classic and other Environments (screenshot)

  2. Select Manage Environments. The Environments summary appears.

  3. Select Create Environment in the top right corner. A modal appears.

  4. Enter a name (required) for the environment. Optionally, enter a description for the Environment and choose a representative color from the dropdown.

    Note: Environments cannot be renamed, but they can be deleted. Update an Environment’s color and description in Environment Settings.

  5. Select Create Environment and the new Environment will appear in the Environments summary.

  6. Select View API Keys in your new Environment’s row. The Environment’s API Keys appear in a list.

  7. Use the Copy icon to copy the API Key for use in your instrumentation.

An Environment-wide API Key must have send events and create datasets permissions to send events and create new datasets from traces.
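To confirm the copied key works, you can send a single test event with it via Honeycomb's Events API. The following is a minimal sketch using curl; the dataset name my-service is a hypothetical example, and the curl call only runs if HONEYCOMB_API_KEY is set in your shell:

```shell
# Send a one-off test event to confirm the new Environment key works.
# "my-service" is a hypothetical service/dataset name; with an
# Environment key, posting to it creates the dataset automatically.
DATASET="my-service"
EVENTS_URL="https://api.honeycomb.io/1/events/${DATASET}"

# Only fires if HONEYCOMB_API_KEY is set in your shell.
if [ -n "${HONEYCOMB_API_KEY:-}" ]; then
  curl -s -X POST "$EVENTS_URL" \
    -H "X-Honeycomb-Team: ${HONEYCOMB_API_KEY}" \
    -H "Content-Type: application/json" \
    -d '{"message": "migration smoke test", "service.name": "my-service"}'
fi
```

After sending, the event should appear in the new Environment under a my-service dataset.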

Update Instrumentation Version 

At this point in the migration, ensure each tracing instrumentation library is updated to at least the minimum version that supports Environments and Services. We recommend updating to the latest version to enjoy a library’s full benefits and features. Reference the instrumentation version updates list from your migration preparation.

Refinery 

If you use Refinery, first complete the steps in the Refinery Migration section before continuing.

Otherwise, proceed to the next step.

Send Your Honeycomb Data to an Environment 

In your instrumentation, replace each Honeycomb API Key with an API Key from your new Environment. Changing your API Keys causes data to flow into your new Environment and new datasets to be created automatically.

Any reference to the Honeycomb API needs an updated API key.

Trace data is linked to an Environment and is identified implicitly by the API key used. For Service datasets, specifying a Dataset name is no longer required to submit trace data.

Any General dataset still requires a Dataset name to submit data. General datasets include logs and metrics, and should have been identified in your migration preparation.
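For OpenTelemetry-based instrumentation, the distinction looks roughly like the sketch below, using standard OTLP exporter environment variables. The service-metrics dataset name is a hypothetical example; check your own exporter's configuration mechanism:

```shell
# Traces: the Environment API key alone routes data; Service datasets
# are created automatically from service.name.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://api.honeycomb.io"
export OTEL_EXPORTER_OTLP_TRACES_HEADERS="x-honeycomb-team=${HONEYCOMB_API_KEY:-}"

# Metrics (a General dataset): a Dataset name is still required,
# passed here via the x-honeycomb-dataset header.
export OTEL_EXPORTER_OTLP_METRICS_HEADERS="x-honeycomb-team=${HONEYCOMB_API_KEY:-},x-honeycomb-dataset=service-metrics"
```

The key point: trace exporters need only the Environment key, while metrics and log exporters also name their destination dataset.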

After changing your API Keys, Honeycomb should show:

  • data appearing in the new Environment
  • new Service datasets, based on service.name
  • new General datasets, as named in the existing instrumentation
  • your Classic Environment no longer receiving data

Validate New Datasets in Environment 

Reference the Dataset and Services lists from your migration preparation to ensure all expected Datasets are created.

Use Service Map to determine if Services appear as expected in their new Environment.
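One way to cross-check the list is with Honeycomb's Datasets API. Below is a sketch that assumes curl and jq are installed; compare its output against the Dataset list from your migration preparation:

```shell
# List the datasets visible to the given Environment API key, sorted
# by name, for comparison against the migration-preparation checklist.
# Uses Honeycomb's Datasets API; jq is assumed to be installed.
list_datasets() {
  curl -s https://api.honeycomb.io/1/datasets \
    -H "X-Honeycomb-Team: $1" | jq -r '.[].name' | sort
}

# Usage: list_datasets "$HONEYCOMB_API_KEY"
```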

Recreate your Honeycomb Configurations in your Environment 

Recreate the configuration for each Honeycomb feature by using the Honeycomb UI or the appropriate feature’s API.

Reference the Honeycomb Features and Configurations list from your migration preparation.

We recommend recreating each Honeycomb feature in the following order:

  1. Attributes, or fields
  2. Derived Columns
  3. Query Annotations
  4. Boards
  5. Triggers
  6. SLOs
  7. Marker Configurations
  8. Dataset Definitions
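As an illustration of the API route, here is a hedged sketch that recreates a derived column via the Derived Columns API. It assumes the __all__ dataset slug for environment-wide scope, and the alias and expression shown are hypothetical examples; substitute values exported from your Classic configuration:

```shell
# Recreate a derived column in the new Environment via Honeycomb's
# Derived Columns API. "__all__" is assumed here as the slug for
# environment-wide scope; the alias/expression are example values.
create_derived_column() {
  curl -s -X POST "https://api.honeycomb.io/1/derived_columns/__all__" \
    -H "X-Honeycomb-Team: $1" \
    -H "Content-Type: application/json" \
    -d '{"alias": "duration_sec", "expression": "DIV($duration_ms, 1000)", "description": "duration in seconds"}'
}

# Usage: create_derived_column "$HONEYCOMB_API_KEY"
```

Recreating features in the listed order matters because later items (Boards, Triggers, SLOs) can reference earlier ones (fields and Derived Columns).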

After Migration 

Clean up Configuration 

After migration, review your Environment and configurations for any errors.

For new Honeycomb configurations that are service-specific, remove any unnecessary service name references (service.name).

Conclude Classic Usage 

If no longer needed, delete your Classic Environment using Delete Environments in Environment Settings. You must be a team owner to delete an environment.

You may want to keep your Classic Environment if:

  • Your Classic permalinks are still relevant
  • You still have data in Classic that you want to reference

Note that this Classic data will age out based on your retention period. We encourage the deletion of your Classic Environment once finished with it.

Questions and Support 

Join the #discuss-hny-classic channel in our Pollinators Community Slack to ask questions and learn more.

For Pro and Enterprise users, contact Support via email at support@honeycomb.io.

Refinery Migration Guide 

To migrate Refinery from Classic to Environments:

  1. Update Refinery instrumentation to a version that supports Environments
  2. Update Refinery Sampling Rules configuration in rules.toml to support Environments as needed
  3. Update Refinery General Configuration in config.toml to use the new Environment API Key(s) and Environment name(s)
  4. [Optional] Add EnvironmentCacheTTL, an optional Refinery configuration option, to config.toml

Update Refinery Instrumentation Version 

Update your Refinery instrumentation to a version that supports Environments and Services.

We recommend updating to the latest available version, but the minimum required version for Refinery is version 1.12.0.
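To verify a deployed version against that minimum, a small helper using GNU sort's version ordering can be handy (a sketch; adapt it to however you query your running Refinery's version):

```shell
# Returns success if $1 (the deployed Refinery version) is at least
# $2 (the required minimum). Relies on GNU sort's -V version ordering.
version_at_least() {
  [ "$(printf '%s\n%s\n' "$2" "$1" | sort -V | head -n1)" = "$2" ]
}

version_at_least "1.15.0" "1.12.0" && echo "OK to migrate"
```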

Refinery Sampling Rules Configuration 

As needed, update your Refinery Sampling Rules in rules.toml to include Environment Sampling Rules.

Environment names are case-sensitive in your Refinery configuration.

Consider updating if your existing Sampling Rules need to specify sampling based on service name. If a Dataset name matches the new Environment name, an update may not be needed.

Good news! Refinery Sampling Rules for Environments and Classic Datasets with the same name can coexist. Without this, Refinery migration would require running two Refinery clusters, which is not ideal. Use coexisting Sampling Rules to route yet-to-be-migrated data to Honeycomb Classic and to send migrated services to Environments.

To enable Classic Dataset and Environment coexistence, set a DatasetPrefix in the config.toml configuration. When Refinery receives telemetry using a Classic Dataset API key, it uses the DatasetPrefix to resolve rules using the format {prefix}.{dataset}.

  1. Set the dataset prefix (DatasetPrefix) in config.toml. For example:

    DatasetPrefix = "classic"
    
  2. Update your Classic datasets in rules.toml with the DatasetPrefix value. For example, these sampling rules define the Environment “Hello world”, the Environment “production”, and a “production” dataset in Honeycomb Classic. Note that the “production” Honeycomb Classic Dataset is configured as [classic.production].

    # default rules
    Sampler = "DeterministicSampler"
    SampleRate = 1
    
    ['Hello world'] # environment, wrap with '' if name contains white space
    Sampler = "DeterministicSampler"
    SampleRate = 10
    
    [production] # environment
    Sampler = "DeterministicSampler"
    SampleRate = 20
    
    [classic.production] # classic dataset
    Sampler = "DeterministicSampler"
    SampleRate = 30
    

Refinery General Configuration 

With Environments, Refinery can now determine whether a received API key is a Classic Dataset key (32 characters) or an Environment key (22 or 23 characters).
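The key-length check can be sketched in shell as follows (illustrative only; Refinery performs this classification internally):

```shell
# Classify a Honeycomb API key the way Refinery does: Classic Dataset
# keys are 32 characters long; Environment keys are 22 or 23.
classify_key() {
  case ${#1} in
    32)    echo "classic" ;;
    22|23) echo "environment" ;;
    *)     echo "unknown" ;;
  esac
}
```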

For Classic Dataset keys, Refinery follows the pre-existing behavior and uses the Dataset present in the event to reference the sampler definition.

For Environment keys, Refinery uses the API key to call the Honeycomb API and retrieve the Environment name, which it caches. Refinery then uses that value to look up the sampler definition. (Change how long Refinery caches the Environment name with EnvironmentCacheTTL.)
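That lookup amounts to resolving the key through Honeycomb's Auth API. The sketch below assumes curl and jq are available, and that the Environment name is read from the response's environment.name field (an assumption about the response shape; verify against the Auth API reference):

```shell
# Resolve an Environment API key to its Environment name via
# Honeycomb's Auth API. The .environment.name response field is an
# assumption; check the Auth API documentation for the exact shape.
environment_name_for_key() {
  curl -s https://api.honeycomb.io/1/auth \
    -H "X-Honeycomb-Team: $1" | jq -r '.environment.name'
}

# Usage: environment_name_for_key "$HONEYCOMB_API_KEY"
```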

Evaluate and update your Refinery’s General configuration with:

  1. the new Environment API Key(s) for APIKeys if not accepting all API Keys
  2. DatasetPrefix if an Environment and a Classic Dataset use the same name
  3. (optional) EnvironmentCacheTTL if change is needed

EnvironmentCacheTTL 

EnvironmentCacheTTL is an optional Refinery configuration option for Environments users.

When given an Environment API key, Refinery looks up the associated environment through an HTTP call home to Honeycomb. The EnvironmentCacheTTL configuration controls the amount of time that the retrieved environment name is cached. The default is 1 hour (“1h”), and the setting is not eligible for live reload.

To cache for a different length of time than the default 1 hour, set the EnvironmentCacheTTL in config.toml:

EnvironmentCacheTTL = "2h"

Refinery Migration Complete 

At this point, your Refinery instance should have:

  • instrumentation updated to a version that supports Environments and Services
  • Sampling Rules in rules.toml updated as needed
  • General Configuration in config.toml updated with the new Environment API Key(s)

Once your Refinery update is complete, send your Honeycomb data to an Environment.

You may find that the Refinery Rules configuration needs adjustment after data is sent to an Environment. Modify the rules as necessary.
