Configuring Pipelines

What Is a Pipeline?

In MetaRouter, Pipelines are how you:

  • Segregate your event sources from one another
  • Apply event transformations according to the correct integration settings
  • Ensure events are ultimately sent to the correct integration accounts

Each pipeline consists of one source, represented by that pipeline's writeKey. All events that include a given writeKey are included in the same pipeline and sent to the integrations attached to that pipeline.
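The writeKey-to-pipeline relationship can be pictured as a simple lookup. The sketch below is illustrative only; the pipeline and integration names are invented, and it does not reflect MetaRouter's internal implementation:

```javascript
// Illustrative sketch only: pipeline and integration names here are made up,
// and this is not MetaRouter's internal routing code.
const pipelines = {
  prod_web: { integrations: ["Google Analytics", "Amplitude"] },
  dev_web: { integrations: ["Amplitude"] },
};

// Every event carries a writeKey; it is delivered to each integration
// attached to the matching pipeline, or dropped if no pipeline matches.
function route(event) {
  const pipeline = pipelines[event.writeKey];
  return pipeline ? pipeline.integrations : [];
}
```

Because the writeKey is the only link between an event and its pipeline, sending the wrong writeKey from a source silently routes events to the wrong set of integrations.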

Building Pipelines

To build a pipeline, you must ensure that it:

  • Have a writeKey and at least one properly configured Integration
  • Be assigned to a Cluster

Deploying Pipelines

Once a pipeline includes a writeKey, Cluster, and properly configured Integration, it can be deployed. The pipeline health indicator shows whether your pipeline is ready to be deployed or needs additional attention first.

  • Good: The pipeline and its integrations are correctly configured
  • Resolve: The pipeline is connected to integrations with missing pipeline variables

Upon deployment, you will either receive a success message, or an error message with additional detail. Potential errors during a pipeline deployment include:

  • Required parameter without a value
  • Selected Integration is not available on the Cluster
  • Unable to connect to the Cluster (incorrect URL or secret)

Keep in mind that a successful pipeline deployment does not guarantee proper integration functionality. In order to determine whether your integrations are healthy, check your event logs (if possible) or reach out to the MetaRouter team for troubleshooting tips.

Configuring Multiple Pipelines

Any organization using the MetaRouter platform is likely to configure multiple pipelines. Separate pipelines ensure that data produced by one source is not mixed with data from a completely different source, which may use a different data schema or require different integrations. Let's use an imaginary site, Bob's Fly Fishing website, as an example.

Bob is fairly technically savvy and knows he'd like to test changes to his website before they're applied in a production environment where his customers can see them. He has a development environment, a staging environment, and a production environment. With MetaRouter, he can have each environment send events to a different analytics property, so breaking changes are caught before they reach production. In this example, he would name his pipelines:

  • dev_website
  • staging_website
  • prod_website

Bob also has an iOS app where his users can buy fishing gear. Mobile events are generally siloed into their own properties and contain unique interaction information, so he wants to keep these events separate from his website's events. In addition, he uses an analytics tool dedicated to mobile apps. He sets up the following pipelines:

  • dev_ios
  • staging_ios
  • prod_ios

Bob now has six pipelines and knows exactly where each event source's data is being sent. In addition, by viewing each pipeline, he can easily see which integrations are configured for his mobile and web properties. While installing the Analytics.js source code, Bob just needs to make sure he is using the correct writeKey in each implementation. Once that is done, his events will be sent to the proper pipelines.
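In practice, each environment's install differs only in the writeKey it passes to the standard Analytics.js load call. A minimal sketch, assuming Bob's hypothetical pipeline names from above (the `writeKeyFor` helper is invented for illustration):

```javascript
// Hypothetical mapping from Bob's environments to his website pipelines.
const WRITE_KEYS = {
  development: "dev_website",
  staging: "staging_website",
  production: "prod_website",
};

// Illustrative helper: look up the writeKey for the current environment,
// failing loudly if no pipeline is configured for it.
function writeKeyFor(env) {
  const key = WRITE_KEYS[env];
  if (!key) throw new Error(`No pipeline configured for environment: ${env}`);
  return key;
}

// Each environment's snippet would then pass the selected key to the
// standard Analytics.js load call, e.g.:
//   analytics.load(writeKeyFor("production"));
```

Centralizing the environment-to-writeKey mapping in one place makes it harder to accidentally ship a development writeKey to production.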

Pipeline Variables

If you only have a few environments where you're producing events, you can easily edit each integration configuration one-by-one in a short amount of time. However, some users have many different properties that utilize similar tracking schemas and integrations. This is where Pipeline Variables are extremely helpful.

Let's revisit Bob's scenario. Bob's website is bought by a large publisher, The Fishing Times, which also owns dozens of fishing-related websites. The Fishing Times also wants to use MetaRouter, and is primarily concerned with consolidating the tracking ecosystem across their websites. Each website uses the same data schema, produces the same events, and sends events to the same analytics tool. However, each website needs to send data to its own property within the analytics tool. This is accomplished by entering each property's unique API key into the tool's MetaRouter mapping.

Instead of creating a new integration for each property, The Fishing Times can designate the API key field as a Pipeline Variable. Once the integration is saved, The Fishing Times can view a list of each pipeline the integration is applied to, and paste in the correct API key without ever needing to create another integration for the same tool.
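Conceptually, a Pipeline Variable leaves a placeholder in the shared integration configuration, and each pipeline supplies its own value. The sketch below illustrates the idea only; the placeholder syntax, field names, and keys are all invented:

```javascript
// Illustrative only: the placeholder syntax and names are invented,
// not MetaRouter's actual configuration format.
const integrationTemplate = { name: "analytics-tool", apiKey: "{{API_KEY}}" };

// Each pipeline provides its own value for the shared variable.
const pipelineVariables = {
  prod_site_a: { API_KEY: "key-for-site-a" },
  prod_site_b: { API_KEY: "key-for-site-b" },
};

// Fill the template's placeholders with the values for a given pipeline,
// leaving any unmatched placeholder untouched.
function resolve(template, writeKey) {
  const vars = pipelineVariables[writeKey] || {};
  return Object.fromEntries(
    Object.entries(template).map(([field, value]) => [
      field,
      typeof value === "string"
        ? value.replace(/\{\{(\w+)\}\}/g, (match, name) => vars[name] ?? match)
        : value,
    ])
  );
}
```

The payoff is that one integration definition serves every pipeline; only the per-pipeline variable values differ.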

writeKey Naming Best Practices

As stated earlier, your writeKey is a unique key that represents your pipeline and is how that pipeline is referenced when configuring your source code. It is important that your writeKey is easily understood across your organization, so others do not become confused when maintaining or making changes to your MetaRouter implementation. Two questions that should be quickly answered by the writeKey name:

  • What source are the events being generated from?
  • Which environment (production, staging, development, etc.) does the source belong to?

Common examples of correctly named writeKeys include:

  • prod_web
  • staging_ios
  • dev_java

Your organization may require additional writeKey best practices if you implement many sources. For more help on writeKey best practices regarding your specific situation, please contact your account team.