Stream logs to Datadog

DataStream 2 supports sending logs to Datadog. Datadog is a cloud-based monitoring and analytics solution that allows you to see inside applications within your stack and aggregate the results.

DataStream 2 uploads logs to Datadog HTTPS endpoints. Depending on your configuration, it can stream compressed or uncompressed log files.

For security reasons, DataStream 2 sends logs over TLS even if Datadog policy allows insecure requests.
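
For illustration, here is a minimal Python sketch, not DataStream itself, of what delivering a log batch to the Datadog v1 HTTP intake can look like with and without gzip compression. The endpoint, API key, and log records are placeholders, and the sketch assumes the intake accepts a gzip Content-Encoding for compressed payloads.

```python
# Minimal sketch: deliver a batch of log records to the Datadog v1 intake,
# optionally gzip-compressed. All values below are placeholders.
import gzip
import json
import urllib.request

DD_ENDPOINT = "https://http-intake.logs.datadoghq.com/v1/input"  # or the .eu host
DD_API_KEY = "YOUR_DATADOG_API_KEY"  # placeholder

def send_logs(records, compress=True):
    body = json.dumps(records).encode("utf-8")
    headers = {"Content-Type": "application/json", "DD-API-KEY": DD_API_KEY}
    if compress:
        body = gzip.compress(body)            # smaller payload on the wire
        headers["Content-Encoding"] = "gzip"  # tell the intake the body is gzipped
    req = urllib.request.Request(DD_ENDPOINT, data=body, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:  # HTTPS, so the transfer is TLS-protected
        return resp.status

# Example: two log lines in JSON form
print(send_logs([{"message": "edge request served"}, {"message": "edge request denied"}]))
```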

Before you begin

To use Datadog as a destination for your logs, you need to:

  • Register a Datadog account. The location where you register your Datadog account, either the United States (US) or the European Union (EU), affects the commands and endpoints you use when configuring Datadog as a destination in a stream configuration. See the Datadog site for details.

  • Generate a Datadog API key dedicated to a stream. See API keys for Datadog. For a quick way to confirm that the key is valid for your Datadog site, see the sketch after this list.

  • Gather static custom tags that you want to send together with the log streams: tags, source, and service. See Logs over HTTP and Tagging in Datadog.

  • Identify the HTTPS endpoint in a hosting region, such as US1, US3, or EU. See Logs in Datadog.
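
Before creating the stream, you can optionally confirm that the API key is valid for the Datadog site you registered in. A minimal Python sketch using Datadog's key validation endpoint follows; the key and site values are placeholders.

```python
# Minimal sketch: check that an API key is valid for the Datadog site
# (US or EU) the account was registered in. Values are placeholders.
import json
import urllib.request

dd_site = "datadoghq.com"            # use "datadoghq.eu" for an EU-registered account
dd_api_key = "YOUR_DATADOG_API_KEY"  # placeholder

req = urllib.request.Request(
    "https://api." + dd_site + "/api/v1/validate",
    headers={"DD-API-KEY": dd_api_key},
)
# An invalid key raises urllib.error.HTTPError (403).
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))  # expected: {"valid": true} when the key matches the site
```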

How to

  1. In Destination, select Datadog.

  2. In Name, enter a human-readable description for the endpoint.

  3. In Endpoint, enter the Datadog v1 endpoint URL where you want to send and store logs. Currently, DataStream does not support Datadog v2 endpoints.

    Examples: http-intake.logs.datadoghq.com/v1/input or http-intake.logs.datadoghq.eu/v1/input.

  4. Optional: In Tags, enter a comma-delimited list of tags that you use to filter and group your metrics in your Datadog account. This field also supports <key>:<value> combinations, for example, type:datastream2 for a single tag, or env:staging,type:datastream2 for multiple tags. Make sure to avoid spaces between key-value tag pairs.

  5. Optional: In Source, enter the name of the source that the logs originate from, associated with your Datadog account.

    The system sets Akamai as the default source of logs.

  6. Optional: In Service, enter the name of the application or service generating the log events that is associated with your Datadog account.

    See the Services list in Datadog.

  7. In API key, enter the API key associated with your Datadog account.

  8. If you want to send gzip-compressed logs to this destination, check Send compressed data.

  9. Click Validate & Save to validate the connection to the destination and save the details you provided.

    As part of this validation process, the system pushes a sample POST request to the endpoint to validate write access. In the log file, the data appears in the validate connector test format. You can see the data only if the destination validates successfully and you can access the destination storage. For a comparable test request that you can send yourself, see the sketch after these steps.
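
The following is a minimal Python sketch of a test request you can send yourself to confirm the endpoint, tags, source, service, and API key entered above. It is not the exact request DataStream sends; all values are placeholders, and it assumes the v1 intake accepts ddtags, ddsource, and service as query parameters.

```python
# Minimal sketch: send one test log line to the configured Datadog v1 intake
# using the values from the steps above. All values are placeholders.
import json
import urllib.parse
import urllib.request

endpoint = "https://http-intake.logs.datadoghq.com/v1/input"  # step 3 value
api_key = "YOUR_DATADOG_API_KEY"                              # step 7 value
params = {
    "ddtags": "env:staging,type:datastream2",  # step 4: comma-delimited key:value tags, no spaces
    "ddsource": "akamai",                      # step 5: source of the logs
    "service": "datastream2-logs",             # step 6: service generating the events
}
url = endpoint + "?" + urllib.parse.urlencode(params)
body = json.dumps([{"message": "datastream2 destination test"}]).encode("utf-8")
req = urllib.request.Request(
    url,
    data=body,
    headers={"Content-Type": "application/json", "DD-API-KEY": api_key},
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # 200 means the intake accepted the payload
```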

Akamaized hostname as endpoint

This destination supports using Akamaized hostnames as endpoints to send DataStream 2 logs for improved security. When you create a property with a Datadog endpoint URL as hostname, this property acts as a proxy between the destination and DataStream. As a result, you can filter incoming traffic to your destination endpoint by IP addresses using the Origin IP Access List behavior. That means only IP addresses that belong to your Akamaized property hostname can send logs to your custom destination. Using Akamaized hostnames as endpoints also requires enabling the Allow POST behavior in your property.

Once the property hostname works as a destination endpoint, you cannot monitor it as a property in this or another stream. If you already monitor a property in DataStream, you cannot use it as a destination endpoint.

To enable this feature:

  1. Go to Property Manager and create a new property. We recommend choosing API Acceleration as the product. See Create a brand new property.

  2. Set your Datadog endpoint URL as the property hostname. See Redirect users to edge servers.

  3. Go to ☰ > CDN > Properties or just enter Properties in the search box.

    The Property Groups page opens.

  4. Click the Property Name link to go to the property you created.

  5. Activate the property on the production network. Only properties active on the production network can serve as DataStream destinations. See Activate property on production.

  6. On the Property Details page, click the Version of your configuration that you want to access in Manage Versions and Activations.

    The Property Manager Editor appears.

  7. In the default rule, click Add Behavior, and select Origin IP Access List. Click Insert Behavior.

    The Origin IP Access List behavior appears in the default rule.

  8. Set the Enable slider in the Origin IP Access Control List behavior to On. Click Save.

  9. Click Add Behavior, and select Allow POST.

  10. Click Insert Behavior.

    The Allow POST behavior appears in the default rule.

  11. Set the Behavior option in the Allow POST behavior to Allow.

  12. Click Save.

📘 Tip

You might need to additionally configure your property to ensure uninterrupted data flow. See Configuration best practices in the Property Manager guide for other behaviors you can configure in your property.

  13. Configure the firewall settings at your destination endpoint to allow access for IP addresses that belong to CIDR blocks for your Akamaized hostname. See the Origin IP Access List behavior for the list of IP addresses to put on the allow list.
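
As an illustration of the allow-list logic (your firewall has its own configuration syntax), here is a minimal Python sketch with placeholder CIDR blocks: only requests from the blocks tied to your Akamaized hostname should reach the destination endpoint.

```python
# Minimal sketch: allow only client IPs that fall inside the permitted CIDR blocks.
# The blocks below are placeholders; use the ranges published for the
# Origin IP Access List behavior.
import ipaddress

ALLOWED_BLOCKS = [
    ipaddress.ip_network("192.0.2.0/24"),     # placeholder range
    ipaddress.ip_network("198.51.100.0/24"),  # placeholder range
]

def is_allowed(client_ip: str) -> bool:
    ip = ipaddress.ip_address(client_ip)
    return any(ip in block for block in ALLOWED_BLOCKS)

print(is_allowed("192.0.2.17"))   # True: inside an allowed block
print(is_allowed("203.0.113.5"))  # False: blocked
```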

After successfully configuring an Akamaized hostname as the destination endpoint, avoid editing an active property’s setup in Property Manager to ensure uninterrupted data flow. Adding, deleting, and editing hostnames and behaviors may cause unexpected behavior in the DataStream application.

We recommend setting up alerts that send e-mail notifications every time DataStream logs cannot be uploaded to your destination, so you can immediately troubleshoot issues with your property or destination configuration. See Set up alerts.