Stream logs to a destination

In DataStream 2, you can stream logs to third-party destinations for storage, analytics, and enhanced control over your data.

Currently available destinations include Amazon S3, Azure Storage, Datadog, Dynatrace, Elasticsearch, Google Cloud Storage, Loggly, New Relic, Oracle Cloud, Splunk, Sumo Logic, TrafficPeak, S3-compatible destinations, and custom HTTPS endpoints.

Destination features

When choosing a destination to stream logs to, you can configure features that differ between the available options. These include custom headers (for authenticating and labeling incoming logs), IP Access List (for filtering requests by IP or using Akamaized hostnames as endpoints), dynamic variables (for sorting and labeling logs), and mTLS authentication (for improved security).
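
For example, when streaming to a custom HTTPS endpoint, custom headers can carry an authentication token and a label for the incoming logs. The header names and values below are purely illustrative:

    Authorization: Bearer <your-token>
    X-Log-Label: datastream2-web-traffic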

While all destinations support the JSON log file format, Elasticsearch and New Relic do not support structured log files. For sample JSON and structured logs with data, see Log format.
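
For a rough idea of the difference, a single log record with a small, illustrative subset of fields could look like this in each format (see Log format for authoritative samples):

    JSON:
    {"streamId": "1234", "reqTimeSec": "1606224432", "cliIP": "192.0.2.1", "reqHost": "www.example.com", "reqMethod": "GET", "reqPath": "/index.html", "statusCode": "200", "bytes": "3451"}

    Structured (delimited, fixed field order):
    1234 1606224432 192.0.2.1 www.example.com GET /index.html 200 3451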

Before choosing a destination for your logs, consult the table below for the features available for each:

Destination | JSON logs | Structured logs | Custom header | IP Access List | Custom log file prefix & suffix | mTLS authentication | Dynamic variables
Amazon S3
Azure Storage
custom HTTPS endpoint
Datadog
Dynatrace
Elasticsearch
Google Cloud Storage
Loggly
New Relic
Oracle Cloud
S3-compatible destinations
Splunk
Sumo Logic
TrafficPeak

📘 Third-party integration

DataStream 2 is optimized for high-volume raw data delivery, and we recommend using log data for basic traffic analytics and monitoring CDN health.

In case of latency or connection problems with third-party destinations, DataStream 2 attempts to deliver log data in up to 10 retries over 5 minutes. For details, see the Delivery retry feature. To prevent data loss, you can create backup streams or set up alerts.

You should take these limitations into account before using the data delivered by your stream for audit, compliance, or billing purposes.

How to

Set your stream to push log data to a destination of your choice. You can also specify the names for the pushed files and how often to deliver them:

  1. In the Delivery tab, select the Destination where you want to send and store your data.

  2. Enter the details for your destination, such as the Display name and endpoint URL. The fields you need to fill in differ between destinations.

  3. In Delivery options, configure the delivery settings (a combined example follows these steps):

    • In Filename, enter a custom prefix and suffix for the name of the log file uploaded to the destination.
      For object-based destinations such as Amazon S3, Azure Storage, Oracle Cloud Storage, Google Cloud Storage, and S3-compatible endpoints, you can use Dynamic variables.

    • In Push frequency, specify how often DataStream should bundle and push logs to your destination, either every 30 or 60 seconds.

  4. Click Next to continue to the Summary tab.
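
Putting these steps together, a completed delivery setup might look like this hypothetical example; the destination, names, and values are made up for illustration only:

    Destination:      Amazon S3            (step 1)
    Display name:     web-logs-to-s3       (step 2)
    Filename prefix:  ds2-weblogs          (step 3, Filename)
    Filename suffix:  prod
    Push frequency:   every 30 seconds     (step 3, Push frequency)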

Next steps

You're almost done. You can now review the information you entered and activate your stream. See Review and activate a stream.