Stream logs to a destination

DataStream 2 lets you stream logs to third-party destinations for storage and analysis. Currently available destinations include Amazon S3, Azure Storage, Datadog, Elasticsearch, Google Cloud Storage, Loggly, New Relic, Oracle Cloud, Splunk, Sumo Logic, and custom HTTPS endpoints.

The destinations DataStream 2 supports offer different features, including custom headers (useful for authenticating and labeling incoming logs), IP Access List (for filtering requests by IP or using Akamaized hostnames as endpoints), and mTLS authentication (for improved security).
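
For instance, if you stream to a custom HTTPS endpoint, a custom header can serve as a shared-secret check on incoming uploads. Below is a minimal sketch of such a receiver; the header name `X-Ingest-Token`, the port, and the `reqPath` field are illustrative assumptions rather than DataStream 2 specifics, and a real endpoint would terminate TLS:

```python
# Minimal sketch of a receiver for a custom HTTPS endpoint destination.
# Assumptions (not from the DataStream 2 docs): the stream is configured
# to send a custom header named X-Ingest-Token, and the POST body carries
# one JSON log record per line.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

EXPECTED_TOKEN = "replace-with-your-shared-secret"  # hypothetical value

class LogReceiver(BaseHTTPRequestHandler):
    def do_POST(self):
        # Reject uploads that lack the custom header configured on the stream.
        if self.headers.get("X-Ingest-Token") != EXPECTED_TOKEN:
            self.send_response(401)
            self.end_headers()
            return
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        for line in body.decode("utf-8").splitlines():
            if line.strip():
                record = json.loads(line)          # one JSON record per line
                print(record.get("reqPath"))       # hypothetical field name
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    # A production endpoint must serve HTTPS; plain HTTP here for brevity.
    HTTPServer(("0.0.0.0", 8443), LogReceiver).serve_forever()
```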

All destinations support the JSON log file format, but Elasticsearch and New Relic do not support structured log files. If you're looking for sample JSON and structured logs with data, see Log format.
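
To illustrate the difference between the two formats, here's a small sketch that reads the same record in both shapes. The field names, values, and delimiter are made up for this example; see Log format for the actual field order and samples:

```python
import json

# Illustrative only: field names and ordering are hypothetical.
json_line = '{"reqHost": "www.example.com", "statusCode": 200, "turnAroundTimeMSec": 12}'
record = json.loads(json_line)
print(record["statusCode"])  # -> 200

# A structured log line carries the same values as delimited positional fields,
# so the consumer must know the field order in advance.
structured_line = "www.example.com 200 12"
host, status, turnaround = structured_line.split(" ")
print(int(status))  # -> 200
```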

Before you choose a destination for your logs, consult the table below for the features each destination supports:

| Destination | JSON logs | Structured logs | Custom header | IP Access List | Custom log file prefix & suffix | mTLS authentication | Dynamic variables |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Amazon S3 | ✓ | ✓ | | | ✓ | | |
| Azure Storage | ✓ | ✓ | | | ✓ | | |
| custom HTTPS endpoint | ✓ | ✓ | | | | | |
| Datadog | ✓ | ✓ | | | | | |
| Elasticsearch | ✓ | ✗ | | | | | |
| Google Cloud Storage | ✓ | ✓ | | | ✓ | | |
| Loggly | ✓ | ✓ | | | | | |
| New Relic | ✓ | ✗ | | | | | |
| Oracle Cloud | ✓ | ✓ | | | ✓ | | |
| Splunk | ✓ | ✓ | | | | | |
| Sumo Logic | ✓ | ✓ | | | | | |

Set your stream to push log data to a destination of your choice. You can also specify the names for the pushed files and how often to deliver them.

🚧

Third-party destination issues

If a third-party destination experiences issues such as latency or connection problems, data is lost after three failed connection attempts within 90 seconds. You can set up alerts to receive email notifications about upload failures before the data is lost.

Take these limitations into account before using data served on your stream for audit, compliance, or billing purposes.
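
To make the exposure concrete, here's a rough back-of-the-envelope sketch, not an official loss model: with logs bundled every 30 or 60 seconds and failed uploads retried for up to 90 seconds, the data at risk during a destination outage is roughly the logs collected across one push window plus the retry window.

```python
# Rough, unofficial estimate of how many seconds of logs are at risk
# when a destination is unreachable. Assumes one bundle per push window
# and a 90-second retry period before a failed upload is dropped;
# actual DataStream 2 behavior may differ.
def at_risk_seconds(push_frequency_sec: int, retry_window_sec: int = 90) -> int:
    return push_frequency_sec + retry_window_sec

for freq in (30, 60):
    print(f"push every {freq}s -> up to ~{at_risk_seconds(freq)}s of logs at risk")
```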

How to

  1. In the Delivery tab, select the Destination type where you want to send and store your data.

  2. Provide details for your destination. The fields to fill in differ depending on the destination type you select.

  3. In Delivery options, configure the delivery settings:

    • In Filename, enter a custom prefix and suffix for the name of the log file uploaded to the destination; the system generates the rest of the name for you (see the sketch after these steps). This feature is available only for object-based destinations, such as Amazon S3, Azure Storage, Oracle Cloud Storage, and Google Cloud Storage.

    • In Push frequency, specify a time window for collecting log lines, either 30 or 60 seconds. After this time, the system bundles logs from each uploader and delivers them in a file to your destination.

  4. Click Next to continue to the Summary tab.
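
For the custom prefix and suffix in step 3, the uploaded object name is your prefix and suffix wrapped around a system-generated middle part. The sketch below composes such a name; the generated segment here is entirely made up, since DataStream 2 produces its own:

```python
import time
import uuid

# Hypothetical illustration of how a custom prefix and suffix wrap the
# system-generated part of an uploaded log file name. The middle segment
# below is a stand-in, not the real DataStream 2 naming scheme.
def object_name(prefix: str, suffix: str) -> str:
    generated = f"ds2-{int(time.time())}-{uuid.uuid4().hex[:8]}"  # stand-in
    return f"{prefix}{generated}{suffix}"

print(object_name("weblogs-", ".gz"))
# e.g. weblogs-ds2-1700000000-1a2b3c4d.gz
```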

Next steps

You're almost done. You can now review the information you provided and activate your stream. See Review and activate a stream.