In DataStream 2, you can stream logs to third-party destinations for storage, analytics, and enhanced control over your data.

Currently available destinations include Amazon S3, Azure Storage, Datadog, Elasticsearch, Google Cloud Storage, Loggly, New Relic, Oracle Cloud, Splunk, Sumo Logic, S3-compatible destinations, and custom HTTPS endpoints.

Destination features

When choosing a destination to stream logs to, you can configure features that differ between the available options. These include custom headers (for authenticating and labeling incoming logs), an IP access list (to filter requests by IP, or to use Akamaized hostnames as endpoints), dynamic variables (to sort and label logs), and mTLS authentication (for improved security).
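For a custom HTTPS endpoint, a custom header is a simple way to reject traffic that didn't come from your stream. Below is a minimal sketch of a receiving endpoint that validates such a header; the header name, token value, and port are hypothetical placeholders, not DataStream defaults:

```python
# Minimal sketch of a custom HTTPS endpoint that validates a custom
# header on incoming log uploads. Header name and token are hypothetical
# placeholders for whatever you configure on the stream.
from http.server import BaseHTTPRequestHandler, HTTPServer

EXPECTED_HEADER = "X-Stream-Token"  # hypothetical header configured on the stream
EXPECTED_VALUE = "s3cr3t"           # hypothetical shared secret

def authorized(headers):
    """Accept the upload only if the custom header carries the expected value."""
    return headers.get(EXPECTED_HEADER) == EXPECTED_VALUE

class LogHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if not authorized(self.headers):
            self.send_response(401)      # reject uploads without the header
            self.end_headers()
            return
        length = int(self.headers.get("Content-Length", 0))
        batch = self.rfile.read(length)  # one batch of log records
        # ... persist `batch` to storage here ...
        self.send_response(200)          # acknowledge so the upload succeeds
        self.end_headers()

# To run locally (behind TLS termination in practice):
# HTTPServer(("", 8443), LogHandler).serve_forever()
```

In production this endpoint would sit behind TLS, since DataStream delivers over HTTPS.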

While all destinations support the JSON log file format, Elasticsearch and New Relic do not support structured log files. For sample JSON and structured logs with data, see Log format.
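The practical difference between the two formats is how your tooling reads each record. A short sketch, using illustrative field names and a space delimiter rather than the full DataStream 2 schema:

```python
import json

# One record from a JSON log file: each line is a self-describing JSON object.
# Field names are illustrative, not the full DataStream 2 schema.
json_line = '{"cp": "12345", "statusCode": "200", "bytes": "1024"}'
record = json.loads(json_line)

# The same record in a structured (delimited) file: fields arrive in a fixed
# order, so you map them to names by position.
structured_line = "12345 200 1024"
fields = ["cp", "statusCode", "bytes"]
structured_record = dict(zip(fields, structured_line.split()))

assert record == structured_record
```

JSON records stay readable if fields are added or reordered; structured records are more compact but depend on you knowing the field order.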

Before choosing a destination for your logs, you can consult the table below:

| Destination | JSON logs | Structured logs |
| --- | --- | --- |
| Amazon S3 | ✓ | ✓ |
| Azure Storage | ✓ | ✓ |
| custom HTTPS endpoint | ✓ | ✓ |
| Datadog | ✓ | ✓ |
| Elasticsearch | ✓ | – |
| Google Cloud Storage | ✓ | ✓ |
| Loggly | ✓ | ✓ |
| New Relic | ✓ | – |
| Oracle Cloud | ✓ | ✓ |
| S3-compatible destinations | ✓ | ✓ |
| Splunk | ✓ | ✓ |
| Sumo Logic | ✓ | ✓ |


Third-party destination issues

If a third-party destination has issues, such as latency or connection problems, data is lost after three failed connection retries within 90 seconds.
You can set up alerts to get email notifications about upload failures before the data is lost.

You should take these limitations into account before relying on data served on your stream for audit, compliance, or billing purposes.
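Alongside email alerts for upload failures, you can add a freshness check on the receiving side. A minimal sketch, assuming uploads land in a local file; the `is_stale` helper and the threshold are hypothetical, not part of DataStream:

```python
# Hypothetical receiving-side freshness check: if the landing file has not
# been modified recently, deliveries may be failing upstream. The threshold
# is an assumption, not a DataStream value.
import os
import time

MAX_AGE_SECONDS = 300  # hypothetical alerting threshold

def is_stale(path, now=None, max_age=MAX_AGE_SECONDS):
    """Return True if `path` was last modified more than `max_age` seconds ago."""
    now = time.time() if now is None else now
    return (now - os.stat(path).st_mtime) > max_age
```

A monitoring job could call `is_stale` periodically and page someone before the 90-second retry window silently drops data.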