Stream logs to a destination
In DataStream 2, you can stream logs to third-party destinations for storage, analytics, and enhanced control over your data.
Currently available destinations include Amazon S3, Azure Storage, Datadog, Dynatrace, Elasticsearch, Google Cloud Storage, Loggly, New Relic, Oracle Cloud, Splunk, Sumo Logic, S3-compatible destinations, TrafficPeak, and custom HTTPS endpoints.
Destination features
When choosing a destination to stream logs to, you can configure features that differ between the available options. These include custom headers (for authenticating and labeling incoming logs), IP Access List (for filtering requests by IP or using Akamaized hostnames as endpoints), dynamic variables (for sorting and labeling logs), and mTLS authentication (for improved security).
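For example, if you stream to a custom HTTPS endpoint and use a custom header for authentication, the receiver can reject uploads that don't carry the agreed value. Below is a minimal Python sketch of such a receiver; the header name, token, and port are hypothetical placeholders, and in practice you'd terminate TLS in front of it, since DataStream requires HTTPS endpoints:

```python
# Minimal sketch of a log receiver that validates a custom header
# before accepting an upload. Header name and token are hypothetical;
# configure the same pair in your stream's custom header settings.
from http.server import BaseHTTPRequestHandler, HTTPServer

EXPECTED_HEADER = "X-Stream-Auth"        # hypothetical custom header name
EXPECTED_TOKEN = "replace-with-your-secret"

class LogReceiver(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.headers.get(EXPECTED_HEADER) != EXPECTED_TOKEN:
            # Reject uploads that lack the agreed custom header.
            self.send_response(401)
            self.end_headers()
            return
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        print(f"Received {len(body)} bytes of log data")
        self.send_response(200)  # a 2xx tells the sender the upload succeeded
        self.end_headers()

if __name__ == "__main__":
    # For illustration only: a real deployment would sit behind TLS.
    HTTPServer(("0.0.0.0", 8443), LogReceiver).serve_forever()
```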
While all destinations support the JSON log format, Dynatrace, Elasticsearch, New Relic, and TrafficPeak don't support structured log files. For sample JSON and structured logs with data, see Log format.
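To illustrate the difference between the two formats, here's a minimal sketch of consuming both; the field names, ordering, and delimiter are illustrative stand-ins rather than the actual schema (see Log format for that):

```python
# Sketch: JSON logs parse directly into dicts, while structured logs
# are delimited text whose field order you must know in advance.
# All field names and values below are illustrative.
import json

json_line = '{"reqHost": "example.com", "statusCode": "200", "bytes": "1234"}'
record = json.loads(json_line)
print(record["reqHost"], record["statusCode"])

structured_line = "example.com 200 1234"   # positional, delimiter-separated
req_host, status_code, byte_count = structured_line.split()
print(req_host, status_code, byte_count)
```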
Before choosing a destination for your logs, consult the table below for the features each one supports:
Destination | JSON logs | Structured logs | Custom header | IP Access List | Custom log file prefix & suffix | mTLS authentication | Dynamic variables |
---|---|---|---|---|---|---|---|
Amazon S3 | ✓ | ✓ | ✗ | ✗ | ✓ | ✗ | ✓ |
Azure Storage | ✓ | ✓ | ✓ | ✗ | ✓ | ✗ | ✓ |
custom HTTPS endpoint | ✓ | ✓ | ✓ | ✓ | ✗ | ✓ | ✗ |
Datadog | ✓ | ✓ | ✗ | ✓ | ✗ | ✗ | ✗ |
Dynatrace | ✓ | ✗ | ✓ | ✗ | ✗ | ✗ | ✗ |
Elasticsearch | ✓ | ✗ | ✓ | ✓ | ✗ | ✗ | ✗ |
Google Cloud Storage | ✓ | ✓ | ✓ | ✗ | ✗ | ✗ | ✓ |
Loggly | ✓ | ✓ | ✓ | ✓ | ✗ | ✗ | ✗ |
New Relic | ✓ | ✗ | ✓ | ✓ | ✗ | ✗ | ✗ |
Oracle Cloud | ✓ | ✓ | ✗ | ✗ | ✓ | ✗ | ✓ |
S3-compatible destinations | ✓ | ✓ | ✗ | ✗ | ✓ | ✗ | ✓ |
Splunk | ✓ | ✓ | ✓ | ✓ | ✗ | ✓ | ✗ |
Sumo Logic | ✓ | ✓ | ✓ | ✓ | ✗ | ✗ | ✗ |
TrafficPeak | ✓ | ✗ | ✓ | ✗ | ✗ | ✗ | ✗ |
Third-party destination issues
Streaming logs to third-party destinations may temporarily fail due to issues such as latency or connection problems. To prevent data loss during upload failures, DataStream retries the upload up to 10 times within 5 minutes when the destination returns a 429 or 5XX error code. For details, see Delivery retry.
Take these limitations into account before using the data delivered by your stream for audit, compliance, and billing purposes.
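As a rough illustration of those retry semantics (not the service's actual implementation, whose retry intervals are internal), the following sketch retries an upload only on 429 and 5XX responses, capped at 10 attempts within a 5-minute window:

```python
# Sketch of the documented retry policy: retry on 429/5XX only,
# at most 10 attempts within 5 minutes; other statuses end the cycle.
import time

MAX_ATTEMPTS = 10
WINDOW_SECONDS = 300  # 5 minutes

def upload_with_retries(send):
    """`send` posts one log batch and returns an HTTP status code."""
    deadline = time.monotonic() + WINDOW_SECONDS
    for attempt in range(1, MAX_ATTEMPTS + 1):
        status = send()
        if status == 429 or 500 <= status < 600:
            if attempt == MAX_ATTEMPTS or time.monotonic() >= deadline:
                return False  # retries exhausted; this batch is lost
            time.sleep(min(2 ** attempt, 30))  # hypothetical backoff
            continue
        return 200 <= status < 300  # non-retryable statuses end the cycle

# Example: a flaky endpoint that succeeds on the third try.
responses = iter([503, 429, 200])
print(upload_with_retries(lambda: next(responses)))  # True
```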
How to
Set your stream to push log data to a destination of your choice. You can also specify the names for the pushed files and how often to deliver them:
1. In the Delivery tab, select the Destination where you want to send and store your data.
2. Enter the details for your destination, such as the Display name and endpoint URL. The fields you need to fill in differ between destinations:
   - For Amazon S3, see Stream logs to Amazon S3.
   - For Azure Storage, see Stream logs to Azure Storage.
   - For Datadog, see Stream logs to Datadog.
   - For Dynatrace, see Stream logs to Dynatrace.
   - For Elasticsearch, see Stream logs to Elasticsearch.
   - For Google Cloud Storage, see Stream logs to Google Cloud Storage.
   - For Loggly, see Stream logs to Loggly.
   - For New Relic, see Stream logs to New Relic.
   - For Oracle Cloud, see Stream logs to Oracle Cloud.
   - For S3-compatible destinations, see Stream logs to an S3-compatible destination.
   - For Splunk, see Stream logs to Splunk.
   - For Sumo Logic, see Stream logs to Sumo Logic.
   - For a custom HTTPS endpoint, see Stream logs to a custom HTTPS endpoint.
   - For TrafficPeak integration using custom HTTPS endpoints, see Stream logs to TrafficPeak.
3. In Delivery options, configure the delivery settings:
   - In Filename, enter a custom prefix and suffix for the name of the log file uploaded to the destination. For object-based destinations such as Amazon S3, Azure Storage, Oracle Cloud Storage, Google Cloud Storage, and S3-compatible endpoints, you can use Dynamic variables (see the sketch after these steps).
   - In Push frequency, specify how often DataStream should bundle and push logs to your destination, either every 30 or 60 seconds.
4. Click Next to continue to the Summary tab.
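As referenced in step 3, here's a minimal sketch of how dynamic variables in a filename prefix might expand into object names at upload time. The token names and syntax below are hypothetical placeholders; see the Dynamic variables documentation for the tokens your destination actually supports:

```python
# Sketch: expanding hypothetical dynamic-variable tokens in a filename
# prefix. The {streamId}/{date}/{hour} syntax is illustrative only.
from datetime import datetime, timezone

def expand(template: str, stream_id: str) -> str:
    now = datetime.now(timezone.utc)
    return (template
            .replace("{streamId}", stream_id)             # hypothetical token
            .replace("{date}", now.strftime("%Y-%m-%d"))  # hypothetical token
            .replace("{hour}", now.strftime("%H")))       # hypothetical token

# A prefix like "logs/{streamId}/{date}/{hour}-" could yield object keys
# such as logs/12345/2024-06-01/13-<upload-name>.gz, letting you sort
# uploads by stream and time in your bucket.
print(expand("logs/{streamId}/{date}/{hour}-", "12345"))
```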
Next steps
You're almost done. You can now review the information you entered and activate your stream. See Review and activate a stream.