In DataStream 2, you can stream logs to third-party destinations for storage, analytics, and enhanced control over your data.
Currently available destinations include Amazon S3, Azure Storage, Datadog, Elasticsearch, Google Cloud Storage, Loggly, New Relic, Oracle Cloud, Splunk, Sumo Logic, S3-compatible destinations, and custom HTTPS endpoints.
When choosing a destination to stream logs to, you can configure features that differ between the available options. These include custom headers (for authenticating and labeling incoming logs), an IP Access List (filtering requests by IP or using Akamaized hostnames as endpoints), dynamic variables (to sort and label logs), and mTLS authentication (for improved security).
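For example, a receiving custom HTTPS endpoint can use the stream's custom header to authenticate incoming uploads. The snippet below is a minimal sketch in Python; the header name and shared secret are hypothetical placeholders, and the real values are whatever you configure on your stream.

```python
# Minimal sketch of a receiver for a custom HTTPS endpoint that checks the
# custom header configured on the stream. "X-Stream-Token" and the secret are
# placeholders, not DataStream-defined names.
from http.server import BaseHTTPRequestHandler, HTTPServer

EXPECTED_TOKEN = "replace-with-your-shared-secret"  # hypothetical shared secret

class LogReceiver(BaseHTTPRequestHandler):
    def do_POST(self):
        # Reject uploads that don't carry the custom header configured on the stream.
        if self.headers.get("X-Stream-Token") != EXPECTED_TOKEN:
            self.send_response(401)
            self.end_headers()
            return
        length = int(self.headers.get("Content-Length", 0))
        batch = self.rfile.read(length)
        # Persist or forward the log batch here; this sketch only acknowledges it.
        print(f"received {len(batch)} bytes of log data")
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    # The destination must be reachable over HTTPS, so in practice this would
    # sit behind a TLS-terminating proxy rather than serve plain HTTP.
    HTTPServer(("0.0.0.0", 8080), LogReceiver).serve_forever()
```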
While all destinations support the JSON log file format, Elasticsearch and New Relic do not support structured log files. For sample JSON and structured logs with data, see Log format.
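As a rough illustration of the difference between the two formats, consider the sketch below. The field names, values, and delimiter are placeholders, not the actual DataStream 2 data set fields; see Log format for real samples.

```python
# Illustrative only: JSON logs are self-describing per line, while structured
# logs are positional, so the consumer must know the column order and delimiter.
import json

json_line = '{"reqTimeSec": "1617107810", "statusCode": "200", "reqHost": "www.example.com"}'
structured_line = "1617107810 200 www.example.com"

# JSON: fields are looked up by name.
record = json.loads(json_line)
print(record["statusCode"])                     # -> 200

# Structured: fields are split by position.
req_time, status, host = structured_line.split(" ")
print(status)                                   # -> 200
```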
Before choosing a destination for your logs, consult the table below for the features available for each:
| Destination | JSON logs | Structured logs | Custom header | IP Access List | Custom log file prefix & suffix | mTLS authentication | Dynamic variables |
| --- | --- | --- | --- | --- | --- | --- | --- |
| custom HTTPS endpoint | ✓ | ✓ | ✓ | ✓ | ✗ | ✓ | ✗ |
| Google Cloud Storage | ✓ | ✓ | ✓ | ✗ | ✗ | ✗ | ✓ |
Third-party destination issues
If there are issues with a third-party destination, such as latency or connection problems, the log data is lost after three failed attempts to connect within 90 seconds.
You can Set up alerts to get email notifications about upload failures before the data is lost.
You should take these limitations into account before using the data delivered by your stream for audit, compliance, or billing purposes.
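As a back-of-the-envelope sketch of the exposure, assuming each pushed bundle covers roughly one push interval of logs (an assumption for illustration, not a documented guarantee):

```python
# Rough estimate of what a single upload failure can cost, based on the retry
# behavior described above. Assumes one bundle ~ one push interval of logs.
PUSH_FREQUENCY_SECONDS = 60   # configurable on the stream: 30 or 60
RETRY_WINDOW_SECONDS = 90     # three failed retries within 90 seconds drop the bundle

seconds_of_logs_at_risk = PUSH_FREQUENCY_SECONDS
print(f"A destination outage longer than {RETRY_WINDOW_SECONDS}s can drop up to "
      f"{seconds_of_logs_at_risk}s of log data per failed upload.")
```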
Set your stream to push log data to a destination of your choice. You can also specify the names for the pushed files and how often to deliver them:
1. In the Delivery tab, select the Destination where you want to send and store your data.
2. Enter the details for your destination, such as the Display name and endpoint URL. The fields you need to fill in differ between destinations:
   - For Amazon S3, see Stream logs to Amazon S3.
   - For Azure Storage, see Stream logs to Azure Storage.
   - For Datadog, see Stream logs to Datadog.
   - For Elasticsearch, see Stream logs to Elasticsearch.
   - For Google Cloud Storage, see Stream logs to Google Cloud Storage.
   - For Loggly, see Stream logs to Loggly.
   - For New Relic, see Stream logs to New Relic.
   - For Oracle Cloud, see Stream logs to Oracle Cloud.
   - For S3-compatible destinations, see Stream logs to an S3-compatible destination.
   - For Splunk, see Stream logs to Splunk.
   - For Sumo Logic, see Stream logs to Sumo Logic.
   - For a custom HTTPS endpoint, see Stream logs to a custom HTTPS endpoint.
3. In Delivery options, configure the delivery settings:
   - In Filename, enter a custom prefix and suffix for the name of the log file uploaded to the destination. For object-based destinations such as Amazon S3, Azure Storage, Oracle Cloud Storage, Google Cloud Storage, and S3-compatible endpoints, you can also use Dynamic variables (see the naming sketch at the end of this section).
   - In Push frequency, specify how often DataStream should bundle and push logs to your destination, either every 30 or 60 seconds.
4. Click Next to continue to the Summary tab.
You're almost done. You can now review the information you entered and activate your stream. See Review and activate a stream.
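To make the filename prefix, suffix, and dynamic-variable behavior from the delivery options concrete, here is a minimal sketch of how a time-based object name could be composed for an object-based destination. The layout and expansion shown are placeholders, not the actual DataStream 2 naming syntax; check the destination-specific pages above for the supported tokens.

```python
# Illustrative only: composes an upload name from a custom prefix/suffix and
# time-based parts, similar in spirit to dynamic variables on object-based
# destinations. The tokens and layout here are placeholders, not DataStream syntax.
from datetime import datetime, timezone

def object_name(prefix: str, suffix: str, part: int) -> str:
    now = datetime.now(timezone.utc)
    folder = now.strftime("%Y/%m/%d")   # e.g. sort uploads into per-day "folders"
    stamp = now.strftime("%H%M%S")
    return f"{folder}/{prefix}-{stamp}-part{part}{suffix}"

print(object_name("edge-logs", "-prod", 1))
# -> e.g. 2024/05/14/edge-logs-131500-part1-prod
```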