Stream logs to a destination
DataStream 2 lets you stream logs to third-party destinations for storage and analysis. Currently available destinations include Amazon S3, Azure Storage, Datadog, Elasticsearch, Google Cloud Storage, Loggly, New Relic, Oracle Cloud, Splunk, Sumo Logic, and custom HTTPS endpoints.
Each destination supported by DataStream 2 offers a different feature set. These features include custom headers (for authenticating and labeling incoming logs), IP Access List (for filtering requests by IP or using Akamaized hostnames as endpoints), and mTLS authentication (for improved security).
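For example, a custom HTTPS endpoint can use a custom header as a lightweight authentication check on incoming log uploads. The sketch below is illustrative only; the header name and token are hypothetical values standing in for whatever you configure on the stream.

```python
# Minimal sketch of header-based authorization on a custom HTTPS endpoint.
# The header name and token below are hypothetical -- use the custom
# header you configure for the stream.

EXPECTED_HEADER = "X-Stream-Token"   # assumed custom header name
EXPECTED_VALUE = "s3cr3t-token"      # assumed shared secret

def is_authorized(headers: dict) -> bool:
    """Accept the upload only if the configured custom header matches."""
    return headers.get(EXPECTED_HEADER) == EXPECTED_VALUE

print(is_authorized({"X-Stream-Token": "s3cr3t-token"}))  # True
print(is_authorized({"X-Stream-Token": "wrong"}))         # False
```

The same check would typically sit at the top of your endpoint's request handler, rejecting uploads that lack the header with a 401 response.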
All destinations support the JSON log file format, but Elasticsearch and New Relic don't support structured log files. If you're looking for sample logs (JSON and structured) with data, see Log format.
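Because JSON logs are delivered as one JSON object per line, they are straightforward to parse programmatically. The snippet below is a minimal sketch; the field names are illustrative assumptions, so see Log format for the actual schema.

```python
import json

# One JSON log line, as a destination might receive it. The field names
# here are assumptions for illustration -- see Log format for the real schema.
line = '{"reqTimeSec": "1573840000", "statusCode": "200", "reqHost": "example.com"}'

record = json.loads(line)
print(record["statusCode"])  # "200"
```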
Before choosing a destination for your logs, consult the table below for the features each destination supports:
| Destination | JSON logs | Structured logs | Custom header | IP Access List | Custom log file prefix & suffix | mTLS authentication | Dynamic variables |
| --- | --- | --- | --- | --- | --- | --- | --- |
| custom HTTPS endpoint | ✓ | ✓ | ✓ | ✓ | ✗ | ✓ | ✗ |
| Google Cloud Storage | ✓ | ✓ | ✓ | ✗ | ✗ | ✗ | ✓ |
Set your stream to push log data to a destination of your choice. You can also specify the names for the pushed files and how often to deliver them.
Third-party destination issues
If a third-party destination experiences issues such as latency or connection problems, data is lost after three failed delivery retries within 90 seconds. You can set up alerts to receive email notifications about upload failures before the data is lost.
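To make the limit concrete, the toy model below simulates the documented behavior: a failed upload is retried up to three times, and if none of the attempts succeeds, the batch is dropped. This is an illustration of the stated limit, not Akamai's implementation; whether the first attempt counts as one of the three is an assumption here.

```python
# Toy model of the documented behavior: up to three retries after the
# initial failed attempt, then the batch is dropped. Not Akamai's code.

MAX_RETRIES = 3

def deliver(batch, send):
    """Try the initial upload plus up to MAX_RETRIES retries.
    Returns True if delivered, False if the data is lost."""
    for attempt in range(1 + MAX_RETRIES):
        if send(batch):
            return True
    return False

attempts = []
def always_fail(batch):
    attempts.append(batch)
    return False

lost = not deliver(["log line"], always_fail)
print(lost, len(attempts))  # True 4
```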
Take these limitations into account before using data from your stream for audit, compliance, or billing purposes.
In the Delivery tab, select the Destination type where you want to send and store your data.
Provide details for your destination. The fields to fill in differ depending on the destination type you select.
- For Amazon S3, see Stream logs to Amazon S3.
- For Azure Storage, see Stream logs to Azure Storage.
- For Datadog, see Stream logs to Datadog.
- For Elasticsearch, see Stream logs to Elasticsearch.
- For Google Cloud Storage, see Stream logs to Google Cloud Storage.
- For Loggly, see Stream logs to Loggly.
- For New Relic, see Stream logs to New Relic.
- For Oracle Cloud, see Stream logs to Oracle Cloud.
- For Splunk, see Stream logs to Splunk.
- For Sumo Logic, see Stream logs to Sumo Logic.
- For a custom HTTPS endpoint, see Stream logs to a custom HTTPS endpoint.
In Delivery options, configure the delivery settings:
In Filename, enter a custom prefix and suffix for the name of the log file uploaded to the destination. The system generates the rest of the name for you. This feature is available only for object-based destinations, such as Amazon S3, Azure Storage, Oracle Cloud Storage, and Google Cloud Storage.
In Push frequency, specify a time window for collecting log lines, either 30 or 60 seconds. After this time, the system bundles logs from each uploader and delivers them in a file to your destination.
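The two delivery options above can be sketched as a simple batch collector: log lines accumulate for the push window, then get bundled into one file whose name carries your prefix and suffix. The system-generated middle section of the filename is invented here for illustration; the real format is defined by DataStream 2.

```python
# Sketch of the delivery options: collect lines for a push window
# (30 or 60 seconds), then flush them as one file named with your
# custom prefix and suffix. The middle of the name is a placeholder;
# DataStream 2 generates its own.

PUSH_WINDOW_SECONDS = 60
PREFIX = "my-logs"   # assumed custom prefix
SUFFIX = ".gz"       # assumed custom suffix

def make_filename(prefix: str, suffix: str, timestamp: int) -> str:
    # Placeholder middle section; the real one is system-generated.
    return f"{prefix}-{timestamp}{suffix}"

buffer = []

def collect(line: str):
    """Accumulate a log line during the current push window."""
    buffer.append(line)

def flush(now: int):
    """At the end of the window, bundle buffered lines into one named file."""
    name = make_filename(PREFIX, SUFFIX, now)
    lines = list(buffer)
    buffer.clear()
    return name, lines

collect("line 1")
collect("line 2")
name, lines = flush(1700000000)
print(name)  # my-logs-1700000000.gz
```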
Click Next to continue to the Summary tab.
You're almost done. You can now review the information you provided and activate your stream. See Review and activate a stream.