DataStream 2 lets you stream logs to third-party destinations for storage and analytics. Currently available destinations include Amazon S3, Azure Storage, Datadog, Elasticsearch, Google Cloud Storage, Loggly, New Relic, Oracle Cloud, Splunk, Sumo Logic, and custom HTTPS endpoints.
Destinations supported by DataStream 2 offer different features, including custom headers (for authenticating and labeling incoming logs), an IP Access List (for filtering requests by IP or using Akamaized hostnames as endpoints), and mTLS authentication (for improved security). Click a destination name to open its user guide with details about the features it supports.
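For example, a custom HTTPS endpoint can use a custom header to authenticate incoming log uploads. Below is a minimal sketch of such a receiver; the header name `X-Datastream-Auth` and the token value are assumptions for illustration, not part of the DataStream 2 specification — match whatever custom header you configure on the stream.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical header name and token -- configure the stream's custom
# header to match these values.
EXPECTED_HEADER = "X-Datastream-Auth"
EXPECTED_TOKEN = "example-secret"

def is_authorized(headers) -> bool:
    """Return True when the upload carries the expected custom header."""
    return headers.get(EXPECTED_HEADER) == EXPECTED_TOKEN

class LogReceiver(BaseHTTPRequestHandler):
    def do_POST(self):
        if not is_authorized(self.headers):
            self.send_response(401)  # reject uploads without the header
            self.end_headers()
            return
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        for line in body.decode("utf-8").splitlines():
            print(line)  # one JSON log record per line
        self.send_response(200)
        self.end_headers()

# To run the receiver (blocks):
#     HTTPServer(("", 8443), LogReceiver).serve_forever()
```

In production the endpoint would also terminate TLS, since DataStream 2 uploads logs over HTTPS.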
All destinations support the JSON log file format; Elasticsearch and New Relic do not support structured log files.
Before choosing a destination for your logs, consult the table below for the features each destination supports:
Destination | JSON logs | Structured logs | Custom header | IP Access List | mTLS authentication | Custom log file prefix & suffix | Dynamic variables |
---|---|---|---|---|---|---|---|
Amazon S3 | ✓ | ✓ | ✗ | ✗ | ✗ | ✓ | ✓ |
Azure Storage | ✓ | ✓ | ✓ | ✗ | ✗ | ✓ | ✓ |
custom HTTPS endpoint | ✓ | ✓ | ✓ | ✓ | ✓ | ✗ | ✗ |
Datadog | ✓ | ✓ | ✗ | ✓ | ✗ | ✗ | ✗ |
Elasticsearch | ✓ | ✗ | ✓ | ✓ | ✗ | ✗ | ✗ |
Google Cloud Storage | ✓ | ✓ | ✓ | ✗ | ✗ | ✗ | ✓ |
Loggly | ✓ | ✓ | ✓ | ✓ | ✗ | ✗ | ✗ |
New Relic | ✓ | ✗ | ✓ | ✓ | ✗ | ✗ | ✗ |
Oracle Cloud | ✓ | ✓ | ✗ | ✗ | ✗ | ✓ | ✓ |
Splunk | ✓ | ✓ | ✓ | ✓ | ✓ | ✗ | ✗ |
Sumo Logic | ✓ | ✓ | ✓ | ✓ | ✗ | ✗ | ✗ |
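For destinations that support mTLS authentication (custom HTTPS endpoints and Splunk, per the table above), the receiving server must require and verify a client certificate. Below is a minimal sketch of building such a server-side TLS context with Python's standard `ssl` module; the file paths are assumptions — supply your own server certificate, key, and the CA that signed the client certificate.

```python
import ssl
from typing import Optional

def make_mtls_context(cert_file: Optional[str] = None,
                      key_file: Optional[str] = None,
                      client_ca_file: Optional[str] = None) -> ssl.SSLContext:
    """Build a server-side TLS context that requires client certificates (mTLS).

    The paths are hypothetical placeholders: pass your server cert/key pair
    and the CA bundle used to verify the uploader's client certificate.
    """
    context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    if cert_file and key_file:
        context.load_cert_chain(cert_file, key_file)
    # Refuse connections that do not present a valid client certificate.
    context.verify_mode = ssl.CERT_REQUIRED
    if client_ca_file:
        context.load_verify_locations(client_ca_file)
    return context

# Usage sketch (paths are placeholders):
#     ctx = make_mtls_context("server.crt", "server.key", "client-ca.pem")
#     tls_socket = ctx.wrap_socket(plain_socket, server_side=True)
```

Requiring a client certificate means that even a client that knows the endpoint URL cannot deliver logs without the matching key, which is what the mTLS column in the table above refers to.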