Supported features

DataStream comes with features that make low-latency log streaming safe and convenient. Take a look at this list of selected features:

Alerts and notifications

Integrate DataStream with the Alerts application to get e-mail notifications when log files can't be uploaded to your destinations, for example because of invalid destination settings or timeouts, so you can resolve issues and ensure uninterrupted data flow.

When creating or editing a stream, you can provide e-mail addresses to get notifications about actions involving the stream, such as stream activation or deactivation, editing an active or inactive stream, or saving a stream version.

ChinaCDN visibility

DataStream 2 now supports log delivery for customer traffic served in China. Customers that use ChinaCDN for content delivery have the same visibility for transactions in China that DataStream 2 already provides for the rest of the world.

Custom headers

Configure DataStream to pass custom headers to the destination where you want to stream logs, for example when the destination accepts only particular headers, you want to use additional authentication, or you need to label the log files.

You can use custom headers for Azure Storage, Elasticsearch, Google Cloud Storage, Loggly, New Relic, Splunk, Sumo Logic, and custom HTTPS endpoints.
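Before configuring the stream, it can help to check that your endpoint accepts the headers you plan to pass. The following sketch sends a test request with Python and the requests library; the endpoint URL, header names, and token are placeholders for illustration, not values DataStream requires.

```python
# Minimal sketch: verify that a log destination accepts the custom headers you
# plan to configure in DataStream. The endpoint and header values below are
# hypothetical examples.
import requests

endpoint = "https://logs.example.com/datastream"    # hypothetical custom HTTPS endpoint
headers = {
    "Authorization": "Bearer <token>",              # extra authentication, if your endpoint needs it
    "X-Log-Source": "datastream-example-property",  # a label to tag uploaded log files
    "Content-Type": "application/json",
}

# Send a small test payload the same way a log batch would be POSTed.
response = requests.post(endpoint, headers=headers, json={"ping": "test"}, timeout=10)
print(response.status_code)
```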

Customizable data sets

In DataStream, you control what types of data about your property traffic you collect. You can log fields covering the request-response cycle, including breadcrumbs, cache, geolocation, request header, and message exchange data, and gather information on network performance and web security.

Other fields include midgress data on traffic within the Akamai platform, and custom log fields that let you specify additional data sets you may want to collect. See Data set parameters for the complete list of fields you can log in DataStream, and Log custom parameters for the custom log field feature.

Dynamic variables

Dynamic variables in log filenames and upload paths give you more control over DataStream log files. Timestamp your logs or put stream IDs in destination folder paths and filenames to conveniently sort the logs, and mix dynamic and static values (such as filename prefixes and suffixes) to label the files that DataStream uploads to the location of your choice.

You can use dynamic variables for Amazon S3, Azure Storage, Google Cloud Storage, and Oracle Cloud. See Dynamic variables for details.
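As an illustration of how dynamic variables pay off downstream, the following sketch lists a single day's log files for one stream in an Amazon S3 bucket. The bucket name, the stream ID, and the upload path pattern (stream ID followed by date) are assumptions for the example, not a required layout.

```python
# Minimal sketch: if an upload path mixes static text with dynamic variables,
# for example a hypothetical "logs/<stream-id>/<date>/" layout, downstream jobs
# can fetch one day's files by prefix.
import boto3

s3 = boto3.client("s3")
stream_id = "12345"                              # hypothetical stream ID used in the upload path
prefix = f"logs/{stream_id}/2024-01-15/"         # hypothetical date-based folder

response = s3.list_objects_v2(Bucket="my-datastream-logs", Prefix=prefix)
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])
```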

IP access lists

DataStream supports filtering incoming traffic to your Akamai property by IP access lists (IP ACLs).

Additionally, you can use Akamaized hostnames as endpoints to send DataStream 2 logs for improved security, because properties with these hostnames act as a proxy between the destination and DataStream.

You can use Akamaized hostnames as endpoints for Datadog, Elasticsearch, Loggly, New Relic, Splunk, Sumo Logic, and custom HTTPS endpoints.

JSON and structured logs

Depending on the destination, DataStream 2 can send logs in either the Structured (space-delimited) or the JSON log file format. See Log format for sample log lines and details about available log formats.
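As a rough illustration of consuming either format downstream, the sketch below parses a JSON log line and a space-delimited one in Python. The field names and column order are illustrative only; refer to Log format for the actual log lines.

```python
# Minimal sketch: reading the two log formats. Field names and column order
# are hypothetical examples.
import json

# JSON format: one JSON object per log line.
json_line = '{"reqTimeSec": "1573840000", "statusCode": "200", "reqPath": "/index.html"}'
record = json.loads(json_line)
print(record["statusCode"])

# Structured format: space-delimited values in a fixed column order.
structured_line = "1573840000 200 /index.html"
timestamp, status, path = structured_line.split(" ")
print(status)
```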

mTLS authentication

For Google Cloud Storage, Splunk, and custom HTTPS destinations, you can upload a client certificate and enable mTLS authentication to improve stream security and prevent data delivery failures.
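To give a sense of what enabling mTLS implies on the receiving side, the following Python sketch builds a TLS server context that only accepts connections presenting a trusted client certificate. The certificate file names are placeholders; in DataStream, you upload the client certificate as part of the destination configuration.

```python
# Minimal sketch: a destination that enforces mTLS rejects any connection that
# does not present a client certificate signed by a CA it trusts. File names
# are placeholders.
import ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="server.crt", keyfile="server.key")  # the destination's own certificate
context.load_verify_locations(cafile="client-ca.crt")                 # CA that signed the client certificate
context.verify_mode = ssl.CERT_REQUIRED                               # require a valid client certificate
```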

Multiple streams per property

Configure multiple streams for the same Akamai property and use all of them to gather and send data at the same time. For each stream on your property, you can choose a different destination to send logs to and a different data set to log, or use additional streams as a backup to prevent data loss in case of upload failure. See Enable the DataStream behavior for steps to configure multiple streams for one property.

Product integration

Leverage log collection with DataStream for a wide range of products:

Adaptive Media Delivery, API Acceleration, ChinaCDN, Cloud Wrapper (MultiCDN), Download Delivery, Dynamic Site Accelerator, Dynamic Site Delivery, Ion (Standard, Premier and Media Advanced, including Terra Alta Enterprise), Rich Media Accelerator, Object Delivery, and Web Application Accelerator.

DataStream also collects specific data for other products, such as EdgeWorkers.

Stream management

The DataStream application lets you easily manage data collection on the Akamai platform. You can use it to conveniently create, edit, clone, activate, and deactivate streams that monitor your property. The intuitive DataStream dashboard displays the status and condition of your streams, and lets you track changes and troubleshoot issues.

Third-party destinations

Stream logs to a third-party destination for storage, analytics, and enhanced control over your data. You can configure DataStream to push logs as often as every 30 seconds to Amazon S3, Azure Storage, Datadog, Elasticsearch, Google Cloud Storage, Loggly, New Relic, Oracle Cloud, Splunk, Sumo Logic, or custom HTTPS endpoints.

Destinations support additional features, including IP access lists, custom headers, dynamic upload paths and filenames, and mTLS authentication. Check Stream logs to a destination for details.
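For custom HTTPS endpoints, the receiver's job is to accept the pushed log batches. The sketch below is a minimal, assumption-laden illustration in Python: it uses plain HTTP on a hypothetical port and assumes newline-delimited JSON records that may arrive gzip-compressed. A production receiver would terminate TLS and verify the custom headers or client certificate configured for the stream.

```python
# Minimal sketch of a receiver for pushed log batches. Port, payload shape,
# and compression handling are assumptions for illustration only.
import gzip
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class LogReceiver(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        # Batches may arrive compressed; fall back to the raw body otherwise.
        if self.headers.get("Content-Encoding") == "gzip":
            body = gzip.decompress(body)
        records = [json.loads(line) for line in body.decode().splitlines() if line.strip()]
        print(f"received {len(records)} log records")
        self.send_response(200)
        self.end_headers()

HTTPServer(("0.0.0.0", 8443), LogReceiver).serve_forever()
```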