Supported features

DataStream 2 comes with features you can use to make streaming low-latency log data efficient, safe, and convenient.

Take a look at the list below:

Alerts and notifications

Integrate DataStream with the Alerts application to get e-mail notifications when log files can't be uploaded to your destination, so you can resolve issues and ensure uninterrupted data flow. Upload failures may happen for several reasons, such as invalid destination settings, connection issues, or timeouts.

When creating or editing a stream, you can provide e-mail addresses to get notifications about actions involving the stream, such as stream activation or deactivation, editing an active or inactive stream, or saving a stream version.

For details, see Set up alerts.

ChinaCDN visibility

If you use ChinaCDN for content delivery, you can now benefit from the same log delivery and visibility for customer traffic served in China that DataStream 2 already provides for the rest of the world.

Custom headers

You can configure DataStream to send custom request headers to the destination where you want to stream logs. Use them if the destination accepts only particular types of headers, you'd like to label the log files at the destination, or you want to configure additional authentication for streaming logs to your destination.

You can use custom headers for Azure Storage, Elasticsearch, Google Cloud Storage, Loggly, New Relic, Splunk, Sumo Logic, TrafficPeak, and custom HTTPS endpoints.
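As a rough sketch of the idea, the snippet below attaches custom headers to a log upload request. The endpoint URL and the header names (`X-Log-Label`, the bearer token) are illustrative assumptions, not DataStream defaults:

```python
from urllib.request import Request

# Hypothetical example: the endpoint and header names below are
# illustrative, not actual DataStream settings.
def build_upload_request(endpoint: str, payload: bytes, custom_headers: dict) -> Request:
    """Build an HTTPS POST request carrying custom headers for the log destination."""
    headers = {
        "Content-Type": "application/json",
        **custom_headers,  # e.g. an auth token, or a label identifying the stream
    }
    return Request(endpoint, data=payload, headers=headers, method="POST")

req = build_upload_request(
    "https://logs.example.com/ingest",
    b'{"streamId": 12345}',
    {"X-Log-Label": "edge-logs", "Authorization": "Bearer example-token"},
)
```

A destination could then use the label header to route or tag incoming log files, or validate the authentication header before accepting the upload.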

Customizable data sets

In DataStream, you control which types of data on property traffic you gather.

You can log fields with details on the request-response cycle, including breadcrumbs, caching, geolocation, request header, and message exchange data. Select data sets from dedicated groups to collect information on network performance, media streaming, and web security.

Other fields include midgress data on traffic within the Akamai platform, custom log fields that you can specify to get additional data sets, and Reporting metrics and dimensions.

For the complete list of fields you can log in DataStream, see Data set parameters. For details about the custom log field feature, see Log custom parameters.

Dynamic variables

Enhance your control over DataStream log data with dynamic variables in destination folder paths and log filenames. You can timestamp your logs or put stream IDs in paths and log names to conveniently sort the log files, and mix dynamic and static values (such as filename prefixes and suffixes) to label the files uploaded by DataStream to the location of your choice.

You can leverage dynamic values for Amazon S3, Azure Storage, Elasticsearch, Google Cloud Storage, Oracle Cloud, and S3-compatible destinations.

For details, see Dynamic variables.
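To illustrate the concept, here is a minimal sketch of expanding a folder-path template that mixes static text with a stream ID and timestamp placeholders. The `{streamId}` and strftime-style syntax is hypothetical, not DataStream's actual variable notation:

```python
from datetime import datetime, timezone

# Illustrative sketch only: the {streamId} and strftime placeholders are
# hypothetical, not DataStream's actual dynamic-variable syntax.
def expand_path(template: str, stream_id: int, when: datetime) -> str:
    """Expand a path template mixing static prefixes with dynamic values."""
    return when.strftime(template.replace("{streamId}", str(stream_id)))

path = expand_path(
    "logs/{streamId}/%Y/%m/%d/edge-%H%M%S.gz",
    7050,
    datetime(2024, 5, 1, 12, 30, 0, tzinfo=timezone.utc),
)
# Expands to a per-stream, per-day folder with a timestamped filename.
```

Timestamped paths like this make it easy to sort uploaded files and prune old logs by folder.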

IP access lists

DataStream supports filtering incoming traffic to your Akamai property by IP access lists (IP ACLs).

Additionally, you can use Akamaized hostnames as endpoints to send DataStream 2 logs for improved security, as properties with these hostnames can act as a proxy between the destination and DataStream.

You can use Akamaized hostnames as endpoints for Datadog, Elasticsearch, Loggly, New Relic, Splunk, Sumo Logic, and custom HTTPS endpoints.
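Conceptually, an IP ACL is a list of allowed address ranges that incoming traffic is checked against. Akamai applies these lists on the platform itself; the sketch below only illustrates the matching logic, with example CIDR ranges:

```python
import ipaddress

# Conceptual sketch of IP ACL matching. The CIDR ranges are examples;
# actual filtering happens on the Akamai platform, not in your code.
def is_allowed(client_ip: str, acl: list[str]) -> bool:
    """Return True if the client IP falls inside any range on the access list."""
    ip = ipaddress.ip_address(client_ip)
    return any(ip in ipaddress.ip_network(cidr) for cidr in acl)

acl = ["192.0.2.0/24", "2001:db8::/32"]
```

Requests from addresses outside every listed range are rejected before they reach the property.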

JSON and structured logs

Depending on the destination you choose to send logs to, DataStream 2 can upload raw log data either in structured (space- or tab-delimited) or JSON log file format that you can conveniently ingest for further analysis. Using the DataStream 2 application, you can change the order of data set fields in a log line.

For sample log lines and details about available log formats, see Log format.
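To show the difference between the two formats, the sketch below parses a space-delimited line into the same dictionary that a JSON log line deserializes to. The field names and values are made up for illustration; the real data set fields are listed in Log format:

```python
import json

# Hypothetical field layout; see "Log format" for the actual data set fields.
FIELDS = ["reqTimeSec", "statusCode", "bytes"]

def parse_structured(line: str) -> dict:
    """Parse a space-delimited log line into a dict keyed by field name."""
    return dict(zip(FIELDS, line.split()))

# The same record in both formats:
structured = parse_structured("1573840000 200 1024")
as_json = json.loads('{"reqTimeSec": "1573840000", "statusCode": "200", "bytes": "1024"}')
```

With structured logs, the field order in the line determines which value maps to which field, which is why the application lets you control that order.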

Log data localization

Log data localization allows collecting, processing, archiving, and delivering edge traffic logs on Akamai's infrastructure distributed in the European Union (EU) to meet your personal data processing needs. It's supported by default for delivery services such as DataStream, where edge traffic logs collected in the EU are handled in the EU without enabling or configuring additional products on your account.

For details, see Log Data Localization (EU).

mTLS authentication

When configuring a destination to send DataStream logs to, you can upload a client certificate and enable mTLS authentication to improve stream security and prevent data delivery failures. This feature is available for the Splunk and custom HTTPS destinations.

Multiple streams per property

Configure multiple streams for the same Akamai property, and use all of them to gather and send data at the same time.

You can choose different destinations to send logs, and different data sets to log for each stream on your property, or use other streams as backup in case of upload failure to prevent data loss.

For steps to configure multiple streams for one property, see Enable the DataStream behavior.

Product integration

You can leverage log collection with DataStream for a wide range of products:

Adaptive Media Delivery, API Acceleration, ChinaCDN, Download Delivery, Dynamic Site Accelerator, Dynamic Site Delivery, HTTPS Downloads, Ion (Standard, Premier and Media Advanced, including Terra Alta Enterprise), Progressive Media Downloads, Rich Media Accelerator, Object Delivery, and Web Application Accelerator.

DataStream also collects specific data for other products, such as EdgeWorkers.

Reporting metrics

In DataStream 2, you can log the metrics and dimensions you may know from Reporting. Using dedicated, already supported data set fields, you can get reports for your property that include traffic, media delivery, and proxy protection data.

For details, see Reporting metrics and dimensions.

Stream management

You can easily monitor and manage your streams on the Akamai platform. Create, edit, clone, activate, and deactivate streams that monitor your property using the interactive DataStream dashboard. The dashboard displays the status and condition of your streams, lets you track changes, and helps you troubleshoot issues.

For details, see View and manage versions.

Third-party destinations

Stream logs to a third-party destination for storage, analytics, and enhanced control over your data. You can configure DataStream to push logs as often as every 30 seconds to Amazon S3, Azure Storage, Datadog, Elasticsearch, Google Cloud Storage, Loggly, New Relic, Oracle Cloud, Splunk, Sumo Logic, TrafficPeak, S3-compatible destinations, or custom HTTPS endpoints.

Destinations support additional features, including IP access lists, custom headers, dynamic upload paths and filenames, and mTLS authentication. For details, see the table in Stream logs to a destination.
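To give a feel for what a custom endpoint involves, here is a minimal sketch of a receiver that accepts POSTed log batches. It uses plain HTTP on localhost for brevity; a real DataStream destination must be a publicly reachable HTTPS endpoint, and the payload shape here is a made-up example:

```python
import http.server
import json
import threading
import urllib.request

received = []

class LogHandler(http.server.BaseHTTPRequestHandler):
    """Minimal sketch of a custom log endpoint (plain HTTP for brevity;
    a real DataStream destination must use HTTPS)."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        received.append(json.loads(self.rfile.read(length)))
        self.send_response(200)
        self.end_headers()
    def log_message(self, *args):
        pass  # silence per-request logging

server = http.server.HTTPServer(("127.0.0.1", 0), LogHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Simulate one log upload batch (hypothetical payload shape).
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/logs",
    data=json.dumps({"statusCode": 200}).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
urllib.request.urlopen(req)
server.shutdown()
```

In practice, such an endpoint would also validate any custom headers or mTLS client certificate configured for the stream before accepting the batch.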