Before configuring DataStream 2, get to know some basic concepts from the DataStream 2 API that we refer to throughout this guide:

  • Groups. Each account features a hierarchy of groups, which control access to properties. Using either Control Center or the Identity and Access Management API, account administrators can assign properties to specific groups, each with its own set of users and accompanying roles. Your access to any given property depends on the role set for you in its group. Typically, you need a group identifier to create a stream configuration.

  • Contracts. Each account features one or more contracts, each of which has a fixed term of service during which specified Akamai products and modules are active. Typically, you need a contract identifier to create a stream configuration. Get and store the contract ID from your account, for example, using the Contracts API.

  • Products. Each contract enables one or more products, each of which allows you to deploy web properties on the Akamai edge network and receive support from Akamai Professional Services. Products allow you to create new properties, CP codes, and edge hostnames. They also determine the baseline set of a property's rule behaviors. Typically, you need to know a product name to create a stream configuration. Get and store the name from your account, for example, using the Contracts API. For the list of supported products, see Product integration.

  • Properties. A property, also referred to as a delivery configuration, lets you control how edge servers respond to various kinds of requests to your assets. You can manage properties from Property Manager or the Property Manager API. Properties apply rules to a set of hostnames, and you can apply only one property at a time to any given hostname. Each property is assigned to a product, which determines which rule behaviors you can use. Streams let you monitor the traffic served by a property, and you can associate each property with only one stream.

  • Destination. A destination, also known as a connector in DataStream 1, represents a third-party configuration where you can send your stream's log files. DataStream 2 supports Amazon S3, Azure Storage, Datadog, Elasticsearch, Google Cloud Storage, Loggly, New Relic, Oracle Cloud, Splunk, Sumo Logic, S3-based destinations, and custom HTTPS endpoints as destinations. See Destinations for more details and the features available for each destination.

  • Data set. A metric, parameter, or value collected in the stream, represented as a key-value pair in JSON logs, for example, "statusCode": "206", and as a space-delimited value in structured logs. You can choose from fields related to the request-response cycle, including breadcrumbs, cache, geolocation, request header and message exchange data, network performance, and web security. See Data set parameters for the complete list of fields available in DataStream.

  • Streams. A stream collects, bundles, and sends raw log records to a chosen destination at selected time windows. It lets you control the data set fields you monitor in your logs, the order in which those fields appear in log lines, and the delivery time frames. You can update a stream through versioning any time you want to change the properties it monitors or the data set fields it collects. A stream can monitor up to three properties that aren't part of any other stream. Note that RAW_LOGS is the only stream type currently available.

  • Activation history. You can activate and deactivate the latest version of a stream at any time. Activation history lets you track changes in activation status across all versions of a stream.
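The Contracts and Products bullets above both involve a "get and store" step via the Contracts API. As a minimal sketch, the helpers below show what storing those values might look like; the response shapes ({"contracts": [...]}, {"products": [...]}) and key names are assumptions for illustration, not the actual Contracts API schema, so adapt them to the real responses.

```python
# Hypothetical helpers for the "get and store" steps mentioned under
# Contracts and Products. The payload shapes and key names assumed here
# are illustrative, not the actual Contracts API schema.

def extract_contract_ids(contracts_response: dict) -> list:
    """Collect contract IDs from an assumed contracts payload."""
    return [c["contractId"] for c in contracts_response.get("contracts", [])]

def extract_product_names(products_response: dict) -> list:
    """Collect product names from an assumed products payload."""
    return [p["productName"] for p in products_response.get("products", [])]

# Example payloads matching the assumed shapes (placeholder values):
contracts = {"contracts": [{"contractId": "C-0N7RAC7"}]}
products = {"products": [{"productName": "Example Product"}]}

print(extract_contract_ids(contracts))  # prints: ['C-0N7RAC7']
print(extract_product_names(products))  # prints: ['Example Product']
```

Once stored, these identifiers feed into the stream configuration alongside the group ID.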
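The Data set bullet above distinguishes two log formats: key-value pairs in JSON logs and space-delimited values in structured logs. The comparison below illustrates that difference with the "statusCode": "206" example from the text; the other field names and the field order are assumed examples, since your stream configuration defines both.

```python
import json

# Illustrative comparison of the two log formats: the same data set fields
# as a JSON log line (key-value pairs) and as a structured, space-delimited
# log line. Field names besides "statusCode" and the field order are
# assumed examples defined by the stream configuration.

json_log_line = '{"statusCode": "206", "cacheStatus": "1", "country": "US"}'
structured_log_line = "206 1 US"
structured_field_order = ["statusCode", "cacheStatus", "country"]

# JSON logs: parse the line directly and read fields by key.
record = json.loads(json_log_line)
print(record["statusCode"])  # prints: 206

# Structured logs: split on spaces and pair values with the configured order.
values = dict(zip(structured_field_order, structured_log_line.split(" ")))
print(values["statusCode"])  # prints: 206
```

The practical consequence: JSON logs are self-describing, while structured logs require you to know the field order you chose when configuring the stream.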
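To show how the concepts in this list fit together, here is a hypothetical sketch of the pieces a stream configuration ties together: a group, a contract, a product, up to three properties, a set of data set fields, and a destination. The field names and values below are illustrative assumptions, not the exact DataStream 2 API request schema; consult the API reference for the real request body.

```python
# Hypothetical shape of a stream configuration combining the concepts
# above. Field names and values are illustrative placeholders, not the
# exact DataStream 2 API schema.

stream_config = {
    "streamName": "example-stream",
    "streamType": "RAW_LOGS",          # the only stream type currently available
    "groupId": 12345,                  # group granting access to the properties
    "contractId": "C-0N7RAC7",         # from the Contracts API (placeholder)
    "productName": "Example Product",  # product the properties belong to (placeholder)
    "properties": [111111, 222222],    # up to three, not shared with other streams
    "datasetFields": ["statusCode", "cacheStatus", "country"],  # fields to log
    "destination": {"type": "SPLUNK"},  # one of the supported destinations
}

# Constraint stated in the text: a stream monitors at most three properties.
assert len(stream_config["properties"]) <= 3
print(stream_config["streamType"])  # prints: RAW_LOGS
```

Changing the monitored properties or the data set fields of an existing stream would then go through versioning, as described in the Streams bullet.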