Before configuring DataStream 2, get to know some basic concepts from the DataStream 2 API that we refer to throughout this guide:
- Groups. Each account features a hierarchy of groups, which control access to properties. Using either Control Center or the Identity and Access Management API, account administrators can assign properties to specific groups, each with its own set of users and accompanying roles. Your access to any given property depends on the role set for you in its group. Typically, you need a group identifier to create a stream configuration.
- Contracts. Each account features one or more contracts, each of which has a fixed term of service during which specified Akamai products and modules are active. Typically, you need a contract identifier to create a stream configuration. Get and store the contract ID from your account, for example, by using the Contracts API.
- Products. Each contract enables one or more products, each of which allows you to deploy web properties on the Akamai edge network and receive support from Akamai Professional Services. Products allow you to create new properties, CP codes, and edge hostnames. They also determine the baseline set of a property's rule behaviors. Only some products enable log collection. Typically, you need to know a product name to create a stream configuration. Get and store the name from your account, for example, by using the Contracts API.
- Properties. A property, also referred to as a configuration, lets you control how edge servers respond to various kinds of requests to your assets. You can manage properties from Property Manager or the Property Manager API. Properties apply rules to a set of hostnames, and you can apply only one property at a time to any given hostname. Each property is assigned to a product, which determines the rule behaviors you can use. Streams let you monitor the traffic served by a property, and you can associate a property with only one stream.
- Destinations. A destination, also known as a connector in DataStream 1, represents a third-party configuration where you can send your stream's log files. DataStream 2 supports Amazon S3, Azure Storage, Datadog, Elasticsearch, Google Cloud Storage, Loggly, New Relic, Oracle Cloud, Splunk, Sumo Logic, S3-based destinations, and custom HTTPS endpoints as destinations. See Destinations for more details and the features available for each destination.
- Streams. A stream collects, bundles, and delivers raw request log records to a chosen destination at selected time windows. It lets you control the data set fields you monitor in your logs, the order of these fields in log lines, and the delivery time frames. You can update a stream through versioning any time you want to change the properties it monitors or the data set fields it collects. A stream can monitor up to three properties that aren't part of any other stream. Note that `RAW_LOGS` is the only stream type currently available.
- Templates. Each product provides a number of predefined sets of data, called templates, that you can monitor in a stream. You can configure data set fields for your stream to collect data in all request-response cycles at the edge and send logs to a destination. If needed, you can also add custom data to the log records. Note that `EDGE_LOGS` is the only template currently available.
- Activation history. You can activate and deactivate the latest version of a stream at any time. Activation history lets you track changes in activation status across all versions of a stream.
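Taken together, these identifiers come into play when you assemble a stream configuration. The sketch below shows one way to combine a group ID, contract ID, properties, data set fields, and a destination into a single payload. All field names and values here (`streamName`, `groupId`, `datasetFields`, and so on) are illustrative assumptions, not the definitive request schema; check the DataStream 2 API reference for the exact body.

```python
# Sketch: assembling a minimal stream configuration from the concepts above.
# Every field name and value is an illustrative assumption, not the actual
# DataStream 2 request schema.

def build_stream_config(group_id, contract_id, property_ids, dataset_fields):
    """Combine the account-level identifiers described above into a
    single stream-configuration payload (hypothetical shape)."""
    if len(property_ids) > 3:
        # A stream can monitor up to three properties.
        raise ValueError("a stream can monitor at most three properties")
    return {
        "streamName": "example-stream",               # hypothetical name
        "groupId": group_id,                          # from your group (IAM)
        "contractId": contract_id,                    # from the Contracts API
        "properties": [{"propertyId": p} for p in property_ids],
        "datasetFields": [{"datasetFieldId": f} for f in dataset_fields],
        "destination": {                              # one supported destination
            "destinationType": "HTTPS",               # e.g. a custom HTTPS endpoint
            "endpoint": "https://logs.example.com/ingest",
        },
    }

config = build_stream_config("grp_12345", "ctr_C-0N7RAC7", [123, 456], [1000, 1002])
print(config["streamName"])
```

The helper also enforces the three-property limit noted above, raising a `ValueError` for a fourth property rather than letting an invalid configuration reach the API.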