Stream logs to TrafficPeak
Follow these steps to send DataStream 2 logs to TrafficPeak, the packaged solution for Hydrolix services built on top of Akamai's Connected Cloud infrastructure.
You can use TrafficPeak to visualize data collected in your stream on a customized Grafana dashboard for improved insight into the traffic on your properties.
Before you begin
Before configuring TrafficPeak as a destination to send logs, contact the Akamai account team to collect the following configuration details:
- Ingest endpoint URL in the
  https://<host>/ingest/event?table=<tablename>&token=<token>
  format, including the HTTP streaming ingest token and the table name from your Hydrolix project
- Basic HTTP authentication username and password for your endpoint
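To make the URL format concrete, here is a minimal sketch that assembles the ingest endpoint URL. The host, table name, and token values are hypothetical placeholders; use the details your Akamai account team provides.

```python
from urllib.parse import urlencode

def build_ingest_url(host: str, table: str, token: str) -> str:
    """Assemble the endpoint URL in the
    https://<host>/ingest/event?table=<tablename>&token=<token> format."""
    return f"https://{host}/ingest/event?" + urlencode({"table": table, "token": token})

# Hypothetical values -- substitute the ones from your Akamai account team.
print(build_ingest_url("tp.example.com", "myproject.logs", "SAMPLE_TOKEN"))
# -> https://tp.example.com/ingest/event?table=myproject.logs&token=SAMPLE_TOKEN
```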
Best practice
TrafficPeak logs require choosing the Request time data set field. For Edge DNS and GTM streams, choose the Epoch timestamp field instead.
For streaming logs to TrafficPeak, we recommend choosing the Include all option to log all data set fields in these categories:
- Log information
- Message exchange data
- Request header data
- Network performance data
- Cache data
- Geo data
For steps on how to choose data set fields to log, see Choose data parameters. You can also check the list of all Data set parameters.
If you want to collect logs for 100 billion edge hits in a month or more, see Multiple streams for additional guidance.
How to
- In Destination, select TrafficPeak.
- In Name, enter a human-readable description for the destination.
- In Endpoint URL, enter the path to the endpoint in the
  https://<host>/ingest/event?table=<tablename>&token=<token>
  format, where token is the HTTP streaming ingest token, and tablename is the Hydrolix data set table name.
- Check the Send compressed data box.
- Click Additional options to add custom header details. In Content type, choose application/json.
- Click Validate & Save to validate the connection to the destination and save the details you provided.
  As part of the validation process, the system uses the credentials you enter to push a sample request to the provided endpoint to validate the write access. For structured logs, the data follows the 0,access_validation format; for JSON logs, it follows the {"access_validation":true} format. You can see the data only if the destination validates and you can access the destination storage.
- Optionally, change the Push frequency to receive bundled logs to your destination every 30 or 60 seconds.

Click Next.
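The pieces in the steps above (ingest URL, Basic authentication, gzip compression, JSON content type) can be tied together in a sketch of what a validation-style request looks like on the wire. The host, table, token, and credentials are hypothetical; the request is built but not sent.

```python
import base64
import gzip
import json
from urllib.parse import urlencode
from urllib.request import Request

# Hypothetical values -- substitute the details your Akamai account team provides.
HOST, TABLE, TOKEN = "tp.example.com", "myproject.logs", "SAMPLE_TOKEN"
USER, PASSWORD = "tp_user", "tp_password"

# Endpoint URL in the https://<host>/ingest/event?table=<tablename>&token=<token> format.
url = f"https://{HOST}/ingest/event?" + urlencode({"table": TABLE, "token": TOKEN})

# JSON body like the one the validation step pushes.
body = json.dumps({"access_validation": True}, separators=(",", ":")).encode()

# "Send compressed data" means the payload is gzip-compressed,
# so the endpoint must accept Content-Encoding: gzip.
compressed = gzip.compress(body)

credentials = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
request = Request(
    url,
    data=compressed,
    method="POST",
    headers={
        "Content-Type": "application/json",
        "Content-Encoding": "gzip",
        "Authorization": f"Basic {credentials}",
    },
)
# Built but not sent; urlopen(request) would deliver it to a live endpoint.
print(request.full_url)
```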
Additional options
- Optionally, click Additional options to add mTLS certificates for additional authentication. In Client certificate, enter the:
  - TLS hostname matching the Subject Alternative Names (SANs) present in the SSL certificate for the endpoint URL. If not provided, DataStream 2 fetches the hostname from the URL.
  - CA certificate that you want to use to verify the origin server's certificate. DataStream requires a CA certificate if you provide a self-signed certificate or a certificate signed by an unknown authority. Enter the CA certificate in the PEM format for verification.
  - Client certificate in the PEM format that you want to use to authenticate requests to your destination. If you want to use mutual authentication, provide both the client certificate and the client key.
  - Client key you want to use to authenticate to the backend server in the PEM (non-encrypted PKCS8) format. If you want to use mutual authentication, provide both the client certificate and the client key.

  When enabling mTLS authentication for a custom destination, configure the endpoint with all settings required for authentication with a valid client certificate.
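As an illustration of how these materials fit together on the client side, here is a sketch of a mutual-TLS context using Python's standard ssl module. The PEM file paths are hypothetical placeholders for the files described above.

```python
import ssl

def make_mtls_context(ca_pem: str, client_cert_pem: str, client_key_pem: str) -> ssl.SSLContext:
    """Build a TLS context for mutual authentication with a destination.

    ca_pem: CA certificate used to verify the server's certificate
            (needed for self-signed or unknown-authority certificates).
    client_cert_pem / client_key_pem: client certificate and unencrypted
            (PKCS8) private key, both in PEM format.
    All paths are hypothetical placeholders for this sketch.
    """
    ctx = ssl.create_default_context(cafile=ca_pem)       # verify the server
    ctx.load_cert_chain(certfile=client_cert_pem,         # authenticate the client
                        keyfile=client_key_pem)
    return ctx
```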
- Optionally, go to Custom header and provide the details of the custom header for the log file:
  - In Content type, set the content type to pass in the log file header. You can choose application/json or application/json; charset=utf-8 for destinations that require the charset parameter.
  - If your destination accepts requests only with certain headers, enter the Custom header name and Custom header value. The custom header name can contain alphanumeric, dash, and underscore characters.
Forbidden custom header values
DataStream 2 does not support custom header values containing:
- Content-Type
- Encoding
- Authorization
- Host
- Akamai
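These constraints can be pre-checked before saving a stream. A small sketch, with the caveat that case-insensitive matching on the forbidden values is an assumption of this illustration, not documented behavior:

```python
import re

# Values the list above says DataStream 2 rejects inside a custom header value.
FORBIDDEN = ("Content-Type", "Encoding", "Authorization", "Host", "Akamai")
# Header names: alphanumeric, dash, and underscore characters only.
NAME_RE = re.compile(r"^[A-Za-z0-9_-]+$")

def custom_header_ok(name: str, value: str) -> bool:
    """Pre-check a custom header pair against the documented constraints.
    Case-insensitive substring matching is an assumption of this sketch."""
    if not NAME_RE.fullmatch(name):
        return False
    lowered = value.lower()
    return not any(term.lower() in lowered for term in FORBIDDEN)

print(custom_header_ok("X-Log-Source", "datastream-2"))     # True
print(custom_header_ok("X-Debug", "Authorization bypass"))  # False
```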
- Click Validate & Save to validate the connection to the destination and save the details you provided.
Activate the stream
- In the Summary tab, review the stream details you provided earlier, including monitored properties, data set fields to log, and the destination to send log files.
- Check the Activate stream upon saving box to deploy the stream and activate it on the production network up to an hour after saving, or leave the box unchecked and Activate a stream later.
- Click Save stream to save the stream for later. The stream starts activating if you checked the Activate stream upon saving box earlier.
Enable DataStream behavior in Property Manager
Activating a stream takes up to an hour, but it starts gathering and streaming data only after you add and enable the DataStream behavior in the default rule of your property configuration in Property Manager. Using this behavior, you can set the sample percentage of data (0-100) for this property across your streams.
For details on this step, check Enable the DataStream behavior.
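To make the sampling option concrete: at a sample percentage of N, roughly N% of requests on the property are logged. The sketch below illustrates percentage sampling in general; it is not Akamai's internal algorithm.

```python
import random

def should_sample(sample_percentage: float, rng: random.Random) -> bool:
    """Return True for roughly sample_percentage% of calls (0-100).
    An illustration of percentage sampling, not Akamai's implementation."""
    return rng.uniform(0, 100) < sample_percentage

rng = random.Random(42)  # fixed seed so the sketch is reproducible
sampled = sum(should_sample(10, rng) for _ in range(10_000))
print(f"sampled {sampled} of 10000 requests (~10%)")
```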
Multiple streams
For properties exceeding 100 billion hits per month in traffic, reach out to the Akamai support team for best practices. If needed, you can conveniently Clone a stream with the same destination configuration.