DataStream 2 lets you append additional data fields, such as headers, cookies, or performance data, to each log line. A custom field value can be up to 128 bytes and is typically based on a dynamically generated built-in system variable. To include a custom field in your stream, define it in the Log Request Details behavior in Property Manager or the Property Manager API.

Using this behavior, you can control whether to log the value you set for the custom log field, which cookie information to log, and which of the following headers to include in your logs:

  • User-Agent

  • Accept-Language

  • Cookie

  • Referer

  • Edge Server IP

  • X-Forwarded-For

Follow these steps to add a custom field to your stream:

  1. Add or edit the Log Request Details behavior in each property configuration for which you want to collect a custom log field. Set the customLogField option to the value you want to log.

  2. Configure your stream's datasetFieldIds array to include the identifier of a custom data field. See Create a stream or Edit a stream.
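The two steps above can be sketched as configuration fragments. This is a hedged example, not the exact API payload: the behavior name, option names other than customLogField, and the built-in variable shown are assumptions based on typical Property Manager API rule trees, and the field IDs in datasetFieldIds are placeholders.

```json
{
  "name": "report",
  "options": {
    "logCustomLogField": true,
    "customLogField": "{{builtin.AK_CLIENT_REAL_IP}}",
    "logUserAgent": true,
    "logAcceptLanguage": false
  }
}
```

In the stream configuration, you would then add the identifier of the custom log field dataset field to the stream's datasetFieldIds array alongside the other fields you collect, as described in Create a stream or Edit a stream.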

Dynamic time variables

DataStream 2 lets you use dynamic time variables in folder paths where you store logs and names of log files that you upload to your Amazon S3, Azure Storage, and Google Cloud Storage destinations.

Use {} to enter a dynamic variable. These are supported dynamic variables:

  • %Y for a year. For example, 2021.
  • %m for a month (01-12). For example, 03.
  • %d for a day (01-31). For example, 31.
  • %H for an hour (00-23). For example, 15.

You can combine static values and dynamic variables in a string of up to 255 characters in the path member to point to the folder path where you want to store logs. When the system sends a log file to this path, it resolves the dynamic variables into the current UTC date, time, and hour. Multiple dynamic variables separated by / within one {} create separate folders. For example, {%Y/%m/%d} creates the nested folders 2020/10/05. Multiple variables joined without a separator create one folder. For example, {%Y}{%m}{%d} creates a 20201005 folder.
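The resolution described above can be sketched in a few lines of Python, assuming the variables follow standard strftime semantics. The resolve_path helper is illustrative only, not part of any DataStream API:

```python
# Sketch: how DataStream 2 dynamic time variables in a path resolve,
# assuming strftime semantics for %Y, %m, %d, %H in UTC.
import re
from datetime import datetime, timezone

def resolve_path(path: str, now: datetime) -> str:
    """Replace each {...} region with its strftime-expanded value."""
    return re.sub(r"\{([^}]*)\}", lambda m: now.strftime(m.group(1)), path)

ts = datetime(2020, 10, 5, 16, tzinfo=timezone.utc)
print(resolve_path("logs/{%Y/%m/%d}", ts))  # logs/2020/10/05
print(resolve_path("{%Y}{%m}{%d}", ts))     # 20201005
```

Note how the variables separated by / inside one {} expand into nested folders, while variables in adjacent {} regions with no separator concatenate into a single folder name.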

Here are examples of valid paths in connector configurations and the folder paths created in destinations:

Path                              Folder path
logs/{%Y/%m/%d}                   logs/2022/10/27
{%m}-logs/diagnostics             05-logs/diagnostics
diagnostics/{%Y}/{%m}/{%d}{%H}    diagnostics/2022/11/0516

Log files that the system uploads to your destination follow this naming pattern: uploadFilePrefix-{random-string}-{epoch-timestamp}-{random-string}-uploadFileSuffix. You can customize the uploadFilePrefix and uploadFileSuffix values of these files.

  • You can use static values and dynamic variables in a string of up to 200 characters in uploadFilePrefix names of log files that you upload to destinations. On sending a log file, the system resolves dynamic variables into the current date, time, and hour. You can use multiple dynamic values separated by -, _, or no separator inside or outside the {} regions. Filename prefixes don't allow . characters, as using them may result in errors and data loss. If unspecified, the uploadFilePrefix value defaults to ak.

  • You can use static values in a string of up to 10 characters in uploadFileSuffix names of log files that you upload to destinations. Filename suffixes don't allow dynamic values, and ., /, %, ? characters, as using them may result in errors and data loss. If unspecified, the uploadFileSuffix value defaults to ds.
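The prefix and suffix rules above can be sketched as a validation and naming helper in Python. This is a hedged illustration, assuming the documented constraints only: the functions are hypothetical, and the random-string segments stand in for whatever the system actually generates.

```python
# Sketch of the uploadFilePrefix/uploadFileSuffix rules described above.
import secrets
import time

def validate_prefix(prefix: str) -> None:
    # Up to 200 characters; '.' is not allowed (may cause errors/data loss).
    if len(prefix) > 200 or "." in prefix:
        raise ValueError("invalid uploadFilePrefix")

def validate_suffix(suffix: str) -> None:
    # Up to 10 static characters; no dynamic variables or . / % ? characters.
    if len(suffix) > 10 or any(c in suffix for c in "./%?{}"):
        raise ValueError("invalid uploadFileSuffix")

def build_filename(prefix: str = "ak", suffix: str = "ds", epoch=None) -> str:
    """Assemble prefix-{random}-{epoch}-{random}-suffix (illustrative)."""
    validate_prefix(prefix)
    validate_suffix(suffix)
    rand = lambda: secrets.token_hex(5)[:9]  # stand-in random string
    ts = int(time.time()) if epoch is None else epoch
    return f"{prefix}-{rand()}-{ts}-{rand()}-{suffix}"

print(build_filename(epoch=1650736800))  # e.g. ak-ae47rr5a8-1650736800-ae47rg6hu-ds
```

When both values are unspecified, the defaults ak and ds apply, matching the last row of the examples below.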

Here are examples of valid prefix and suffix names in log files:

Prefix                  Suffix       Filename
{%Y}-{%m}-{%d}          akam         2022-10-27-rps79rkvx-1666884947-dkmzsi6z8-akam
diagnostics{%Y-%m-%d}   logs         diagnostics2022-12-01-8ds3lufkh-1669908947-m1onxoa16-logs
{%m}-diagnostics        delivery     12-diagnostics-dk4j0sh3m-1669856400-9kv08v9oy-delivery
upload-{%m_%d_%H}-file  data         upload-04_23_18-file-gao0pip6y-1650736800-981bz2ipd-data
Unspecified             Unspecified  ak-ae47rr5a8-1650736800-ae47rg6hu-ds