Add log export to S3 object storage

After creating an MSL5 stream, you can enable export of ingest HTTP logs to S3-compatible object storage for traffic analytics and monitoring.

MSL5 supports the following S3-compatible object storage providers:

  • Akamai Linode Object Storage (Recommended)
  • Other S3-compatible object storage such as AWS S3 buckets

1. Workflow

Follow these steps to add a log export.

  1. On the MSL5 main page, locate your stream in the list of streams.
  2. In the Log Export section at the bottom, click the Add log export + button to open a new window.
  3. Fill in the Log export details.
| Field | Description |
| --- | --- |
| Name | A descriptive name for the log export. |
| Status | Toggle between “Active” and “Inactive”. When set to “Active”, the log export is active and running. |
| File prefix | The exported log file prefix. The default value is “msl”. The file prefix supports static values (consisting of [0-9a-zA-Z], _, and - characters, and starting with [0-9a-zA-Z]) in a string of up to 10 characters. |
| File suffix | The exported log file suffix. The default value is “logs”. The file suffix supports static values (same character rules as the prefix) and dynamic variables in a string of up to 100 characters. |
| Frequency | The interval in seconds (30 or 60) after which the system bundles log lines into a file and sends it to the destination. |

Supported dynamic variables (time-related variables are evaluated in UTC):

  • %Y: year. For example, 2025.
  • %m: month (01-12). For example, 03.
  • %d: day (01-31). For example, 31.
  • %H: hour (00-23). For example, 15.
  • %M: minute (00-59). For example, 32.
  • %streamId: the ID of the MSL5 stream. For example, 7d38d63f-632a-4926-bec9-7326c610776d.
  • %contractId: the contract ID of the stream. For example, 222222.
  • %cpTag: the CP Tag of the stream. For example, 333333.
  4. Specify the Destination configuration details.
| Field | Description |
| --- | --- |
| Compress Logs | Enables gzip compression for log files sent to the destination. |
| Endpoint | The host of the S3-compatible object storage bucket. |
| Region | The physical storage location of your S3-compatible object storage bucket. |
| Bucket | The name of the S3-compatible object storage bucket. |
| Path | The path to the folder within your S3-compatible object storage bucket where you want to store logs. The path supports static values (consisting of [0-9a-zA-Z], _, -, and / characters, and starting with [0-9a-zA-Z]) and dynamic variables in a string of up to 200 characters. |
| Access Key | The access key identifier of the S3-compatible object storage bucket. |
| Secret Access Key | The secret access key of the S3-compatible object storage bucket. |

Supported dynamic variables for the path (time-related variables are evaluated in UTC):

  • %Y: year. For example, 2025.
  • %m: month (01-12). For example, 03.
  • %d: day (01-31). For example, 31.
  • %H: hour (00-23). For example, 15.
  • %M: minute (00-59). For example, 32.
  • %streamId: the ID of the MSL5 stream. For example, 7d38d63f-632a-4926-bec9-7326c610776d.
  • %contractId: the contract ID of the stream. For example, 222222.
  • %cpTag: the CP Tag of the stream. For example, 333333.
  5. Click Validate & Save.
📘

A unique Log Destination ID (UUID) will be generated automatically upon saving.

📘

Log destination creation is limited as follows.

  • Maximum log destinations per account: 10
  • Maximum log destinations per stream: 2

Please contact the support team if additional log destinations are needed.
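As a rough illustration of how the dynamic variables described above behave, the sketch below expands a suffix or path template. The function name and substitution logic are assumptions for illustration, not the actual MSL5 implementation; time-related fields are rendered in UTC, as documented.

```python
from datetime import datetime, timezone

def expand_variables(template: str, stream_id: str, contract_id: str,
                     cp_tag: str, now: datetime) -> str:
    # Replace the stream-identity variables first, then the time variables.
    # Time-related variables are evaluated in UTC, per the tables above.
    out = (template
           .replace("%streamId", stream_id)
           .replace("%contractId", contract_id)
           .replace("%cpTag", cp_tag))
    t = now.astimezone(timezone.utc)
    return (out.replace("%Y", f"{t.year:04d}")
               .replace("%m", f"{t.month:02d}")
               .replace("%d", f"{t.day:02d}")
               .replace("%H", f"{t.hour:02d}")
               .replace("%M", f"{t.minute:02d}"))

# For example, a file suffix of "%H-logs" at 15:32 UTC expands to "15-logs".
```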

2. Uploaded File Path and Naming

Multiple log lines (each in JSON format) will be bundled into a file and sent to the destination.

  • The uploaded file path in the bucket is <path>/ingest/. The ingest/ suffix is fixed and appended automatically.
  • The uploaded file name is <file_prefix>-{epoch-timestamp-in-sec}-{six-digit-random-string}-<file_suffix>.
  • The uploaded file extension is .txt if compression is disabled, or .txt.gz if compression is enabled.
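Because each uploaded file is a sequence of JSON log lines, optionally gzip-compressed, a downloaded file can be parsed with a short sketch like the following. The helper name is hypothetical, and the fields inside each log line are not specified here.

```python
import gzip
import json

def read_log_file(data: bytes, compressed: bool) -> list[dict]:
    # Undo the optional gzip compression (.txt.gz), then parse one JSON
    # object per line, skipping any blank lines.
    if compressed:
        data = gzip.decompress(data)
    return [json.loads(line)
            for line in data.decode("utf-8").splitlines()
            if line.strip()]
```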

Example File Path

For a stream with:

  • stream_id: 7d38d63f-632a-4926-bec9-7326c610776d
  • contractId: 222222
  • cpTag: 333333
  • bucket: app-bucket
  • path: app-folder/%streamId-%contractId-%cpTag/year=%Y/month=%m/day=%d/hour=%H/minute=%M
  • file_prefix: msl-%m
  • file_suffix: %H-logs
  • Compression enabled

Resulting file path:

app-bucket/app-folder/7d38d63f-632a-4926-bec9-7326c610776d-222222-333333/year=2025/month=11/day=12/hour=18/minute=30/ingest/msl-11-1762971600-123456-18-logs.txt.gz
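The naming rules can be checked against this example with a small sketch. `object_key` is a hypothetical helper; the epoch timestamp and random string are taken verbatim from the example file name rather than generated.

```python
def object_key(path: str, file_prefix: str, epoch_sec: int,
               random_str: str, file_suffix: str, compressed: bool) -> str:
    # <path>/ingest/<file_prefix>-{epoch}-{random}-<file_suffix>{.txt|.txt.gz}
    ext = ".txt.gz" if compressed else ".txt"
    return f"{path}/ingest/{file_prefix}-{epoch_sec}-{random_str}-{file_suffix}{ext}"

key = object_key(
    path=("app-folder/7d38d63f-632a-4926-bec9-7326c610776d-222222-333333/"
          "year=2025/month=11/day=12/hour=18/minute=30"),
    file_prefix="msl-11",      # "msl-%m" after expansion
    epoch_sec=1762971600,      # taken from the example file name
    random_str="123456",
    file_suffix="18-logs",     # "%H-logs" after expansion
    compressed=True,
)
# Prepending the bucket name "app-bucket/" yields the file path shown above.
```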

📘

Logs for the backup stream ID (7d38d63f-632a-4926-bec9-7326c610776d-b) are delivered according to the primary stream ID's log destination.