Add log export to S3 object storage
After creating an MSL5 stream, you can enable the export of ingest HTTP logs to S3-compatible object storage for traffic analytics and monitoring purposes.
MSL5 supports S3-compatible object storage through the following interfaces:
- Akamai Linode Object Storage (Recommended)
- Other S3-compatible object storage such as AWS S3 buckets
1. Workflow
Follow these steps to add a log export.
- On the MSL5 main page, locate your stream in the list of streams.
- From the Log Export section at the bottom, click the Add log export + button to open a new window.
- Fill in the Log export details.
| Field | Description |
|---|---|
| Name | Enter a descriptive name for the log export. |
| Status | Toggle between “Active” and “Inactive”. When set to “Active”, the log export is active and running. |
| File prefix | The exported log file prefix; the default value is “msl”. The file prefix supports static values (consisting of [0-9a-zA-Z], _, and - characters, starting with [0-9a-zA-Z]) in a string of up to 10 characters. |
| File suffix | The exported log file suffix; the default value is “logs”. The file suffix supports static values (consisting of [0-9a-zA-Z], _, and - characters, starting with [0-9a-zA-Z]) and dynamic variables in a string of up to 100 characters. Supported dynamic variables (time-related variables are evaluated in UTC): %Y for the year (for example, 2025); %m for the month, 01-12 (for example, 03); %d for the day, 01-31 (for example, 31); %H for the hour, 00-23 (for example, 15); %M for the minute, 00-59 (for example, 32); %streamId for the ID of the MSL5 stream (for example, 7d38d63f-632a-4926-bec9-7326c610776d); %contractId for the contract ID of the stream (for example, 222222); %cpTag for the CP Tag of the stream (for example, 333333). |
| Frequency | The frequency in seconds (30 or 60) at which the system bundles log lines into a file and sends it to the destination. |
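The dynamic-variable substitution described above can be sketched as follows. `expand_template` is a hypothetical helper, not part of MSL5; it assumes the stream-related variables are plain string replacements and that the time-related variables behave like their `strftime` equivalents, evaluated in UTC:

```python
from datetime import datetime, timezone

def expand_template(template: str, stream_id: str, contract_id: str,
                    cp_tag: str, now: datetime) -> str:
    """Replace MSL5 dynamic variables in a template with concrete values."""
    # Stream-related variables are replaced first, so that the % signs
    # they introduce are gone before strftime runs.
    out = (template
           .replace("%streamId", stream_id)
           .replace("%contractId", contract_id)
           .replace("%cpTag", cp_tag))
    # Time-related variables share strftime syntax: %Y, %m, %d, %H, %M (UTC).
    return now.strftime(out)

now = datetime(2025, 3, 31, 15, 32, tzinfo=timezone.utc)
print(expand_template("%H-logs", "7d38d63f-632a-4926-bec9-7326c610776d",
                      "222222", "333333", now))   # 15-logs
```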
- Specify the Destination configuration details.
| Field | Description |
|---|---|
| Compress Logs | Enables gzip compression for log files sent to the destination. |
| Endpoint | The host of the S3-compatible object storage bucket. |
| Region | The physical storage location of your S3-compatible object storage bucket. |
| Bucket | The name of the S3-compatible object storage bucket. |
| Path | The path to the folder within your S3-compatible object storage bucket where you want to store logs. The path supports static values (consisting of [0-9a-zA-Z], _, -, and / characters, starting with [0-9a-zA-Z]) and dynamic variables in a string of up to 200 characters. Supported dynamic variables (time-related variables are evaluated in UTC): %Y for the year (for example, 2025); %m for the month, 01-12 (for example, 03); %d for the day, 01-31 (for example, 31); %H for the hour, 00-23 (for example, 15); %M for the minute, 00-59 (for example, 32); %streamId for the ID of the MSL5 stream (for example, 7d38d63f-632a-4926-bec9-7326c610776d); %contractId for the contract ID of the stream (for example, 222222); %cpTag for the CP Tag of the stream (for example, 333333). |
| Access Key | The access key identifier of the S3-compatible object storage bucket. |
| Secret Access Key | The secret access key identifier of the S3-compatible object storage bucket. |
- Click Validate & Save.
A unique Log Destination ID (UUID) will be generated automatically upon saving.
Log destination creation is limited as follows.
- Maximum log destinations per account: 10
- Maximum log destinations per stream: 2
Please contact the support team if additional log destinations are needed.
2. Uploaded File Path and Naming
Multiple log lines (each in JSON format) are bundled into a file and sent to the destination.
- The uploaded file path in the bucket will be `<path>/ingest/`. The `ingest/` suffix is fixed and appended automatically.
- The uploaded file name will be `<file_prefix>-{epoch-timestamp-in-sec}-{random-string-in-six-digits}-<file_suffix>`.
- The uploaded file extension will be `.txt` if compression is disabled or `.txt.gz` if compression is enabled.
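Because each exported file is a bundle of JSON log lines (gzip-compressed when Compress Logs is enabled), a downloaded file can be parsed with the Python standard library alone. This is a minimal sketch; the field names in the sample lines are illustrative, not the actual MSL5 log schema:

```python
import gzip
import json

def read_log_lines(data: bytes, compressed: bool = True):
    """Yield one dict per JSON log line in an exported file."""
    if compressed:                        # .txt.gz files
        data = gzip.decompress(data)
    for line in data.decode("utf-8").splitlines():
        if line.strip():
            yield json.loads(line)

# Build a tiny sample file in memory (illustrative field names only).
sample = b'{"status": 200}\n{"status": 403}\n'
packed = gzip.compress(sample)

for record in read_log_lines(packed):
    print(record["status"])               # 200, then 403
```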
Example File Path
For a stream with:
- stream_id: `7d38d63f-632a-4926-bec9-7326c610776d`
- contractId: `222222`
- cpTag: `333333`
- bucket: `app-bucket`
- path: `app-folder/%streamId-%contractId-%cpTag/year=%Y/month=%m/day=%d/hour=%H/minute=%M`
- file_prefix: `msl-%m`
- file_suffix: `%H-logs`
- Compression enabled
Resulting file path:
`app-bucket/app-folder/7d38d63f-632a-4926-bec9-7326c610776d-222222-333333/year=2025/month=11/day=12/hour=18/minute=30/ingest/msl-11-1762971600-123456-18-logs.txt.gz`
The backup stream ID (`7d38d63f-632a-4926-bec9-7326c610776d-b`) will be logged according to the primary stream ID's log destination.
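Putting the path and naming rules together, the shape of the example above can be reproduced with a short sketch. `build_object_key` is a hypothetical helper, not an MSL5 API; the epoch timestamp and the six-digit random string come from the actual upload, so those two values will differ from any precomputed example:

```python
from datetime import datetime, timezone

# Stream-related variable values from the example (assumed fixed per stream).
VARS = {"%streamId": "7d38d63f-632a-4926-bec9-7326c610776d",
        "%contractId": "222222",
        "%cpTag": "333333"}

def build_object_key(path: str, prefix: str, suffix: str,
                     now: datetime, epoch: int, rand: str,
                     compressed: bool) -> str:
    """Expand path/prefix/suffix templates into the final object key."""
    def expand(t: str) -> str:
        for var, val in VARS.items():     # stream variables first
            t = t.replace(var, val)
        return now.strftime(t)            # then time variables (UTC)
    ext = ".txt.gz" if compressed else ".txt"
    # The ingest/ segment is fixed and appended automatically.
    return (f"{expand(path)}/ingest/"
            f"{expand(prefix)}-{epoch}-{rand}-{expand(suffix)}{ext}")

now = datetime(2025, 11, 12, 18, 30, tzinfo=timezone.utc)
key = build_object_key(
    "app-folder/%streamId-%contractId-%cpTag/"
    "year=%Y/month=%m/day=%d/hour=%H/minute=%M",
    "msl-%m", "%H-logs", now, int(now.timestamp()), "123456", True)
print(key)
```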
