DataStream 2 supports sending log files to an S3-compatible storage as a destination.
The stream uploads logs to this destination in a gzip-compressed file by default. For security reasons, DataStream sends log files over TLS even if S3 policies allow insecure requests.
- Create a dedicated S3-compatible storage bucket in your destination. Make note of the access key ID and secret access key for this bucket.
- Make sure to grant the appropriate permissions (READ, WRITE, LIST, etc.) to access contents in the S3-compatible destination.
- Depending on the destination, you may have to set up and manage server-side encryption (SSE) in the container's settings.
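Before configuring the stream, you can smoke-test the key pair against the bucket with a short script. This is a hedged sketch using boto3, which is not part of DataStream; the probe object name and helper names are illustrative.

```python
def endpoint_url(host: str) -> str:
    """DataStream sends logs over TLS, so use https:// for the check as well."""
    return host if host.startswith("https://") else "https://" + host

def check_bucket_access(host, bucket, access_key, secret_key, region):
    """Probe WRITE, READ, and LIST access with a throwaway object."""
    import boto3  # third-party; install with: pip install boto3

    client = boto3.client(
        "s3",
        endpoint_url=endpoint_url(host),
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_key,
        region_name=region,
    )
    probe = "datastream-preflight-check.txt"  # illustrative object name
    client.put_object(Bucket=bucket, Key=probe, Body=b"ok")    # WRITE
    client.get_object(Bucket=bucket, Key=probe)                # READ
    client.list_objects_v2(Bucket=bucket, Prefix=probe)        # LIST
    client.delete_object(Bucket=bucket, Key=probe)             # cleanup
```

If any call raises an `AccessDenied` error, fix the bucket policy before setting up the stream.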
In Destination, choose S3-compatible.
In Display name, enter a human-readable name for the destination.
In Host, enter the hostname of the S3-compatible destination.
In Bucket, enter the name of the bucket where you want to store logs.
Optional: Provide the Path to the folder within the bucket where you want to store logs. You can use Dynamic variables in folder paths for timestamps, stream ID, and stream version.
Forbidden and special characters

Make sure the Path doesn't contain forbidden characters: `?`, quotation marks (such as `„`), and the grave accent or back tick character (`).

Characters that may require special handling include: ASCII control characters in the 00–1F hex (0–31 decimal) and 7F hex (127 decimal) ranges, non-printable or extended ASCII characters (128–255 decimal), and spaces or sequences of spaces.

You can use `%` only for entering dynamic variables.
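As a local sanity check, you can screen a Path before saving. This sketch is deliberately conservative: it rejects everything in both the forbidden and the special-handling lists above; the function and pattern names are illustrative, not part of DataStream.

```python
import re

# Rejects ?, quotation marks (straight and „), the back tick, ASCII
# control characters (0x00-0x1F, 0x7F), extended ASCII (0x80-0xFF),
# and spaces. % is not rejected: it introduces dynamic variables.
_FORBIDDEN = re.compile("[?\"'„`\x00-\x1f\x7f\x80-\xff ]")

def is_safe_path(path: str) -> bool:
    """Return True if the folder path avoids forbidden/special characters."""
    return _FORBIDDEN.search(path) is None
```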
In Region, enter the region code where the bucket resides.
In Access key ID, enter the access key associated with the S3-compatible bucket.
In Secret access key, enter the secret key associated with the S3-compatible bucket.
Click Validate & Save to validate the connection to the destination, and save the details you provided.
As part of this validation process, the system uses the provided access key identifier and secret access key to create a verification file in your S3-compatible destination folder. You can only see this file if the validation process is successful, and you have access to the S3-compatible bucket and folder that you're trying to send logs to.
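You can also confirm from your side that the verification file landed by listing the destination folder. A hedged sketch, assuming a boto3-style S3 client; since the exact verification file name isn't documented here, it simply lists everything under the folder.

```python
def folder_prefix(folder: str) -> str:
    """Normalize a folder path into an object-key prefix."""
    return folder.rstrip("/") + "/" if folder else ""

def list_destination(client, bucket: str, folder: str) -> list[str]:
    """List object keys under the destination folder (boto3-style client)."""
    resp = client.list_objects_v2(Bucket=bucket, Prefix=folder_prefix(folder))
    return [obj["Key"] for obj in resp.get("Contents", [])]
```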
Optional: In the Delivery options menu, edit the Filename field to change the prefix and suffix for your log files. File name prefixes support Dynamic variables.
For file name prefixes, you shouldn't use the `.` character, as it may result in errors and data loss. File name suffixes don't support dynamic variables or the `.` character.
Optional: Change the Push frequency to receive bundled logs to your destination every 30 or 60 seconds.
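Since delivered files are gzip-compressed, you can decode a downloaded object body with the Python standard library. A minimal sketch, with locally compressed sample bytes standing in for a real downloaded log file:

```python
import gzip
import io

def read_log_lines(gz_bytes: bytes) -> list[str]:
    """Decompress a gzip log bundle and return its lines."""
    with gzip.open(io.BytesIO(gz_bytes), "rt", encoding="utf-8") as f:
        return [line.rstrip("\n") for line in f]

# Stand-in for a downloaded object body (log content is illustrative):
sample = gzip.compress(b"2024-01-01T00:00:00Z GET /index.html 200\n")
print(read_log_lines(sample))
```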