Stream logs to an S3-compatible destination

DataStream 2 supports sending log files to S3-compatible storage as a destination.

The stream uploads logs to this destination as gzip-compressed files by default. For security reasons, DataStream sends log files over TLS even if S3 policies allow insecure requests.
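
If you want to inspect a delivered file outside the DataStream console, the following minimal sketch downloads one gzip-compressed log object and decompresses it. It uses the boto3 library against an S3-compatible endpoint; the endpoint URL, region, credentials, bucket name, and object key are placeholder values that you'd replace with your own.

    import gzip

    import boto3

    # Placeholder endpoint and credentials for an S3-compatible provider.
    s3 = boto3.client(
        "s3",
        endpoint_url="https://s3.example-provider.com",
        region_name="ap-south-1",
        aws_access_key_id="YOUR_ACCESS_KEY_ID",
        aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
    )

    # Fetch one delivered log object and decompress it; DataStream uploads
    # logs as gzip-compressed files by default.
    obj = s3.get_object(
        Bucket="my-logs-bucket",
        Key="datastream-logs/example-logfile.gz",
    )
    print(gzip.decompress(obj["Body"].read()).decode("utf-8"))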

Before you begin

  • Create a dedicated S3-compatible storage bucket in your destination. Make note of the access key ID and secret access key for this bucket.
  • Grant the appropriate permissions (READ, WRITE, LIST, etc.) to access contents in the S3-compatible destination. You can verify these permissions with the sketch after this list.
  • Depending on the destination, you may have to set up and manage server-side encryption (SSE) in the container's settings.
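
Before configuring the stream, you can confirm that the access key pair has the required permissions. This is a minimal sketch that attempts a write, list, read, and delete against the bucket using boto3; the endpoint URL, region, credentials, bucket name, and object key are placeholders for your own values.

    import boto3

    # Placeholder endpoint and credentials for your S3-compatible provider.
    s3 = boto3.client(
        "s3",
        endpoint_url="https://s3.example-provider.com",
        region_name="ap-south-1",
        aws_access_key_id="YOUR_ACCESS_KEY_ID",
        aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
    )

    bucket = "my-logs-bucket"
    key = "datastream-logs/permission-check.txt"

    # WRITE: upload a small test object.
    s3.put_object(Bucket=bucket, Key=key, Body=b"permission check")

    # LIST: the test object should appear in the folder listing.
    listing = s3.list_objects_v2(Bucket=bucket, Prefix="datastream-logs/")
    print([item["Key"] for item in listing.get("Contents", [])])

    # READ: download the test object back.
    print(s3.get_object(Bucket=bucket, Key=key)["Body"].read())

    # Clean up the test object.
    s3.delete_object(Bucket=bucket, Key=key)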

How to

  1. In Destination, choose S3-compatible.

  2. In Display name, enter a human-readable name for the destination.

  3. In Host, enter the hostname of the S3-compatible destination.

  4. In Bucket, enter the name of the bucket where you want to store logs.

  5. Optional: Provide the Path to the folder within the bucket where you want to store logs. You can use Dynamic variables in folder paths for timestamps, stream ID, and stream version.

📘

Forbidden and special characters

Make sure the Path doesn’t contain forbidden characters: \, {, ^, }, %, [, ], <, >, ~, #, |, ?, quotation marks such as ' and ", and the grave accent or backtick character (`).

Characters that may require special handling include: the ASCII 00–1F hex (0–31 decimal) and 7F hex (127 decimal) ranges, non-printable ASCII characters and extended ASCII characters (128–255 decimal), and spaces, including sequences of spaces.

You can use {, }, and % only for entering dynamic variables.

  6. In Region, enter the region code where the bucket resides, for example, ap-south-1.

  7. In Access key ID, enter the access key associated with the S3-compatible bucket.

  8. In Secret access key, enter the secret key associated with the S3-compatible bucket.

  9. Click Validate & Save to validate the connection to the destination, and save the details you provided.

    As part of this validation process, the system uses the provided access key identifier and secret access key to create a verification file in your S3-compatible destination folder. You can see this file only if the validation is successful and you have access to the S3-compatible bucket and folder that you're sending logs to. The sketch after these steps shows one way to confirm the file is present.

  10. Optional: In the Delivery options menu, edit the Filename field to change the prefix and suffix for your log files. File name prefixes support Dynamic variables.

    For file name prefixes, you shouldn't use the . character, as it may result in errors and data loss. File name suffixes don't support dynamic variables or the ., /, %, and ? characters.

  11. Optional: Change the Push frequency to deliver bundled logs to your destination every 30 or 60 seconds.

  12. Click Next.
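
After Validate & Save succeeds, you can confirm that the verification file landed in the destination folder by listing the bucket contents. This is a minimal sketch using boto3; the endpoint URL, region, credentials, bucket name, and path prefix are placeholders for your own configuration.

    import boto3

    # Placeholder endpoint and credentials for your S3-compatible provider.
    s3 = boto3.client(
        "s3",
        endpoint_url="https://s3.example-provider.com",
        region_name="ap-south-1",
        aws_access_key_id="YOUR_ACCESS_KEY_ID",
        aws_secret_access_key="YOUR_SECRET_ACCESS_KEY",
    )

    # List the destination folder; the verification file created during
    # Validate & Save should appear here if validation succeeded.
    response = s3.list_objects_v2(Bucket="my-logs-bucket", Prefix="datastream-logs/")
    for item in response.get("Contents", []):
        print(item["Key"], item["LastModified"])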