Stream logs to Azure Storage

Follow these steps to send DataStream 2 logs to Microsoft Azure Blob Storage.

Azure Storage is an object storage service used to store arbitrarily large amounts of unstructured data and serve it to users over HTTP and HTTPS.

📘

DataStream 2 uploads logs to Azure Storage in a gzip-compressed file. For security reasons, DataStream 2 sends log files over TLS even if Azure container policies allow insecure requests.

If you want improved aggregated metrics, you can use the new DataStream 2 SDK available in our GitHub repository. See this video to learn how to use our SDK for Azure.
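
Because DataStream 2 delivers logs as gzip-compressed blobs, you can retrieve and inspect them with any Azure Blob Storage client. The sketch below uses the azure-storage-blob Python package to download one delivered log file and decompress it; the account, container, and blob names are placeholders for your own values.

```python
import gzip

from azure.storage.blob import BlobServiceClient

ACCOUNT_NAME = "mystorageaccount"  # placeholder: your storage account name
ACCESS_KEY = "<access-key>"        # placeholder: one of the account's two access keys
CONTAINER = "datastream-logs"      # placeholder: your container name
BLOB_NAME = "logs/example-log.gz"  # placeholder: path to a delivered log file

# Authenticate with the storage account access key.
service = BlobServiceClient(
    account_url=f"https://{ACCOUNT_NAME}.blob.core.windows.net",
    credential=ACCESS_KEY,
)
blob = service.get_blob_client(container=CONTAINER, blob=BLOB_NAME)

# Download the gzip-compressed log file and print its lines.
compressed = blob.download_blob().readall()
for line in gzip.decompress(compressed).decode("utf-8").splitlines():
    print(line)
```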

Before you begin

📘

Block blob support

DataStream 2 currently supports streaming data only to block blobs. See Block blobs in Azure.

How to

  1. In Destination, select Azure Storage.

  2. In Name, enter a human-readable description for the endpoint.

  3. In Storage account name, enter the name of the storage account where you want to store logs.

    See Blob storage resources in Azure.

  4. In Container name, enter the name of the container within your account where you want to store logs.

    See Naming resources in Azure.

  5. In Path, enter the path to the folder in the container where you want to store logs—for example, logs or logs/diagnostics. If the folders in the path don't exist, Azure creates them for you. You can use Dynamic variables for timestamps, stream ID, and stream version in folder paths.

    Avoid using / in paths, as it may cause logs to be uploaded to the wrong folder.

  6. In Access key, enter either of the access keys associated with your Azure storage account.

    See Authorization in Azure.

  7. Click Validate & Save to validate the connection to the destination, and save the integration details you provided.

  8. Optionally, in the Delivery options menu, edit the Filename field to change the prefix and suffix for your log files. File name prefixes support Dynamic variables.

    For file name prefixes, you shouldn't use the . character, as it may result in errors and data loss. File name suffixes don't support dynamic variables or the ., /, %, and ? characters. See Naming resources in Azure.

  9. Optionally, change the Push frequency to deliver bundled logs to your destination every 30 or 60 seconds.

  10. Click Next.
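
If you'd like to confirm on your own that the storage account name, container, path, and access key from steps 3 to 6 work together before validating the stream, the sketch below uploads and then deletes a small test block blob, the only blob type DataStream 2 streams to. It uses the azure-storage-blob Python package, and all values are placeholders for your own details.

```python
from azure.storage.blob import BlobServiceClient

ACCOUNT_NAME = "mystorageaccount"  # step 3: storage account name (placeholder)
CONTAINER = "datastream-logs"      # step 4: container name (placeholder)
PATH = "logs/diagnostics"          # step 5: folder path in the container (placeholder)
ACCESS_KEY = "<access-key>"        # step 6: one of the account's access keys (placeholder)

# Authenticate with the storage account access key.
service = BlobServiceClient(
    account_url=f"https://{ACCOUNT_NAME}.blob.core.windows.net",
    credential=ACCESS_KEY,
)
container = service.get_container_client(CONTAINER)

# Upload a small test object as a block blob (the default blob type),
# then clean it up again.
test_blob = f"{PATH}/datastream-connectivity-test.txt"
container.upload_blob(name=test_blob, data=b"connectivity test", overwrite=True)
container.delete_blob(test_blob)
print("Account, container, path, and access key look valid.")
```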