Stream logs to Azure Storage
Follow these steps to send DataStream 2 logs to Microsoft Azure Blob Storage.
Azure Storage is a static file storage service used to store arbitrarily large amounts of unstructured data and serve them to users over HTTP and HTTPS.
DataStream 2 uploads logs to Azure Storage in a gzip-compressed file. For security reasons, DataStream 2 sends log files over TLS even if Azure container policies allow insecure requests.
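If you need to inspect delivered logs, the sketch below shows one way to download and decompress a log file with the azure-storage-blob Python package and the standard gzip module. The account URL, access key, container name, and blob name are placeholders, not values from this guide.

import gzip
import io

from azure.storage.blob import BlobClient

# Placeholder values -- substitute your own storage account, key, container, and blob path.
ACCOUNT_URL = "https://<storage-account>.blob.core.windows.net"
ACCESS_KEY = "<access-key>"

blob = BlobClient(
    account_url=ACCOUNT_URL,
    container_name="datastream-logs",       # hypothetical container name
    blob_name="logs/ds2-sample-file.gz",    # hypothetical delivered log file
    credential=ACCESS_KEY,
)

# Download the gzip-compressed log file and decompress it in memory.
compressed = blob.download_blob().readall()
with gzip.open(io.BytesIO(compressed), mode="rt") as fh:
    for line in fh:
        print(line.rstrip())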
If you want improved aggregated metrics, you can use the new DataStream 2 SDK available in our GitHub repository. See this video to learn how to use our SDK for Azure.
Before you begin
- Before adding Azure Storage as a destination in DataStream 2, create an Azure Storage account in the Azure portal. See Microsoft's account creation documentation.
- Create a dedicated container in an Azure region. See Quickstart in the Azure portal. If you prefer to script this step, see the sketch after this list.
- Make note of the access keys associated with your Azure account. See Manage your access keys.
- Enable TLS transport in the Azure Storage portal to receive DataStream 2 logs.
- Set up and manage server-side encryption (SSE) in the container's settings. See Server-side encryption for managed disks for Azure Storage.
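As an alternative to the Azure portal, the sketch below creates the prerequisite container with the azure-storage-blob Python package. The account URL, access key, and container name are placeholder assumptions; substitute your own values.

from azure.storage.blob import BlobServiceClient

# Placeholder values -- substitute your own storage account and access key.
ACCOUNT_URL = "https://<storage-account>.blob.core.windows.net"
ACCESS_KEY = "<access-key>"

service = BlobServiceClient(account_url=ACCOUNT_URL, credential=ACCESS_KEY)

# Create a dedicated, private container for DataStream 2 logs.
container = service.create_container("datastream-logs")
print("Created container:", container.container_name)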
Block blobs support
DataStream 2 currently supports streaming data only to block blobs. See Block blobs in Azure.
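As a quick check that delivered objects are block blobs, a sketch like the following (again assuming the azure-storage-blob package and placeholder account, key, and container values) lists the blobs in a container and prints each blob's type.

from azure.storage.blob import ContainerClient

container = ContainerClient(
    account_url="https://<storage-account>.blob.core.windows.net",  # placeholder
    container_name="datastream-logs",                               # placeholder
    credential="<access-key>",                                      # placeholder
)

for props in container.list_blobs():
    # Logs written by DataStream 2 should report BlockBlob here.
    print(props.name, props.blob_type)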
How to
- In Destination, select Azure Storage.
- In Name, enter a human-readable description for the endpoint.
- In Storage account name, enter the name of the storage account where you want to store logs. See Blob storage resources in Azure.
- In Container name, enter the name of the container within your account where you want to store logs. See Naming resources in Azure.
- In Path, enter the path to the folder in the container where you want to store logs, for example logs or logs/diagnostics. If the folders in the path don't exist, Azure creates them for you. You can use Dynamic variables for timestamps, stream ID, and stream version in folder paths. Avoid using / in paths, as it may cause logs to be uploaded to the wrong folder. To confirm that logs land under this path, see the sketch after these steps.
- In Access key, enter either of the access keys associated with your Azure account. See Authorization in Azure.
- Click Validate & Save to validate the connection to the destination and save the integration details you provided.
- Optionally, in the Delivery options menu, edit the Filename field to change the prefix and suffix for your log files. File name prefixes support Dynamic variables. Don't use the . character in file name prefixes, as it may result in errors and data loss. File name suffixes don't support dynamic variables or the ., /, %, and ? characters. See Naming resources in Azure.
- Optionally, change the Push frequency to receive bundled logs at your destination every 30 or 60 seconds.
- Click Next.
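Once the stream is active, a sketch like the one below (assuming the azure-storage-blob Python package and placeholder account, key, container, and path values) can confirm that log files arrive under the Path and file name prefix you configured.

from azure.storage.blob import ContainerClient

container = ContainerClient(
    account_url="https://<storage-account>.blob.core.windows.net",  # placeholder
    container_name="datastream-logs",                               # placeholder
    credential="<access-key>",                                      # placeholder
)

# List only blobs whose names start with the configured folder path
# (and optional file name prefix), e.g. "logs/diagnostics/".
for props in container.list_blobs(name_starts_with="logs/diagnostics/"):
    print(props.name, props.size, props.last_modified)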