Stream logs to Google Cloud Storage
DataStream 2 supports sending log files to Google Cloud Storage, a cloud-based storage service that lets you store unlimited amounts of data with low latency, high durability, and worldwide accessibility.
Before you begin
To use Google Cloud Storage as a destination for your logs, you need to:
- Create a Google Cloud project using the Google Resource Manager API or the Google Cloud Console. See Creating and managing projects in the Google Cloud documentation.
- Create a dedicated storage bucket in your Google Cloud account. See Creating storage buckets.
- Create a service account under your project with the storage.objects.create permission or the Storage Object Creator role. See Service accounts.
- Create and download the service account key. See Creating and managing service account keys in the Google Cloud documentation.
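The downloaded key is a JSON file whose private_key field is what DataStream 2 later asks for. Here is a minimal sketch of reading such a file and re-escaping the key into the single-line PEM form with \n symbols; the file name key.json and the field values shown are illustrative, not taken from a real account:

```python
import json

# Illustrative service account key file; a real key.json downloaded from
# Google Cloud has the same top-level fields.
sample_key = {
    "type": "service_account",
    "project_id": "my-project",  # hypothetical project ID
    "private_key": "-----BEGIN PRIVATE KEY-----\nMIIEv...\n-----END PRIVATE KEY-----\n",
    "client_email": "logs-writer@my-project.iam.gserviceaccount.com",
}

with open("key.json", "w") as f:
    json.dump(sample_key, f)

with open("key.json") as f:
    key = json.load(f)

# json.load turns the \n escapes into real newlines; re-escape them to get
# the single-line PEM form that the Private key field expects.
pem_one_line = key["private_key"].replace("\n", "\\n")
print(pem_one_line)
```

Copying the private_key value straight out of the raw JSON file (before parsing) gives the same escaped form.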
How to
- In Destination, select Google Cloud Storage.
- In Display name, enter a human-readable description for the destination. The name can't be longer than 255 characters.
- In Bucket, enter the name of the storage bucket you created in your Google Cloud account. See Bucket naming conventions for Google Cloud Storage.
- In Project ID, enter the unique ID of your Google Cloud project.
- Optionally, set the Path to the folder within your Google Cloud bucket where you want to store logs. You can use Dynamic variables in folder paths for timestamps, stream ID, and stream version.
Folder paths
In Google Cloud Storage, paths work as object names. When you enter a custom path, such as akamai/logs/{%Y}, Google Cloud Storage doesn't create new akamai, logs, and {%Y} folders in the bucket. Instead, the objects are stored in one bucket and named akamai/logs/{%Y}/filename. See Object naming guidelines for details.
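As a rough sketch of how such a path becomes a single object name, assuming the {%Y}-style placeholders follow strftime date codes (an assumption based on the akamai/logs/{%Y} example above, not a documented guarantee):

```python
import re
from datetime import datetime, timezone

def expand_path(path: str, when: datetime) -> str:
    # Replace {%Y}-style placeholders with strftime output; the placeholder
    # syntax here is assumed from the akamai/logs/{%Y} example above.
    return re.sub(r"\{(%[A-Za-z])\}", lambda m: when.strftime(m.group(1)), path)

ts = datetime(2024, 5, 1, tzinfo=timezone.utc)
object_name = expand_path("akamai/logs/{%Y}/{%m}", ts) + "/filename"
print(object_name)  # akamai/logs/2024/05/filename
```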
- In Service account name, enter the name of the service account with the storage.objects.create permission or the Storage Object Creator role.
- In Private key, enter the private_key value from the JSON key file you generated and downloaded from your Google Cloud account. Enter the private key in PEM format with line-break (\n) symbols, for example: -----BEGIN PRIVATE KEY-----\nprivate_key\n-----END PRIVATE KEY-----\n.
- Click Validate & Save to validate the connection to the destination and save the details you provided.
As part of this validation process, the system uses the service account credentials you provided to create a verification file in your Google Cloud Storage bucket, with a timestamp in the filename in the Akamai_access_verification_[TimeStamp].txt format. You can only see this file if the validation process is successful and you have access to the Google Cloud Storage bucket that you're sending logs to.
- Optionally, in the Delivery options menu, edit the Filename field to change the prefix and suffix for your log files. File name prefixes support Dynamic variables.
- Optionally, change the Push frequency to receive bundled logs at your destination every 30 or 60 seconds.
- Click Next to go to the Summary tab.
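After a successful validation, you can look for the verification object in your bucket listing. A hedged sketch of matching its name, assuming only the fixed prefix and .txt suffix named above (the exact format inside [TimeStamp] isn't documented here, so it's matched loosely):

```python
import re

# Matches Akamai_access_verification_[TimeStamp].txt; the timestamp portion
# is matched loosely since its exact format is not specified.
VERIFICATION_RE = re.compile(r"^Akamai_access_verification_.+\.txt$")

names = [
    "Akamai_access_verification_20240501T120000Z.txt",  # hypothetical timestamp
    "logs/akamai/2024/part-0001.gz",
]
matches = [n for n in names if VERIFICATION_RE.match(n)]
print(matches)  # ['Akamai_access_verification_20240501T120000Z.txt']
```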