Export SIA logs

You can export high volumes of event and activity logs from SIA to Linode Object Storage or Microsoft Azure Blob Storage. Log data is stored in your Linode S3-compatible object storage bucket or in your Azure blob container.

To configure this feature:

  • On the Connection Info page (Clients & Connectors > Connection Info), you specify the destination settings for Linode or Azure.
  • On the Scheduled Reports & Notifications page (Reports > Scheduled Reports & Notifications), you select the logs that you want to export. You can export data from the Threat Events, Access Control, and DNS Activity reports. If your organization uses SIA Proxy, you can export data from the Proxy Activity report. To learn more about the data that is exported, see Exported log data.

After these settings are configured, Linode or Azure receives logs. The following applies:

  • This operation exports data at 10-minute intervals.
  • By default, data is compressed into three separate CSV files, one for each log type: DNS Activity, Proxy Activity, and All Events.
  • For Linode, you generate an encryption key that's used to encrypt log data. You must use this key to decrypt the log data as well. For more information, see Decrypt log data stored on Linode.

📘

This feature is currently in limited availability. To use this feature, contact your ​Akamai​ account representative.

Log filename convention

Logs are compressed and exported into your Linode S3 bucket or your Azure blob storage based on the export settings you configure in ​SIA​ (Clients & Connectors > Connection Info). Logs are organized into folders that indicate the type of log that's exported.

Exported logs have this folder structure and use this filename convention:

<logType>/<timestamp>_<internalKey>_<partNumber>.<format>

where:

  • <logType> is the type of logs that were exported. Possible values are DNS_ACTIVITY, PROXY_ACTIVITY, or EVENTS.
  • <timestamp> is the epoch timestamp in milliseconds when data was exported from ​SIA​.
  • <internalKey> is a number used for internal purposes only. You can ignore this value.
  • <partNumber> indicates the log file number in case the file is divided into multiple parts. For example, if the part number is 2, logs were continued in a second file. At this time, a single log file can have a maximum of 5 million rows.
  • <format> is the file format. CSV files are compressed into a gzip (.gz) file.

The following is an example of the folders where you can find logs and the log filename:

DNS_ACTIVITY/1720985285000_15_1.csv.gz
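
If you need to work with exported object keys programmatically, the following is a minimal Python sketch that splits a key into the components described above. The helper name is illustrative, and it assumes the underscore-separated convention shown in the example:

    def parse_export_key(key):
        """Split a key such as DNS_ACTIVITY/1720985285000_15_1.csv.gz into
        the components of the <logType>/<timestamp>_<internalKey>_<partNumber>.<format>
        convention described above."""
        log_type, filename = key.split("/", 1)
        base, file_format = filename.split(".", 1)      # "1720985285000_15_1", "csv.gz"
        timestamp_ms, internal_key, part_number = base.split("_")
        return {
            "log_type": log_type,
            "timestamp_ms": int(timestamp_ms),
            "internal_key": internal_key,               # internal use only; safe to ignore
            "part_number": int(part_number),
            "format": file_format,
        }

    print(parse_export_key("DNS_ACTIVITY/1720985285000_15_1.csv.gz"))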

Exported log data

You can export log data from these ​SIA​ reports:

DNS activity

This data is exported into the DNS activity log file.

Data Field | Description
Detection time | Time the request occurred.
Domain | Domain that was requested.
Action | Policy action that was taken on the request. Indicates whether the request was allowed or blocked.
Query Type | DNS resource record type associated with the request.
Reason | Indicates how the activity was detected. For example, it may have been detected based on a policy, a custom list, or more.
List | If applicable, this value indicates the custom list that applies to the activity.
Category | Category assigned to the activity. If the category is a threat, a threat category is listed.
Observed categories | Additional categories that were observed for the activity.
Policy | Name of the policy that applies to the activity.
Location | Name of the location that's associated with the activity.
Sub-location | If applicable, this is the sub-location that's assigned to the activity.
Source IP | External IP address for the DNS activity.
Internal IP | IP address of the internal client device.
Device Name or ID | Name of the device associated with the activity.
Internal Client Name | Name of the internal client device.
Onramp Type | Indicates how a request was directed to SIA Proxy. One of these values may appear:
  • dns. Indicates a DNS event was forwarded to SIA Proxy.
  • web. Indicates a web (HTTP and HTTPS) request was forwarded to the full web proxy.
  • onramp_dns. Indicates that risky HTTP and HTTPS traffic was forwarded to the selective proxy.
  • etp_client. Indicates the request was directed to SIA Proxy as a result of the client.
  • etp_offnet_client. Indicates the request was directed to SIA Proxy as a result of the client. In this case, the client was off the corporate network.
  • explicit_proxy_tls. Indicates the request was directed to SIA Proxy as a result of an on-premises proxy configuration.
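
If you want to inspect a DNS activity export after it's decrypted and downloaded, the following is a minimal Python sketch that reads the gzipped CSV and tallies the Action column. The filename and the exact column header spellings are assumptions based on the fields above; confirm them against a real export:

    import csv
    import gzip
    from collections import Counter

    # Hypothetical local copy of a decrypted DNS activity export.
    path = "1720985285000_15_1.csv.gz"

    actions = Counter()
    with gzip.open(path, mode="rt", newline="") as f:
        for row in csv.DictReader(f):
            # "Action" follows the field name listed above; verify the header
            # spelling in a real export before relying on it.
            actions[row.get("Action", "unknown")] += 1

    print(actions)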

Proxy activity

This data is exported into the proxy activity log file.

Data Field | Description
Detection time | Time the request occurred.
Domain | Domain that was requested.
URL | URL or path that was requested.
HTTP Method | Action that's performed during a request. May indicate a POST or GET.
Action | Indicates the policy action that was performed on the activity.
Reason | Informs how traffic was identified. Any of these reasons may appear:
  • Akamai Intelligence. Indicates traffic was identified by Akamai or a threat category.
  • Customer Domain Intelligence. Indicates traffic was found for a domain based on a list configuration.
  • Customer URL Intelligence. Indicates traffic was found for a URL based on a list configuration.
  • Sandbox-Dynamic Analysis. Indicates traffic was found with dynamic malware analysis.
  • AV scan. Indicates traffic was found with inline payload analysis.
  • Data Leakage Prevention. Indicates traffic was found as a result of a DLP configuration.
Additionally, if traffic was detected as a result of AVC, these reasons may also be listed depending on the policy action assigned to these areas:
  • Application Risk Level. Indicates traffic was detected based on the risk levels associated with the policy.
  • Category. Indicates traffic was detected based on the category or categories associated with the policy.
  • Application category operation. Indicates traffic was detected based on the category operations associated with the policy.
  • Application. Indicates traffic was detected based on applications associated with the policy.
  • Application Operation. Indicates traffic was detected based on application operations associated with the policy.
Lists | Indicates which list matches this activity. Lists may include a custom list or an intelligence list associated with a threat category.
Categories | Category assigned to the activity. If the category is a threat, a threat category is listed.
Policy | Name of the policy that applies to the activity.
Location | Name of the location that applies to the activity.
Sublocation | If applicable, the name of the sub-location that applies to the activity.
Application Name | Name of the application that applies to the activity.
Application Operation | Name of the application operation that applies to this activity.
Source IP | External IP address associated with the activity.
Destination IP | IP address that was requested.
Internal IP | IP address of the internal client device that was detected by Security Connector.
Internal Client Name | Name of the internal client device that was detected by Security Connector.
Onramp Type | Indicates how activity was directed to SIA Proxy. One of these values may appear:
  • dns. Indicates DNS activity was forwarded to SIA Proxy.
  • web. Indicates a web (HTTP and HTTPS) request was forwarded to the full web proxy.
  • onramp_dns. Indicates that risky HTTP and HTTPS traffic was forwarded to the selective proxy.
  • etp_client. Indicates the request was directed to SIA Proxy as a result of the client.
  • etp_offnet_client. Indicates the request was directed to SIA Proxy as a result of the client. In this case, the client was off the corporate network.
  • explicit_proxy_tls. Indicates the request was directed to SIA Proxy as a result of an on-premises proxy configuration.
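
Similarly, if you want a quick breakdown of how proxied traffic reached SIA Proxy, this sketch tallies a decrypted proxy activity export by the Onramp Type field. The path and header spelling are assumptions; check them against a real export:

    import csv
    import gzip
    from collections import Counter

    # Hypothetical local copy of a decrypted proxy activity export.
    path = "1720985285000_15_1.csv.gz"

    onramps = Counter()
    with gzip.open(path, mode="rt", newline="") as f:
        for row in csv.DictReader(f):
            onramps[row.get("Onramp Type", "unknown")] += 1

    for onramp, count in onramps.most_common():
        print(onramp, count)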

Threat and access control events

Threat events and access control events are exported into the Events log file.

Data Field | Description
Detection Time | Time the request occurred.
Origin | Indicates whether the event was discovered as a result of SIA DNS or SIA Proxy.
Domain | Domain that was requested.
Action | Policy action that was taken on the request. Indicates whether the request was allowed or blocked.
Query Type | DNS resource record type associated with the request.
Event Type | Indicates whether the event was a threat event or an access control event.
  • If aup is shown, the event is an access control event.
  • If security is shown, the event is a threat event.
Reason | Informs how an event was identified. Any of these reasons may appear:
  • Akamai Intelligence. Indicates the event was identified by Akamai or a threat category.
  • Customer Domain Intelligence. Indicates the event was found for a domain based on a list configuration.
  • Customer URL Intelligence. Indicates the event was found for a URL based on a list configuration.
  • Sandbox-Dynamic Analysis. Indicates the event was found with dynamic malware analysis.
  • AV scan. Indicates the event was found with inline payload analysis.
  • Data Leakage Prevention. Indicates the event was found as a result of a DLP configuration.
Additionally, if the event was detected as a result of AVC, these reasons may also be listed depending on the policy action assigned to these areas:
  • Application Risk Level. Indicates the event was detected based on the risk levels associated with the policy.
  • Category. Indicates the event was detected based on the category or categories associated with the policy.
  • Application category operation. Indicates the event was detected based on the category operations associated with the policy.
  • Application. Indicates the event was detected based on applications associated with the policy.
  • Application Operation. Indicates the event was detected based on application operations associated with the policy.
Lists | List that identified the threat as an event. This list can be a custom list or a threat category.
Categories | The overall category of the event. For a threat event, categories can be Malware, Phishing, C&C, DNS Exfiltration, Deny List, or Other (if assigned to a custom list).
Threats | Name of the threat. If a specific name for a threat does not appear, SIA shows a name that classifies the threat. These classifications include:
  • Customer Lists. Domains or IP addresses in a custom list. The domains or IP addresses in these lists are defined by your organization.
  • Known Phishing. Domains or URLs that are used in a social engineering attack to fraudulently obtain personal or classified information. A phishing scam deceives victims into performing an activity that compromises their machine or reveals sensitive information.
  • Known Malware. Domains or URLs that direct victims to malicious websites or are used by applications to harm a network. Malware steals confidential data, compromises data integrity, and disrupts data availability.
  • Known CNC. Domains or URLs that are used for C&C communication. A C&C threat is used to steal data, distribute malware, and disrupt services.
  • File Sharing. Domains or URLs of file sharing services.
  • Aged Out. Indicates the domain was tracked as a threat for some time and may still be a threat. If the proxy is enabled, the proxy determines whether the domain is still a threat.
  • Generic Risky. Indicates there's a risk that the domain may be malicious. If the proxy is enabled, the proxy determines whether it is malicious.
  • Unclassified. Indicates a threat that is not yet classified by SIA.
This field appears for threat events only.
Severity | Indicates the severity level. For more information, see Severity levels. This field appears for threat events only.
Application Name | Web application that violated access control settings in SIA policy. For more information, see Application visibility and control.
Application Risk | Risk level associated with a web application that violated access control settings in SIA policy. For more information, see Application visibility and control.
Application Operation | Application operation that violated access control settings in SIA policy. For more information, see Application visibility and control.
File Name | Name of the file that was involved in the event.
File Type | MIME file type that is downloaded or uploaded. An administrator may assign the block or monitor action to this file type in a policy.
Policy | Security policy or set of rules that's associated with the event.
Location | Location where the event originated.
Sublocation | If applicable, the name of the sub-location where the event originated.
Source IP | External IP address of traffic. This is likely the IP address that's assigned to a location as a result of NAT.
Destination IP | IP address that was requested.
Internal IP | Internal IP address of the user's device.
Device Name/ID | Name or ID of the user's device.
Internal Client Name | Internal client name of the machine that's detected by DNS Forwarder.
Onramp Type | Indicates how a request was directed to SIA Proxy. One of these values may appear:
  • dns. Indicates DNS activity was forwarded to SIA Proxy.
  • web. Indicates a web (HTTP and HTTPS) request was forwarded to the full web proxy.
  • onramp_dns. Indicates that risky HTTP and HTTPS traffic was forwarded to the selective proxy.
  • etp_client. Indicates the request was directed to SIA Proxy as a result of the client.
  • etp_offnet_client. Indicates the request was directed to SIA Proxy as a result of the client. In this case, the client was off the corporate network.
  • explicit_proxy_tls. Indicates the request was directed to SIA Proxy as a result of an on-premises proxy configuration.
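
Because the Event Type field distinguishes access control events (aup) from threat events (security), a decrypted events export can be split along that field. The following is a rough Python sketch; the filename and header spellings are assumptions to confirm against a real export:

    import csv
    import gzip

    # Hypothetical local copy of a decrypted events export.
    path = "1720985285000_15_1.csv.gz"

    threat_events = []
    access_control_events = []
    with gzip.open(path, mode="rt", newline="") as f:
        for row in csv.DictReader(f):
            if row.get("Event Type") == "security":
                threat_events.append(row)
            elif row.get("Event Type") == "aup":
                access_control_events.append(row)

    print(len(threat_events), "threat events")
    print(len(access_control_events), "access control events")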

Configure log export settings

Complete this procedure to configure the destination of ​SIA​ logs.

To define what type of logs are exported, see Enable or disable the export of specific logs.

Before you begin:

  • If you are exporting logs to Linode:
    • Make sure you’ve created a storage bucket in the Linode platform. For instructions, see Create and manage buckets in the ​Akamai​ cloud computing documentation.
    • Make sure you have your access key and secret key. For more information, see Get started with object storage in the ​Akamai​ cloud computing documentation.
    • Generate a 32-byte string in hexadecimal format that you can use as an encryption key for logs. You can provide any unencoded 32-byte string. For example, you can use OpenSSL to generate a 32-character hexadecimal string with the following command (a Python alternative is sketched after this list):
      openssl rand -hex 16

      🚧

      Linode destroys encryption keys immediately after your data is encrypted. Object Storage data that is encrypted with SSE-C is unrecoverable without your encryption key. Make sure you save the encryption key.

  • If you are exporting logs to Azure:
    • Make sure you've created a storage account and a blob container in Azure.
    • Make sure you have the storage account access key and the tenant ID that's associated with your organization's account. You enter these values in the procedure below.

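If you'd rather not use OpenSSL to generate the encryption key, the following is a minimal Python sketch that produces an equivalent 32-character hexadecimal key. It's only an alternative to the openssl command above, not a required step:

    import secrets

    # 16 random bytes rendered as 32 hexadecimal characters, the same shape
    # of output as `openssl rand -hex 16`. Save this value; encrypted data
    # cannot be recovered without it.
    encryption_key = secrets.token_hex(16)
    print(encryption_key)
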
To export logs:

  1. In the Threat Protection menu of Enterprise Center, select Clients & Connectors > Connection Info.
  2. Go to the Log Export Settings section.
  3. In the Destination menu, select Linode S3 Bucket or Azure Blob Storage.
  4. If you selected Linode S3 Bucket, complete these steps:
    1. In the Access Key ID field, enter the access key label.
    2. In the Secure Access Key field, enter the access key.
    3. In the Endpoint field, enter the URL for the Linode bucket.
    4. In the Bucket Name field, enter the name of your Linode bucket.
    5. In the Encryption token field, enter the encryption key that you generated.
  5. If you selected Azure Blob Storage, complete these steps:
    1. In the Account Name field, enter the name of your Azure storage account.
    2. In the Secret Key field, enter the Azure access key.
    3. In the Blob field, enter the name of your container where you want to store exported log files.
    4. In the Tenant ID field, enter the tenant ID that is associated with your organization’s account.
  6. Click Save.

Next Step:

Select the logs that you want to export. For instructions, see Enable or disable the export of specific logs.

Enable or disable the export of specific logs

Complete this procedure to enable or disable the export of a specific log type. CSV files with log data can be exported for DNS activity, proxy activity, and event data.

The user interface indicates that logs are exported to the destination you configured on the Connection Info page.

Before you begin:

Make sure you configure the log export settings. For more information, see Configure log export settings.

To enable or disable the export of specific logs:

  1. In the Threat Protection menu of Enterprise Center, select Reports > Scheduled Reports & Notifications.
  2. Expand the Logs section.
  3. To enable the export of a specific log, do the following:
    • To export DNS activity logs, turn on the toggle for DNS Activity.
    • To export proxy activity logs, turn on the toggle for Proxy Activity.
    • To export logs on all events, turn on the toggle for All Events.
    After you turn on a toggle, wait for the success message that appears before you turn on a toggle for another log.
  4. To disable the export of a specific log, do the following:
    • To disable the export of DNS Activity logs, turn off the toggle for DNS Activity.
    • To disable the export of Proxy Activity logs, turn off the toggle for Proxy Activity.
    • To disable the export of All events logs, turn off the toggle for All Events.

Next Steps:

If you are storing log data on Linode, you will need to decrypt log data. For more information, see Decrypt log data stored on Linode.

Decrypt log data stored on Linode

The encryption key that you generated when you configured the log export settings is used to encrypt the data that's in your Linode bucket. After the data is encrypted, you can decrypt it only with that same key. To decrypt your log data, use one of these methods:

Create and run a python script for decryption

Complete this procedure to create and run a Python script that decrypts log data that's stored in your Linode bucket.

Before you begin:

  • Make sure you are running Python 3.4 or later on your machine. To check your version, enter this command: python3 --version
  • Make sure the boto3 package is installed, because the script below imports it. For example, you can install it with pip install boto3.
  • Make sure you’ve generated an object storage key pair.
  • Make sure you have the encryption key that was used to encrypt the logs.

To create and run a Python script for decryption:

  1. In a code editor, open a new file with the name example.py. In the file, enter the following:

    #!/usr/bin/env python
    
    import boto3
    import os
    import gzip
    import shutil
    import logging
    
    # Static configuration
    ENDPOINT_URL = "<BUCKET_URL>"
    AWS_ACCESS_KEY_ID = "<ACCESS_KEY_ID>"
    AWS_SECRET_ACCESS_KEY = "<SECRET_ACCESS_KEY>" 
    ENCRYPTION_KEY = "<ENCRYPTION_KEY>" 
    BUCKET = "<BUCKET_NAME>"
    SOURCE_FOLDER = "<LINODE_FOLDER>" 
    DESTINATION_DIR = '<LOCAL_PATH>' 
    
    # Set up logging
    logging.basicConfig(level=logging.INFO)
    
    # Ensure the destination directory exists
    os.makedirs(DESTINATION_DIR, exist_ok=True)
    
    # Create an S3 client
    client = boto3.client(
        "s3",
        aws_access_key_id=AWS_ACCESS_KEY_ID,
        aws_secret_access_key=AWS_SECRET_ACCESS_KEY,
        endpoint_url=ENDPOINT_URL
    )
    
    def list_s3_files():
        logging.info(f"Listing files in S3 folder: {SOURCE_FOLDER}.")
        response = client.list_objects_v2(Bucket=BUCKET, Prefix=SOURCE_FOLDER + '/')
        
        files = []
        if 'Contents' in response:
            for obj in response['Contents']:
                files.append(obj['Key'])
        return files
    
    def process_and_copy_file(s3_key):
        local_filename = os.path.basename(s3_key)
        dest_path = os.path.join(DESTINATION_DIR, local_filename)
        
        logging.info(f"Downloading encrypted file from S3: {s3_key}.")
        with open(dest_path, 'wb') as f:
            client.download_fileobj(BUCKET, s3_key, f,
                                    ExtraArgs={"SSECustomerKey": ENCRYPTION_KEY, "SSECustomerAlgorithm": "AES256"})
    
        # Check if the file is compressed and decompress if necessary
        if dest_path.endswith('.gz'):
            logging.info(f"Decompressing the file: {dest_path}.")
            with gzip.open(dest_path, 'rb') as f_in:
                with open(dest_path[:-3], 'wb') as f_out:
                    shutil.copyfileobj(f_in, f_out)
            os.remove(dest_path)  # Remove the original compressed file
            dest_path = dest_path[:-3]  # Update path to decompressed file
        
        logging.info(f"File processed and saved to {dest_path}.")
    
    if __name__ == "__main__":
        try:
            s3_files = list_s3_files()
            for s3_key in s3_files:
                process_and_copy_file(s3_key)
        except Exception as e:
            logging.error(f"An error occurred: {e}")	
    

    where:

    • <ACCESS_KEY_ID> is the Linode access key ID.
    • <SECRET_ACCESS_KEY> is the Linode access key.
    • <BUCKET_URL> is the URL of the bucket’s cluster, including the scheme. Each bucket is hosted by a specific cluster. For example, https://us-east-1.linodeobjects.com.
    • <ENCRYPTION_KEY> is your 32-byte encryption key.
    • <BUCKET_NAME> is the Linode bucket name.
    • <LINODE_FOLDER> is the folder in your Linode bucket that contains the log files you want to download. For example, DNS_ACTIVITY.
    • <LOCAL_PATH> is the path on your local machine where you want to store decrypted log files.
  2. Enter this command to make sure that your script file is executable:
    chmod +x example.py

  3. Enter this command to run the script:
    ./example.py
    Decrypted files are available in the provided path on your local machine.

Use rclone to decrypt log data

Rclone is a command-line tool for syncing files to remote services. You can use it to download decrypted log data by copying the data to a local directory on your machine. Alternatively, you can configure another destination, such as a different directory or bucket in your Linode storage or a location in another cloud storage service. These steps apply to Linux or macOS.

  1. Download and install rclone on Linux or macOS.

  2. Configure rclone with Linode object storage bucket information. Note that as part of this procedure:

    • Choose Other from the list of storage types. Do not select Linode.
    • Enter your Linode object storage access key ID and secret access key when prompted for AWS credentials.
    • For the Access Control List step, select 1 or private.
  3. Go to your config file. You can typically find the config file in this location:
    /<user_directory>/.config/rclone/rclone.conf
    As a result of your configuration, your config file should look like this:

    [test]
    type = s3
    provider = Other
    access_key_id = <ACCESS_KEY_ID>
    secret_access_key = <SECRET_ACCESS_KEY>
    endpoint = <ENDPOINT_URL>
    acl = private
    

    where:

    • <ACCESS_KEY_ID> is the Linode access key ID.
    • <SECRET_ACCESS_KEY> is the Linode access key.
    • <ENDPOINT_URL> is the URL for the region where your Linode bucket is located.
  4. Add these two lines to the config file for the location you just configured:

    sse_customer_algorithm = AES256
    sse_customer_key = <ENCRYPTION_KEY>
    

    where <ENCRYPTION_KEY> is the 32-byte encryption key you’ve configured for Linode and provided in SIA for log export.

  5. If you want to copy log files to a location on your local machine, run this command on your machine. Log files are decrypted as part of this operation.
    rclone copy test:<bucket_name>/<path_to_files> <path_on_local_machine> -P
    where:

    • <bucket_name> is the name of your Linode bucket.
    • <path_to_files> is the path in Linode to your log files.
    • <path_on_local_machine> is the path on your local machine where you want to copy the files.
  6. If you want to configure another storage area in rclone for your decrypted files, do the following:

    1. Configure another remote location with rclone. Repeat step 2 and configure an alternative configuration with these settings:

      [alternative]
      type = s3
      provider = Other
      access_key_id = <ACCESS_KEY_ID>
      secret_access_key = <SECRET_ACCESS_KEY>
      endpoint = <ENDPOINT_URL>
      

      where:

      • <ACCESS_KEY_ID> is the Linode access key ID.
      • <SECRET_ACCESS_KEY> is the Linode access key.
      • <ENDPOINT_URL> is the URL for the region where your Linode bucket is located.

    📘

    Make sure the alternative configuration does not have acl, sse_customer_algorithm, and sse_customer_key values.

    2. Run this command to transfer your decrypted data to an alternative location. You can configure another directory or bucket in Linode, or you can specify another cloud location.

    rclone copy test:<bucket_name>/<path_to_files> alternative:<bucket_name>/<new_path>

    where:

    • <bucket_name> is the name of your Linode bucket. For the alternative location, this can be your Linode bucket or another container in a cloud service.

    • <path_to_files> is the path to your log files.

    • <new_path> is the path where you want to place decrypted log files. If you are placing files in the same Linode bucket as your encrypted log files, this can be a new directory in your bucket.

      📘

      You can use the sync or move commands instead of the copy command. To learn more about these rclone commands, see rclone copy, rclone sync, or rclone move.