Stream logs to Sumo Logic

DataStream 2 supports sending log files to Sumo Logic to help you make data-driven decisions and reduce the time to investigate security and operational issues.

For security reasons, DataStream 2 sends logs over TLS even if Sumo Logic policies allow insecure requests.

The custom header feature lets you optionally choose the content type passed in the log file, and enter the name and value of a header that your destination accepts. See what HTTP headers you can use for Sumo Logic.

Before you begin

In Sumo Logic, configure an HTTP logs and metrics source and configure your Sumo Logic URL endpoint to upload log data. See Sumo Logic source configuration.

How to

  1. In Destination, select Sumo Logic.

  2. In Display name, enter a human-readable description for the destination.

  3. In Endpoint, enter an HTTP source address where you want to send logs. The endpoint URL should follow the https://[SumoEndpoint]/receiver/v1/http format. See Uploading data to an HTTP source in Sumo Logic.

  4. In Collector code, enter the unique HTTP collector code from your Sumo Logic endpoint URL. This is the last string of the URL in the https://[SumoEndpoint]/receiver/v1/http/[UniqueHTTPCollectorCode] format.


Keep your account details safe

The full Sumo Logic endpoint URL can contain the collector code, but you should enter it separately in the Collector code field to hide your Sumo Logic account details.
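To illustrate how a full Sumo Logic URL maps onto the two fields, here is a minimal sketch that splits a hypothetical source URL into the Endpoint and Collector code values. The URL below is a placeholder, not a real collector address.

```python
# Sketch: split a full Sumo Logic HTTP source URL (placeholder value)
# into the Endpoint and Collector code fields that DataStream 2 expects.
full_url = "https://collectors.sumologic.com/receiver/v1/http/AbCdEf123456"

marker = "/receiver/v1/http/"
base, _, collector_code = full_url.partition(marker)

endpoint = base + "/receiver/v1/http"  # goes in the Endpoint field
print(endpoint)        # https://collectors.sumologic.com/receiver/v1/http
print(collector_code)  # AbCdEf123456 -- goes in the Collector code field
```

Entering the collector code separately keeps it out of the Endpoint field, so your account details stay hidden.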

  5. If you want to send compressed gzip files to your destination, check the Send compressed data box.

  6. Click Validate & Save to validate the connection to the destination and save the details you provided.

    As part of this validation process, the system uses the provided credentials to push a sample request to the provided endpoint to verify write access. If you chose the Structured log format, the sample data appears in the 0,access_validation format. For JSON logs, the data follows the {"access_validation":true} format. The sample data appears in your destination only if validation succeeds and you have access to the destination storage.
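The sample records and the optional gzip compression can be sketched as follows. This is not DataStream's actual implementation, only an illustration of the payload shapes described above; the header set is an assumption of what a compressed JSON upload would carry.

```python
import gzip
import json

# The sample records DataStream 2 pushes during validation:
structured_sample = "0,access_validation"  # Structured log format
json_sample = json.dumps({"access_validation": True}, separators=(",", ":"))

# With "Send compressed data" checked, the request body would be
# gzip-compressed and flagged with a Content-Encoding: gzip header.
body = gzip.compress(json_sample.encode("utf-8"))
headers = {
    "Content-Type": "application/json",
    "Content-Encoding": "gzip",
}

# The receiving end transparently decompresses the body:
print(gzip.decompress(body).decode("utf-8"))  # {"access_validation":true}
```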

Additional options

  1. Optionally, click Additional options, and provide the details of the Custom header for the log file:
    • In Content type, set the content type to pass in the log file header. application/json is the only supported content type at this time.
    • If your destination accepts only requests with certain headers, enter the Custom header name and Custom header value. See Supported HTTP headers in the Sumo Logic documentation.


Forbidden custom header values

DataStream 2 does not support custom header values containing:

  • Content-Type
  • Encoding
  • Authorization
  • Host
  • Akamai
  2. Click Validate & Save to validate the connection to the destination and save the details you provided.
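A quick pre-check of a custom header value against the forbidden list above can be sketched like this. The case-insensitive comparison is an assumption for illustration; the helper function name is hypothetical.

```python
# Substrings that DataStream 2 rejects in custom header values (from the
# forbidden list above). Case-insensitive matching is an assumption here.
FORBIDDEN = ("Content-Type", "Encoding", "Authorization", "Host", "Akamai")

def is_allowed_header_value(value: str) -> bool:
    """Return False if the value contains any forbidden substring."""
    lowered = value.lower()
    return not any(item.lower() in lowered for item in FORBIDDEN)

print(is_allowed_header_value("x-trace-id-123"))   # True
print(is_allowed_header_value("Akamai-internal"))  # False
```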

Akamaized hostname as endpoint

This destination supports using Akamaized hostnames as endpoints to send DataStream 2 logs for improved security. When you create a property with your Sumo Logic endpoint URL as the hostname, the property acts as a proxy between DataStream and the destination. As a result, you can filter incoming traffic to your destination endpoint by IP address using the Origin IP Access List behavior, so that only IP addresses that belong to your Akamaized property hostname can send logs to your custom destination. Using Akamaized hostnames as endpoints also requires enabling the Allow POST behavior in your property.

Once the property hostname works as a destination endpoint, you cannot monitor it as a property in this or another stream. If you already monitor a property in DataStream, you cannot use it as a destination endpoint.

To enable this feature:

  1. Go to Property Manager and create a new property. We recommend choosing API Acceleration as the product. See Create a brand new property.

  2. Set your Sumo Logic endpoint URL as the property hostname. See Redirect users to edge servers.

  3. In the main menu, go to CDN > Properties, or enter Properties in the search box.

    The Property Groups page opens.

  4. Click the Property Name link to go to the property you created.

  5. Activate the property on the production network. Only properties active on the production network can serve as DataStream destinations. See Activate property on production.

  6. On the Property Details page, click the Version of your configuration that you want to access in Manage Versions and Activations.

    The Property Manager Editor appears.

  7. In the default rule, click Add Behavior, and select Origin IP Access List. Click Insert Behavior.

    The Origin IP Access List behavior appears in the default rule.

  8. Set the Enable slider in the Origin IP Access List behavior to On. Click Save.

  9. Click Add Behavior, and select Allow POST.

  10. Click Insert Behavior.

    The Allow POST behavior appears in the default rule.

  11. Set the Behavior option in the Allow POST behavior to Allow.

  12. Click Save.



You might need to additionally configure your property to ensure uninterrupted data flow. See Configuration best practices in the Property Manager guide for other behaviors you can configure in your property.

  13. Configure the firewall settings at your destination endpoint to allow access for IP addresses that belong to CIDR blocks for your Akamaized hostname. See the Origin IP Access List behavior for the list of IP addresses to put on the allow list.
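The firewall check above amounts to testing an incoming IP address against a set of allow-listed CIDR blocks. A minimal sketch using Python's standard ipaddress module is below; the CIDR blocks are documentation placeholders, not real Akamai ranges, so substitute the list from the Origin IP Access List behavior.

```python
import ipaddress

# Placeholder CIDR blocks -- replace with the ranges listed for the
# Origin IP Access List behavior in the Akamai documentation.
ALLOWED_CIDRS = [
    ipaddress.ip_network(cidr)
    for cidr in ("192.0.2.0/24", "198.51.100.0/24")
]

def is_allowed(ip: str) -> bool:
    """Return True if the address falls inside any allow-listed block."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in ALLOWED_CIDRS)

print(is_allowed("192.0.2.17"))   # True
print(is_allowed("203.0.113.5"))  # False
```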

After successfully configuring an Akamaized hostname as the destination endpoint, avoid editing an active property’s setup in Property Manager to ensure uninterrupted data flow. Adding, deleting, and editing hostnames and behaviors may cause unexpected behavior in the DataStream application.

We recommend setting up alerts that send email notifications whenever DataStream logs cannot be uploaded to your destination, so you can immediately troubleshoot issues with your property or destination configuration. See Set up alerts.