Stream logs to Sumo Logic
DataStream 2 supports sending log files to Sumo Logic to help you make data-driven decisions and reduce the time to investigate security and operational issues.
For security reasons, DataStream 2 sends logs over TLS even if Sumo Logic policies allow insecure requests.
The custom header feature allows you to optionally choose the content type passed in the log file, and enter the name and value for the header that your destination accepts. See what HTTP headers you can use for Sumo Logic.
Before you begin
In Sumo Logic, configure an HTTP logs and metrics source and configure your Sumo Logic URL endpoint to upload log data. See Sumo Logic source configuration.
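If you'd like to confirm the HTTP source works before creating the stream, you can push a test record to its URL yourself. The lines below are a minimal Python sketch, assuming the requests library and a placeholder source URL; a 200 response means the source accepted the record.

# Minimal sanity check for a Sumo Logic HTTP source. The URL is a placeholder;
# use the one shown for your HTTP logs and metrics source.
import requests

SOURCE_URL = "https://YOUR_SUMO_ENDPOINT/receiver/v1/http/YOUR_UNIQUE_HTTP_COLLECTOR_CODE"
response = requests.post(SOURCE_URL, data="datastream2 connectivity test", timeout=10)
print(response.status_code)  # 200 means the source accepted the test record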
How to
- In Destination, select Sumo Logic.
- In Display name, enter a human-readable description for the destination.
- In Endpoint, enter the HTTP source address where you want to send logs. The endpoint URL should follow the https://[SumoEndpoint]/receiver/v1/http format. See Uploading data to an HTTP source in Sumo Logic.
- In Collector code, enter the unique HTTP collector code from your Sumo Logic endpoint URL, that is, the last string of the URL in the https://[SumoEndpoint]/receiver/v1/http/[UniqueHTTPCollectorCode] format. The sketch after these steps shows how the two values combine into the full source URL.
Keep your account details safe
The full Sumo Logic endpoint URL can contain the collector code, but you should enter it separately in the Collector code field to hide your Sumo Logic account details.
- If you want to send compressed gzip files to your destination, check the Send compressed data box.
- Click Validate & Save to validate the connection to the destination and save the details you provided.
As part of this validation process, the system uses the credentials you provided to push a sample request to the endpoint and validate write access. If you chose the Structured log format, the sample data appears in the 0,access_validation format. For JSON logs, the data follows the {"access_validation":true} format. You can see this data only if the destination validates successfully and you can access the destination storage.
Additional options
- Optionally, click Additional options, and provide the details of the Custom header for the log file:
- In Content type, set the content type to pass in the log file header. application/json is the only supported content type at this time.
- If your destination accepts only requests with certain headers, enter the Custom header name and Custom header value. See Supported HTTP headers in the Sumo Logic documentation. The sketch below shows what such a request might look like.
Forbidden custom header values
DataStream 2 does not support custom header values containing:
- Content-Type
- Encoding
- Authorization
- Host
- Akamai
- Click Validate & Save to validate the connection to the destination and save the details you provided.
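If your destination only accepts requests carrying a specific header, the resulting upload would look roughly like the sketch below. The header name and value are hypothetical, and the guard mirrors the forbidden values listed above; DataStream applies these settings for you, so this is only an illustration.

# Rough sketch of the custom header options. The header name and value are
# hypothetical; the guard reflects the forbidden custom header values above.
import requests

FORBIDDEN_SUBSTRINGS = ("content-type", "encoding", "authorization", "host", "akamai")

def build_headers(custom_name: str, custom_value: str) -> dict:
    """Build upload headers, rejecting custom header values DataStream 2 forbids."""
    if any(bad in custom_value.lower() for bad in FORBIDDEN_SUBSTRINGS):
        raise ValueError(f"Custom header value {custom_value!r} contains a forbidden string")
    return {
        "Content-Type": "application/json",  # the only supported content type
        custom_name: custom_value,
    }

headers = build_headers("X-Example-Token", "example-token-value")  # hypothetical header
requests.post(
    "https://YOUR_SUMO_ENDPOINT/receiver/v1/http/YOUR_UNIQUE_HTTP_COLLECTOR_CODE",
    headers=headers,
    data='{"access_validation":true}',
    timeout=10,
)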
Akamaized hostname as endpoint
This destination supports using Akamaized hostnames as endpoints to send DataStream 2 logs for improved security. When you create a property with a Sumo Logic endpoint URL as hostname, this property acts as a proxy between the destination and DataStream. As a result, you can filter incoming traffic to your destination endpoint by IP addresses using the Origin IP Access List behavior. That means only IP addresses that belong to your Akamaized property hostname can send logs to your custom destination. Using Akamaized hostnames as endpoints also requires enabling the Allow POST behavior in your property.
Once the property hostname works as a destination endpoint, you cannot monitor it as a property in this or another stream. If you already monitor a property in DataStream, you cannot use it as a destination endpoint.
To enable this feature:
- Go to Property Manager and create a new property. We recommend choosing API Acceleration as the product. See Create a brand new property.
- Set your Sumo Logic endpoint URL as the property hostname. See Redirect users to edge servers.
- Go to ☰ > CDN > Properties or just enter Properties in the search box. The Property Groups page opens.
- Click the Property Name link to go to the property you created.
- Activate the property on the production network. Only properties active on the production network can serve as DataStream destinations. See Activate property on production.
- On the Property Details page, in Manage Versions and Activations, click the Version of your configuration that you want to access. The Property Manager Editor appears.
- In the default rule, click Add Behavior, and select Origin IP Access List. Click Insert Behavior. The Origin IP Access List behavior appears in the default rule.
- Set the Enable slider in the Origin IP Access List behavior to On. Click Save.
- Click Add Behavior, and select Allow POST.
- Click Insert Behavior. The Allow POST behavior appears in the default rule.
- Set the Behavior option in the Allow POST behavior to Allow.
- Click Save.
Tip
You might need to configure additional settings in your property to ensure uninterrupted data flow. See Configuration best practices in the Property Manager guide for other behaviors you can configure in your property.
- Configure the firewall settings at your destination endpoint to allow access for IP addresses that belong to CIDR blocks for your Akamaized hostname. See the Origin IP Access List behavior for the list of IP addresses to put on the allow list.
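As a quick sanity check for the firewall step above, you can verify that a source address belongs to one of the CIDR blocks you put on the allow list. The Python sketch below uses hypothetical CIDR blocks; substitute the actual list from the Origin IP Access List behavior.

# Check whether a source address falls inside the allow-listed CIDR blocks.
# The CIDR blocks below are hypothetical; use the ones listed for the
# Origin IP Access List behavior.
import ipaddress

ALLOWED_CIDRS = [ipaddress.ip_network(c) for c in ("192.0.2.0/24", "198.51.100.0/24")]

def is_allowed(source_ip: str) -> bool:
    """Return True if source_ip belongs to one of the allow-listed CIDR blocks."""
    address = ipaddress.ip_address(source_ip)
    return any(address in network for network in ALLOWED_CIDRS)

print(is_allowed("192.0.2.17"))   # True  - inside 192.0.2.0/24
print(is_allowed("203.0.113.5"))  # False - not in the allow list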
After successfully configuring an Akamaized hostname as the destination endpoint, avoid editing an active property’s setup in Property Manager to ensure uninterrupted data flow. Adding, deleting, and editing hostnames and behaviors may cause unexpected behavior in the DataStream application.
We recommend setting up alerts that send e-mail notifications every time DataStream logs cannot be uploaded to your destination, so you can immediately troubleshoot issues with your property or destination configuration. See Set up alerts.