Migration errors

When you use the migration feature in the DataStream 2 API, it returns data objects that reflect common API responses to error and success cases, but also errors that you need to fix before migrating your streams with the Migrate streams operation. The response also contains warnings about features that DataStream 2 supports only partially or not at all.

Running the Prepare migration payload operation returns a JSON object that may contain various messages, errors, warnings, and notices related to your streams. If there are no errors or warnings, the API returns a 200 (OK) response. If the request succeeds but there are errors or warnings to resolve before the migration, the response code is 207 (Multi-Status), and the response body contains a validation object with errors, warnings, and notices arrays:

{
   "validation": {
      "message": {
         "type": "FIX_ALL_ERRORS",
         "title": "Fix errors before bulk stream migration",
         "detail": "There are 1 error(s) and 4 warning(s) for your stream(s). Fix all errors before making a request for bulk stream migration. Please pay attention to the warnings that inform about features partially or not supported in DataStream 2."
      },
      "errors": [
         {
            "detail": "DataStream 2 supports only one destination at this time. Select and pass at least one destination from the 'destinations' array, and provide values required for authentication marked with <ENTER VALUE>. For more details, refer to the Destinations topic in the migration document: {1} ",
            "title": "Single Destination Support",
            "type": "SINGLE_DESTINATION_SUPPORT"
         }
      ],
      "warnings": [],
      "notices": [
         {
            "detail": "DataStream 2 does not support the DataStream Buffer destination.",
            "title": "DataStream Buffer Not Supported",
            "type": "DATASTREAM_BUFFER_NOT_SUPPORTED"
         },
         {
            "detail": "Amazon S3 (Display name: S3Destination) path has been converted to the DataStream 2 dynamic variables format. Previous path: log/edgelogs/{year}/{month}/{day}. Updated path: log/edgelogs/{%Y}/{%m}/{%d}.",
            "title": "Amazon S3 dynamic variables converted in path",
            "type": "S3_DYNAMIC_PATH_CONVERSION"
         },
         {
            "detail": "For the Amazon S3 destination, the log file name prefix (default: ak) and suffix (default: ds) is configurable. To customize these values pass uploadFilePrefix, and uploadFileSuffix under deliveryConfiguration of the stream. For more details, refer to the Destinations topic in the migration document: {0}",
            "title": "Amazon S3 custom file name prefix and suffix support",
            "type": "S3_CONFIGURABLE_FILENAME_SUPPORT"
         }
      ]
   }
}

These errors and warnings result from differences between DataStream 1 and DataStream 2 in terms of available destinations, data set fields, and features. See DataStream 1 and DataStream 2 to compare them. You can also see Errors for general descriptions of the response codes.

Migration errors

Here are some common error messages you may encounter in the validation object after running the Prepare migration payload operation, and how to solve each of them before calling the Migrate streams operation:

SINGLE_DESTINATION_SUPPORT
DataStream 2 currently supports only one destination per stream. Choose one destination for the new stream, collect the details needed for authentication, and pass at least one destination from the destinations/commonDestinations array; see the Amazon S3 sketch after this list for an example.
See Stream logs to a destination for details about each destination.

DESTINATION_AUTHENTICATION_DETAILS_REQUIRED
Some authentication details for the destination are missing, indicated with <ENTER VALUE> in the payload. Fill in the missing values before calling the Migrate streams operation.
See Stream logs to a destination for the details needed for each destination.

DESTINATION_MANDATORY
While DataStream 1 supports streams with no destinations, DataStream 2 requires choosing at least one destination per stream.
See Stream logs to a destination for details about each destination.

DATASTREAM_BUFFER_NOT_SUPPORTED
DataStream 2 does not support DataStream Buffer as a destination to send logs. Provide another destination for your stream.
See DataStream 1 and DataStream 2 for the complete list of supported destinations.

DATADOG_DESTINATION_ENDPOINT_INVALID
Provide a valid Datadog endpoint URL in the http-intake.logs.datadoghq.com/v1/input or http-intake.logs.datadoghq.eu/v1/input format; see the Datadog sketch after this list.
See Stream logs to Datadog for details.

SUMO_DESTINATION_ENDPOINT_INVALID
Provide a valid Sumo Logic endpoint URL in the https://[SumoEndpoint]/receiver/v1/http format.
See Stream logs to Sumo Logic for details.

S3_INVALID_DYNAMIC_PATH
Provide a valid Amazon S3 folder path using dynamic time variables (such as %Y, %m, %d, and %H) in the field marked with <ENTER VALUE>.
See Stream logs to Amazon S3 and Dynamic time variables for details.

S3_DYNAMIC_PATH_TOO_LONG
Provide a valid Amazon S3 folder path that doesn't exceed 255 characters after resolving the dynamic time variables.
See Stream logs to Amazon S3 and Dynamic time variables for details.

SPLUNK_ENDPOINT_NOT_SUPPORTED
Provide a valid Splunk raw endpoint URL in the https://<splunk-host>:8088/services/collector/raw format.
See Stream logs to Splunk for details.

DATA_SET_MANDATORY
Provide at least one data set field supported in DataStream 2.
See Data set fields for differences between data set fields available in DataStream 1 and DataStream 2, and Data set parameters for the complete list of DataStream 2 data set fields with examples.
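
To give a sense of what these fixes look like in practice, here's a minimal, hypothetical fragment of a stream entry in the prepared payload with a single Amazon S3 destination selected, the authentication placeholders filled in, a folder path in the dynamic time variables format, and a custom log file name prefix and suffix. The credential property names (bucket, region, accessKey, secretAccessKey) are illustrative; in your own payload, complete exactly the fields marked with <ENTER VALUE>.

{
   "destinations": [
      {
         "destinationType": "S3",
         "displayName": "S3Destination",
         "bucket": "my-logs-bucket",
         "region": "us-east-1",
         "path": "log/edgelogs/{%Y}/{%m}/{%d}",
         "accessKey": "AKIAEXAMPLEKEY",
         "secretAccessKey": "exampleSecretAccessKey"
      }
   ],
   "deliveryConfiguration": {
      "uploadFilePrefix": "ak",
      "uploadFileSuffix": "ds"
   }
}

At log upload time, the dynamic time variables resolve to dated folders, for example log/edgelogs/2025/01/31, and the resolved path needs to stay within the 255-character limit to avoid the S3_DYNAMIC_PATH_TOO_LONG error.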
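
Similarly, for the endpoint format errors, make sure the endpoint value of the destination entry matches the documented format before migrating. A hypothetical Datadog destination fragment, again with illustrative property names and a made-up token value:

{
   "destinationType": "DATADOG",
   "displayName": "DatadogDestination",
   "endpoint": "http-intake.logs.datadoghq.com/v1/input",
   "authToken": "exampleDatadogApiKey"
}

The same check applies to the Sumo Logic and Splunk endpoint formats listed above.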