Detection methods
Determine which requests come from bots. Then categorize and label those bots based on their characteristics. That's bot detection in a nutshell.
Bots run the gamut from familiar search crawlers to sophisticated e-commerce fraudsters bent on evasion. Bot Manager provides an array of detection methods to help you spot all kinds of automated traffic.
- Akamai-categorized bots. Akamai categorizes known bots to make it easy for you to handle each type as you wish. For unwanted bot categories, you can set a deny action after monitoring their traffic for a time. However, Akamai-categorized bots usually follow robots.txt directives, so instead of denying them, consider adding a robots.txt entry that tells them not to visit your site (see the sketch after this list).
- Custom-categorized bots. To track and handle bot traffic you know is hitting your site, you can create your own bot categories, then define specific bots within each category. For example, if you have internal tools or vendor bots that service your site, you can categorize and define these bots as friendly in order to allow them.
- Transparent detection. These methods detect the many bots that don't voluntarily identify themselves in the User-Agent header the way Akamai-categorized bots do. They evaluate various aspects of the request for traits of a bot, such as incorrect header signatures and common bot-building frameworks. Transparent detection looks for dozens of request anomalies, like out-of-order headers and browser version mismatches, then uses calibrated risk scoring to trigger the action you set (a simplified scoring illustration follows this list).
- Active detection. Active methods require an interaction with the client to confirm that the request comes from a web browser typically used only by a person.
- Behavioral detection (Premier only). Evaluating movement patterns and other interaction details unique to humans, behavioral detection methods cover specific transactional endpoints like login or checkout pages to unmask and fend off bots.
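Because Akamai-categorized bots generally honor robots.txt, a directive like the following can keep an unwanted but well-behaved crawler away without a deny action. `ExampleBot` is a placeholder; substitute the User-agent token the bot actually publishes.

```
# Keep a specific, well-behaved crawler off the entire site
User-agent: ExampleBot
Disallow: /

# All other crawlers remain unaffected
User-agent: *
Disallow:
```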
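The exact signals and weights behind transparent detection aren't public, but the general idea of header-anomaly risk scoring can be sketched in a few lines. The header order, weights, and thresholds below are invented for illustration only; they are not Akamai's detection logic.

```python
# Illustrative only: a toy version of header-anomaly risk scoring.
# The expected header order, weights, and example values are assumptions,
# not Bot Manager's proprietary detections.

BROWSER_HEADER_ORDER = ["host", "connection", "user-agent", "accept",
                        "accept-encoding", "accept-language"]

def anomaly_score(headers: list[tuple[str, str]]) -> int:
    """Return a rough risk score for a request's header set."""
    names = [name.lower() for name, _ in headers]
    score = 0

    # Anomaly 1: headers arrive in an order a stock browser wouldn't send.
    expected = [h for h in BROWSER_HEADER_ORDER if h in names]
    observed = [h for h in names if h in BROWSER_HEADER_ORDER]
    if observed != expected:
        score += 2

    # Anomaly 2: User-Agent claims a browser, but browser-only headers are missing.
    ua = {n.lower(): v for n, v in headers}.get("user-agent", "")
    if "Chrome" in ua and "accept-language" not in names:
        score += 3

    # Anomaly 3: fingerprints of common bot-building frameworks.
    if any(tool in ua for tool in ("python-requests", "curl", "Scrapy")):
        score += 5

    return score

# Example: a scripted client that copies a Chrome User-Agent but sends a
# sparse, out-of-order header set scores higher than a real browser would.
suspicious = [("User-Agent", "Mozilla/5.0 ... Chrome/120.0"),
              ("Host", "www.example.com"),
              ("Accept", "*/*")]
print(anomaly_score(suspicious))  # 5 -> trigger whatever action you configured
```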