Much has changed in the general understanding of bots since Imperva first revealed them to be responsible for the bulk of all website traffic. Today, it is not uncommon to find entire articles (including their own) dedicated to the study of individual bots: their HTTP footprints, points of origin and the nuances of their behavior. Collectively, however, these non-humans are still discussed in terms of two archetypes: Good Bots and Bad Bots.
Good bots are the worker bees of the Internet that assist its evolution and growth. Their owners are legitimate businesses that use bots for automated tasks, including data collection and website scanning.
Bad bots, on the other hand, are the malicious intruders that swarm the Internet and leave a trail of hacked websites and downed services. Their masters are the bad actors of the cyber-security world, from career hackers to script kiddies. In their hands, bots are used to automate spam campaigns, spy on competitors, launch distributed denial of service (DDoS) attacks, or execute vulnerability scans that compromise websites on a large scale.
Historically, good and bad bots together have been responsible for most of the activity on our network. This year, however, Imperva saw a changing of the guard, with humans stepping in to become the Internet's new majority.
This year’s report reveals:
- For the first time, humans were responsible for the majority (51.5%) of all online traffic, up from 38.5% in 2013
- Good bot traffic declined from 31% in 2013 to 19.5% in 2015
- Bad bot traffic remained roughly constant, hovering around 30%