2015 Bot Traffic Report: The Year Humans Reclaimed The Website Traffic Majority


For all that can be said about the advanced technology that goes into the internet, every great thing that is actually on the internet comes from humans. GIFs of Jennifer Lawrence’s facial expressions, recipes for cookies stuffed into brownies, TED Talks you’re going to watch one of these days: all thanks to humans. It seems only fitting, then, that humans should reign supreme when it comes to making up internet traffic. In the past, however, it was bots, both good and bad, outnumbering people on any given website. Until this year, that is.

Taking Over After 2014

For as much time as it seems like we as a population spend on the internet, in years past it’s been bots outnumbering people on websites, especially small to medium-sized sites. In 2014, 44% of website traffic came from humans while 56% came from bots. That was already a stronger human showing than in 2013, when the split was 38.5% human traffic and 61.5% bots. The rise in human traffic has continued steadily, and according to Incapsula’s annual bot traffic report, people officially made up the majority of website traffic in 2015, taking the crown with 51.5%. Bots, of course, accounted for the remaining 48.5%.

A Closer Look At The 2015 Traffic Split

You have to keep in mind that the numbers in the Incapsula report are all relative. That means that when human traffic rises 7.5 percentage points, bot traffic must fall by the same amount, and so it did. But the 48.5% of 2015 traffic that came from bots is divided into good bot and bad bot traffic, and the two groups didn’t lose traffic equally. Unfortunately, bad bots reign supreme in bot traffic, weighing in at 29%, a number that held steady from 2014. Good bot traffic now lags significantly behind at 19.5%, falling from 27% in 2014.

Good bots serve to create order out of the internet, working on behalf of things like search engines and social media sites to scan websites, collect data, and complete other automated tasks that make it easy for internet users to get where they’re going and find what they’re looking for. These are the bots you want on your website; the ones that help you rank in Google results and the like.

As far as bad bots go, in 2015 24.5% of them were found to be impostor Googlebots or imitations of other legitimate bots, designed to bypass security and take advantage of the access given to good bots. Impostor Googlebots are commonly used to launch DDoS attacks. Other types of malicious bots include scrapers, which steal content; spammers, which put those spam links into your comment sections or forums; and hacking tools, which are bots used for intrusions and other attacks.
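Because an impostor Googlebot forges little more than its User-Agent header, it can often be unmasked with a reverse-and-forward DNS check, which is Google’s documented way of verifying its own crawlers. The sketch below is a minimal Python illustration of that idea; the sample IP address is purely for demonstration, and a production check would also need caching and error handling beyond what is shown here.

```python
import socket

def is_verified_googlebot(client_ip: str) -> bool:
    """Check whether an IP that claims to be Googlebot really belongs to Google.

    Method: reverse-DNS the IP, confirm the hostname ends in googlebot.com or
    google.com, then forward-resolve that hostname and make sure it maps back
    to the original IP.
    """
    try:
        hostname, _, _ = socket.gethostbyaddr(client_ip)  # reverse DNS lookup
    except socket.herror:
        return False  # no PTR record, so it cannot be a legitimate Googlebot

    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False

    try:
        forward_ips = socket.gethostbyname_ex(hostname)[2]  # forward confirmation
    except socket.gaierror:
        return False

    return client_ip in forward_ips


# Illustrative only: a request whose User-Agent says "Googlebot" but whose IP
# fails this check is an impostor and can be blocked or challenged.
print(is_verified_googlebot("66.249.66.1"))
```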

People Power & Popularity

The rise in human traffic isn’t evenly distributed across all websites. Bots are still making up the vast majority of traffic on small and medium websites, with humans accounting for just 14.6% and 28.9% of all traffic on sites in those categories, respectively. Even on large websites, which pull in between 10 thousand and 100 thousand visitors per day, bots are still edging out people, with humans accounting for 49.1% of traffic.

The category of websites where humans truly make their mark is the major websites, the so-called Alexa MVPs that rake in anywhere from 100 thousand to millions of visitors every day. Think Facebook, Google, YouTube, Amazon, Wikipedia and Twitter. On these sites, people account for 60.3% of all traffic.

Where Have All The Good Bots Gone?

While the numbers may signal what looks like a sharp decrease in good bot traffic, what the Incapsula report is showing this year is actually a continuation of the traffic trends over the last few years. Human traffic has been steadily increasing, bad bot traffic is staying largely the same around the 30% mark, and good bot traffic is on the decline. Before falling 7.5% from 2014 to 2015, good bot traffic fell 4% from 2013 to 2014.

When Incapsula looked deeper into the numbers, they found that it isn’t that good bots are becoming any less active. Good bots just aren’t able to keep up with the increase in human traffic and bad bot traffic, thus their relative traffic percentage has fallen steadily.

The Positives And Negatives Of People

There’s a reason human traffic and bad bot traffic have both managed to hold steady or grow their share: people are behind bad bots, whether individuals or small criminal and hacker groups, so as the number of people going online increases, so too does the number of people looking to use bots for malicious purposes. Good bots, by contrast, tend to have organizations behind them, ones that are interested in improving the internet. These organizations and their bots are equally interested in all content on the internet; a website’s popularity has no bearing on how often good bots will visit.

Bad bots, on the other hand, have, well, bad people behind them. As a website becomes more popular and attracts more human visitors, it also attracts more bad bot traffic, because cyber criminals see a successful site as an opportunity to steal data and personal information, plant malware, or launch a DDoS attack to take it down.

Why You Need To Guard Against Bad Bots

Essentially, while you’re working away to improve your site’s search engine rankings and social media presence in order to attract more visitors, build a more successful website and make more money, you’re also attracting a larger number of malicious bots. The years of hard work you’ve put into your website can be undone by bad bots in an instant. Whether it’s an intrusion that steals data or a DDoS attack that takes down your website and erodes the trust and loyalty of your visitors and users, the consequences of bad bot activity can reverberate for a long time.

Now that you know what’s out there, you need to take the steps necessary to protect the hard work and hard-earned money you’ve put into your website. Look into web application security that includes bot protection capable of differentiating between good bots, bad bots and potentially bad bots, and treating each accordingly.
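To make the “treating each accordingly” part concrete, here is a toy policy sketch. It is only an assumption about how such a decision layer might be structured, not any particular product’s logic; the category names, the verification flag and the rate threshold are all illustrative.

```python
from enum import Enum

class Verdict(Enum):
    ALLOW = "allow"          # let the request through untouched
    CHALLENGE = "challenge"  # e.g. serve a CAPTCHA or JavaScript check
    BLOCK = "block"          # drop the request outright

def handle_visitor(claims_search_engine: bool,
                   dns_verified: bool,
                   requests_per_minute: int) -> Verdict:
    """Toy policy: welcome verified crawlers, block impostors, and
    challenge anything automated-looking that can't be identified."""
    if claims_search_engine:
        # A self-declared Googlebot that fails DNS verification is
        # almost certainly an impostor.
        return Verdict.ALLOW if dns_verified else Verdict.BLOCK
    if requests_per_minute > 120:  # crude threshold; tune per site
        return Verdict.CHALLENGE
    return Verdict.ALLOW

# Example: a visitor claiming to be a search engine crawler but failing
# verification gets blocked.
print(handle_visitor(claims_search_engine=True, dns_verified=False,
                     requests_per_minute=10))  # Verdict.BLOCK
```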


Benjamin Campbell

Benjamin Campbell is an accomplished and experienced freelance Web developer and writer who has been featured in a number of high-profile publications and Web sites. If he’s not building a new Web site, you’ll find him listening to live music or surfing at the coast.