It's Not Just Raining; It's Hailing - Bots Are Clobbering Your Analytics

[Image: bot traffic in Google Analytics]

This year has been especially bad for anyone who relies on web analytics as a key metric for understanding their business. Non-human traffic (bots) is growing at a staggering rate, and it's beyond a simple annoyance. If Google doesn't figure out how to keep it from wrecking your analytics data, it may eventually undermine the whole content ranking process Google uses.

I can't say I fully understand the reasoning behind bot traffic or what it's supposed to accomplish for the originator. It's not unusual to see hundreds of hits on a website attributable to known bots. They really are web-based cockroaches! They do nothing of value, and they may, over time, be damaging your web ranking (I don't know this for sure, but there is good reason to believe it's possible).

Remember the Sentinels in The Matrix? If your website is the ship in that movie, the Sentinels are the bots. They are gnawing at the outside of your website, trying to get in (and that may ultimately be the purpose). For now they are harmless as far as your website infrastructure goes, but who knows. More concerning is that these visits leave footprints, and the footprint that may be affecting your ranking is bounce rate.

Here's an example of bot traffic on a new site:

Referral source                        Hits   Time on site
free-share-buttons.com...              120    73 seconds
pornhub-forum.uni.me...                 30     9 seconds
best-seo-offer.com...                   12     0 seconds
site12.simple-share-buttons.com...      12     0 seconds
www.Get-Free-Traffic-Now.com...         10     0 seconds

That's a total of 184 hits on a site that only gets 700 hits in a month.

Google likes engaging websites. A bot hits your website, usually spends zero time on it, and doesn't visit multiple pages like a human being would. Now, if it's a random hit, that's probably OK. However, if it's a hit based on a keyword search, it could really mess up your rankings. Currently, these appear to be random hits. But imagine the same programming aimed at Google itself: a programmer could write a bot that runs a simple search and then hits every site in the results. BOOM - bounce rates go through the roof, and Google's algorithms start to question the validity of the results on its search engine results pages. I'm not saying it happens; I'm saying it's not too much of a stretch.
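
To put some rough numbers on that bounce-rate effect, here's a quick back-of-the-envelope sketch in Python using the example site above. The 40% "real" bounce rate is purely an assumption for illustration; the point is how much the bots skew the reported figure.

# Back-of-the-envelope illustration of bounce-rate inflation from bot hits.
# The 40% human bounce rate below is an assumption for illustration only.
total_hits = 700               # monthly hits from the example site above
bot_hits = 184                 # hits from the known-bot referrers listed above
human_hits = total_hits - bot_hits

human_bounce_rate = 0.40       # assumed "real" bounce rate for human visitors
bot_bounce_rate = 1.00         # bots hit one page for zero seconds, i.e. a bounce

reported_bounce_rate = (human_hits * human_bounce_rate
                        + bot_hits * bot_bounce_rate) / total_hits

print(f"Human-only bounce rate: {human_bounce_rate:.0%}")
print(f"Reported bounce rate:   {reported_bounce_rate:.0%}")   # roughly 56%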

For now, you have to be aware of the traffic, and if you use your analytics data at all, you should be filtering the bot traffic. You can set filters in Google Analytics. Here is a good blog post on the subject if you want to get down and dirty with your own analytics. This won't keep the Sentinels from breaching the hull, but it will let you turn a blind eye to this currently innocuous traffic.

http://www.lunametrics.com/blog/2014/08/07/bot-spider-filtering-google-analytics/
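
If you'd rather script the filter than click through the Admin screens, here is a minimal sketch using the Google Analytics Management API (v3) via the google-api-python-client library. It assumes you have a service account JSON key with edit access; the account, property, and view IDs and the spam domain list are placeholders, and the filter does nothing until it's linked to the view. Double-check the field names against the current API reference before relying on this.

from google.oauth2 import service_account
from googleapiclient.discovery import build   # pip install google-api-python-client

ACCOUNT_ID = '12345678'            # placeholder Analytics account ID
WEB_PROPERTY_ID = 'UA-12345678-1'  # placeholder property ID
VIEW_ID = '98765432'               # placeholder view (profile) ID

# Regex of spam referrers to exclude - extend with whatever shows up in your reports.
SPAM_SOURCES = r'free-share-buttons\.com|best-seo-offer\.com|simple-share-buttons\.com'

creds = service_account.Credentials.from_service_account_file(
    'service-account.json',
    scopes=['https://www.googleapis.com/auth/analytics.edit'],
)
service = build('analytics', 'v3', credentials=creds)

# Create an account-level filter that excludes hits whose campaign source
# matches the spam regex.
spam_filter = service.management().filters().insert(
    accountId=ACCOUNT_ID,
    body={
        'name': 'Exclude known referral spam',
        'type': 'EXCLUDE',
        'excludeDetails': {
            'field': 'CAMPAIGN_SOURCE',
            'expressionValue': SPAM_SOURCES,
            'caseSensitive': False,
        },
    },
).execute()

# Link the filter to the view - filters do nothing until they are attached.
service.management().profileFilterLinks().insert(
    accountId=ACCOUNT_ID,
    webPropertyId=WEB_PROPERTY_ID,
    profileId=VIEW_ID,
    body={'filterRef': {'id': spam_filter['id']}},
).execute()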

Addendum: There has been some talk among the "gurus" about using a new bot filter that Google has added. We are starting to use it, and we think you should too. Is there a chance that you could filter out real traffic? Maybe... but ultimately you are using Analytics to track trends. If you are mired in what seems to be a never-ending onslaught of bots, this is the best way to focus on what is real and ignore the rest. Here is how you select this filter (oddly, it's not in your Analytics filter view):

1. Log in to your Google Analytics account.

2. Under the "All Website Data" view, you should see something called "View Settings."

3. Toward the bottom of that screen, you'll see a checkbox for "Exclude all hits from known bots and spiders."

4. Check that box and Google will take care of filtering the bots for you.
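
For what it's worth, the same setting can be flipped programmatically: as I understand the Management API's view (profile) resource, it exposes a botFilteringEnabled flag, as in this sketch (same placeholder IDs and credential assumptions as the filter example above).

from google.oauth2 import service_account
from googleapiclient.discovery import build

creds = service_account.Credentials.from_service_account_file(
    'service-account.json',
    scopes=['https://www.googleapis.com/auth/analytics.edit'],
)
service = build('analytics', 'v3', credentials=creds)

# Turn on "Exclude all hits from known bots and spiders" for one view.
service.management().profiles().patch(
    accountId='12345678',            # placeholder account ID
    webPropertyId='UA-12345678-1',   # placeholder property ID
    profileId='98765432',            # placeholder view (profile) ID
    body={'botFilteringEnabled': True},
).execute()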

If we see any issues with this approach, we'll update this post.

Addendum:

We're finding that Google is not great at filtering out the bots even when you follow this procedure. We found the solution below, and it works very well. (If you have difficulty with this, call us! We can help you with it.)

https://moz.com/blog/stop-ghost-spam-in-google-analytics-with-one-filter
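
The core of that Moz technique is a "valid hostname" include filter: ghost spam is injected straight into your Analytics property without ever loading your pages, so its hostname is fake or missing. Here's a tiny Python sketch of that logic with made-up hostnames and session rows, just to show what the filter's regex is doing.

import re

# Hostnames you actually control - made-up examples, swap in your own.
VALID_HOSTNAMES = re.compile(r'(^|\.)yourdomain\.com$')

# Made-up rows standing in for a Hostname / Source report export.
sessions = [
    {'hostname': 'www.yourdomain.com',     'source': 'google'},
    {'hostname': 'free-share-buttons.com', 'source': 'free-share-buttons.com'},
    {'hostname': '(not set)',              'source': 'best-seo-offer.com'},
]

# Keep only sessions that report a hostname you own; ghost spam fails this test.
real_sessions = [s for s in sessions if VALID_HOSTNAMES.search(s['hostname'])]
print(real_sessions)   # only the www.yourdomain.com row survives

In Analytics itself, this becomes a single include filter on the Hostname field using that same regex.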

 

EVEN SIMPLER: For capturing local traffic, use an include filter and select Country as the parameter (you could also use an exclude filter and select the countries to exclude). For local search in the US, we like this filter, though SOME citations and directories could send referrals from overseas. That's not a big deal for businesses seeking to sell products and services in their local markets.
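
Conceptually, it's the same trick as the hostname filter, just keyed on the Country dimension. A minimal sketch with made-up session rows:

# Keep only sessions from the countries a local business actually serves.
# The rows are made-up; in Analytics this is an include filter on Country.
SERVED_COUNTRIES = {'United States'}

sessions = [
    {'country': 'United States', 'source': 'google'},
    {'country': 'Russia',        'source': 'best-seo-offer.com'},
    {'country': 'United States', 'source': 'yelp.com'},
]

local_sessions = [s for s in sessions if s['country'] in SERVED_COUNTRIES]
print(f"Kept {len(local_sessions)} of {len(sessions)} sessions")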