By “Fake”, I mean traffic that is not generated by humans, but by a myriad of scripts, bots and other automated processes. Some are well-intentioned, but most are not. These busy little bees scan your site for vulnerabilities, WordPress login pages, test scripts and a thousand and one other things.
The only way to find out what all these creepy crawlers are up to is to analyze your website’s log files. The results, however, can be overwhelming compared to tag-based data collection.
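To make that concrete: every request to your server ends up as one line in the access log, and the user agent field is usually the first clue about who (or what) is visiting. Here is a minimal sketch in Python, assuming the common “combined” log format used by Apache and nginx (adjust the pattern if your server is configured differently):

```python
import re

# Regex for the "combined" access log format used by Apache and nginx
# (an assumption about your server's configuration -- adjust if yours differs).
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '
    r'"(?P<referrer>[^"]*)" "(?P<user_agent>[^"]*)"'
)

def parse_line(line):
    """Split one raw log line into named fields; return None if it doesn't match."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

# The user agent field is where most crawlers identify themselves (if they bother to).
sample = ('203.0.113.7 - - [10/Oct/2024:13:55:36 +0000] "GET /wp-login.php HTTP/1.1" '
          '404 162 "-" "Mozilla/5.0 (compatible; SomeBot/2.1)"')
entry = parse_line(sample)
if entry:
    print(entry["ip"], entry["request"], entry["user_agent"])
```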
Ignorance is bliss
Tag-based analytics is blissfully unaware of most of this traffic: bots rarely execute the JavaScript tag, so they simply never show up in your reports. If you’re analyzing log files, things become more difficult. You have a lot more information, but how do you tell which visitors are real and which are not?
Until now, log-file-based analytics has mostly relied on bot detection via the user agent string. This catches most of the “legitimate” bots, which are left out of most reports by default.
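As a rough illustration (not Logaholic’s actual code), user-agent based detection boils down to matching the user agent string against a list of known bot signatures:

```python
# A rough sketch of traditional user-agent based bot detection.
# The signature list is a tiny illustrative sample, not an exhaustive one.
BOT_SIGNATURES = ("bot", "crawler", "spider", "slurp", "curl", "wget", "python-requests")

def looks_like_bot(user_agent):
    """Return True if the user agent matches a known bot signature."""
    ua = user_agent.lower()
    return any(signature in ua for signature in BOT_SIGNATURES)

print(looks_like_bot("Mozilla/5.0 (compatible; Googlebot/2.1)"))   # True
print(looks_like_bot("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")) # False, yet it could still be a script
```

The catch is that only the polite bots announce themselves this way; a malicious script can claim to be any browser it likes, and that is exactly the traffic that slips through.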
Best of both worlds
With behavior-based detection, a visitor is now guilty until proven innocent: every session is treated as a bot until its behavior looks human, not the other way around.
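Logaholic doesn’t spell out its exact rules here, but the idea can be sketched roughly like this: a session starts out classified as a bot and is only promoted to “human” once its behavior looks human. The specific signals below (loading page assets, sending a referrer, plausible pacing between requests) are illustrative assumptions, not the actual criteria:

```python
def classify_session(requests):
    """Guilty until proven innocent: start as "bot", promote to "human"
    only when the session behaves like a person browsing.

    `requests` is a list of dicts with 'path', 'referrer' and
    'seconds_since_previous' keys -- a made-up structure for this sketch.
    """
    loaded_assets = any(
        r["path"].endswith((".css", ".js", ".png", ".jpg")) for r in requests
    )
    sent_referrer = any(r["referrer"] not in ("", "-") for r in requests)
    paced_like_human = all(
        r["seconds_since_previous"] >= 1 for r in requests[1:]
    )

    # Only sessions that pass every human-like check get promoted.
    if loaded_assets and sent_referrer and paced_like_human:
        return "human"
    return "bot"
```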
Most reports in Logaholic display only “human” data by default, but can be switched to “all traffic” to see what the creepy crawlers are up to.
This way you get useful reports not only from the marketing perspective, but also from the security and IT perspective.
All new Logaholic profiles using log files will be automatically set to Behavior Based detection. Existing profiles can be switched manually – but prepare for a massive drop in visitors … and don’t shoot the messenger 🙂