BotoPedia, a site similar to Wikipedia but focused on bots (both good and bad), made its debut today. It's the brainchild of Incapsula, a cloud-based website security and performance service, and the company hopes it will serve as a useful resource for organizations that need help developing Web-based policies.
The name is catchy, even if it doesn't quite roll off the tongue. That's fine, because if it works, it will give policy makers and administrators the ability to identify the bot-based traffic hitting their domains. BotoPedia is designed to allow the identification of bots, pure and simple, helping webmasters and corporate operators differentiate the good from the bad.
A recent Incapsula study of 1,000 customer websites found that 16.3 percent of sites suffer from Googlebot impersonation attacks of some kind. Among those targeted sites, one out of every five visiting Googlebots was actually an impersonator.
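For Googlebot specifically, Google documents a simple way to unmask impersonators: reverse-resolve the visitor's IP address, check that the hostname belongs to a Google-owned domain, then forward-resolve that hostname and confirm it maps back to the same IP. Here is a minimal sketch of that check in Python; the function names are illustrative and not part of BotoPedia itself.

```python
import socket

# Hostnames of genuine Googlebots fall under these Google-owned domains.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def hostname_is_google(hostname: str) -> bool:
    """Check that a reverse-DNS hostname falls under a Google-owned domain."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the domain, then forward-resolve back.

    An impersonator can fake its User-Agent string but cannot control the
    reverse DNS record for its IP, so this round-trip check defeats simple
    spoofing.
    """
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False  # no reverse record at all
    if not hostname_is_google(hostname):
        return False  # hostname is not under a Google domain
    try:
        return socket.gethostbyname(hostname) == ip
    except socket.gaierror:
        return False  # forward lookup failed
```

The double lookup matters: a spoofer could publish a reverse record claiming any hostname, but only Google controls the forward records under googlebot.com, so the hostname will not resolve back to the impersonator's IP.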
“Incapsula has been tracking and mapping thousands of bot types since the service was launched in 2010 and is in constant contact with many of the Bot operators...We decided to make BotoPedia an open project, allowing not only our customers, but also third parties to use and contribute to this active directory; they can do so in conjunction with other tools or practices that keep their sites safe and operating at top speed,” said Gur Shatz, co-founder and CEO of Incapsula.
Currently, only 44 bots are indexed, across categories including search bots, crawlers, feed fetchers, service agents, site monitors, and social media agents. Submissions are made via a form, and the hope is to see the index expand before the year is out.
View it online here.