Bots are busy little bees on the Internet, and the world of mobile computing may be their next frontier.
According to new research from Distil Networks, 2014 was the first year that bots masquerading as mobile web users arrived in droves, and the first time a mobile carrier (T-Mobile USA) appeared on the list of the top 20 Internet service providers serving bad bot traffic.
Overall, bad bots were responsible for more than eight percent of mobile web traffic, according to the report.
“Bots tend to follow the trends in real usage by a lag of six to twelve months,” said Rami Essaid, CEO and co-founder of Distil Networks. “For example, when Chrome started overtaking other web browsers, we saw the bots follow that trend…reporting themselves as Chrome. Now we’re seeing the same thing with mobile. We’re seeing bots running on mobile networks, as malware on end user devices, as well as farms of cheap mobile android devices. Also, mobile sites tend to be easier to scrape because they provide the bots with more structured access to data.”
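The "reporting themselves as Chrome" behavior Essaid describes is simple header spoofing: a scripted client can claim any browser identity it likes, because nothing verifies the User-Agent string. A minimal illustration using Python's standard library (the target URL and user-agent string are purely illustrative, not taken from the report):

```python
import urllib.request

# A scripted client can pose as mobile Chrome simply by setting the
# User-Agent header; the server has no way to verify the claim from
# the header alone. The UA string below is illustrative.
MOBILE_CHROME_UA = (
    "Mozilla/5.0 (Linux; Android 10; SM-G960F) AppleWebKit/537.36 "
    "(KHTML, like Gecko) Chrome/88.0.4324.93 Mobile Safari/537.36"
)

req = urllib.request.Request(
    "https://example.com/",  # placeholder target, never actually fetched here
    headers={"User-Agent": MOBILE_CHROME_UA},
)
# Judged by this header, the request is indistinguishable from a real
# Android phone running Chrome.
```

This is why user-agent strings alone are useless for bot detection, and why vendors instead look at behavioral signals such as request timing and JavaScript execution.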
While good bots such as search engine crawlers can benefit a site, bad bots are key culprits behind everything from unauthorized vulnerability scans to brute force attacks, according to Essaid.
“A bot is a tool; a mechanism by which to automate an action,” Essaid told SecurityWeek. “Hackers use it to break into accounts. Competitors use it for competitive data mining. The minimum threshold for a bad bot would be one that provides no value to the host. Kind of like a parasite. If you don’t provide value back to the site then it’s a bad bot. For example, if you had a retail store and people were secret shopping you, doing competitive intelligence, with no intent of buying anything, then you have a right to ask them to leave.”
The dataset in the report covers the 23 billion bad bot threats the company observed in 2014, as well as good bot and human traffic. The dataset resides in Distil’s Hadoop cluster and includes data from hundreds of customers as well as Distil’s global network of 17 data centers, according to the company.
Overall, bots made up 59 percent of all web traffic in 2014, with bad bots accounting for roughly 22 percent of all traffic. That percentage is actually a drop from 2013, when 24.22 percent of all web traffic could be traced to bad bots.
In 2014, 41 percent of bad bots attempted to enter a website’s infrastructure disguised as legitimate human traffic. Twenty-three percent of them were categorized as ‘highly sophisticated’ and were immune to bot detection methods found in many Web application firewalls. Seven percent of bad bots disguised themselves as good bots such as Googlebot and Bingbot.
“Webmasters allow entry of the Googlebot to their website infrastructure for SEO purposes,” according to the report. “When a bad bot masked as the Googlebot enters a site, it can cause a wide range [of] problems without raising any alarms.”
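The standard defense against this kind of impersonation, and the one Google itself documents, is to verify a claimed Googlebot with a reverse-DNS lookup followed by a forward-DNS check, rather than trusting the user-agent string. A sketch of that check in Python (function names are mine, not from the report; the domain suffixes follow Google's published guidance):

```python
import socket

def is_google_hostname(hostname: str) -> bool:
    """True if a PTR hostname falls under Google's documented crawler domains."""
    return hostname.rstrip(".").endswith((".googlebot.com", ".google.com"))

def verify_googlebot(ip: str) -> bool:
    """Verify a client claiming to be Googlebot.

    1. Reverse-resolve the IP to a hostname (PTR record).
    2. Confirm the hostname is under googlebot.com or google.com.
    3. Forward-resolve that hostname and confirm it maps back to the
       original IP, so an attacker can't fake the PTR record alone.
    """
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)            # reverse DNS
        if not is_google_hostname(hostname):
            return False
        forward_ips = socket.gethostbyname_ex(hostname)[2]   # forward DNS
        return ip in forward_ips
    except OSError:  # covers socket.herror / socket.gaierror
        return False
```

Because the forward lookup must return the original IP, simply registering a hostname like `googlebot.com.attacker.net` or spoofing a PTR record is not enough to pass the check.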
The research can be read here.