Sending Bots to the Sin Bin: How to Reduce Unwanted Web Traffic

"Bandwidth limiting works in conjunction with monitoring and synchronization"

By CBR Staff Writer

When it comes to connected digital experiences, like websites or applications, responsiveness and scalability are everything, writes Liam Crilly, Director of Product Management at NGINX. But even in the age of cloud elasticity, where compute resources can be expanded in real time to meet demand, it can be difficult to maintain high performance when human clicks are supplanted by automated bad actors: bots.

Liam Crilly, Director of Product Management at NGINX.

Bots are nothing new to the internet. The first bots appeared around 1988 in the channels of Internet Relay Chat (IRC), and within only a few years they became instrumental to the operation of the internet itself as search engines employed them to index websites.

In 1995, AOL adopted WebCrawler and then, in 1996, Google created Googlebot (formerly called BackRub). Early bots, though, weren’t always beneficial. Sub7 and Pretty Park, a Trojan and a worm respectively, were released into IRC in 1999. Their purpose was to secretly install themselves on machines that connected to a specific IRC channel and then listen for commands through that connection.

As bot code grew more sophisticated, its applications grew more nefarious. Sometimes installed within other software and sometimes running as applications in their own right (like GTbot), bots began to be connected together into “botnets” that hackers could control as a group and direct at specific network resources, making those resources unavailable by flooding them with fake requests. In 2007, one of the biggest botnets, dubbed “Storm”, infected around 50 million computers and was used for a variety of crimes such as stock price fraud and identity theft. And you can’t talk about bots and botnets without touching on the most hated of automated bot activities: spam. In 2009, a botnet called Cutwail was used to send out 74 billion emails per day.

When it comes down to it, though, not all bots are bad, nor are they all good. They are simply clever programs designed to automate repetitive tasks; how they are employed dictates their impact. When they are used to spider websites on behalf of search engines, they are a clear benefit. Could you imagine humans having to visit website after website to index every page into a searchable database? Or using the web without search engines at all? So what happens when a bot’s intentions aren’t necessarily nefarious but it still causes harm, such as when it scrapes data from web pages? The bot isn’t trying to bring down the website, but it consumes server resources to the extent that responsiveness and performance are significantly undermined for human users. And when a website or app is designed to scale elastically, for example by automatically spinning up resources in a cloud provider like Amazon Web Services, the monetary impact of an out-of-control bot scraping web pages can be devastating.

Internet service providers, content delivery networks, and IT departments have tried a variety of measures to curtail the behavior of bots. In many cases, network operators try to detect bot traffic and stop it outright, for example by returning HTTP 400-level responses to bot requests for resources. But many bots won’t be deterred by such an approach and can switch IP addresses in real time to get around this type of network blockade. In short, “resistance is futile” when it comes to bots: they will find a way to reach the resources they want, at the expense of a degraded experience for end users.
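As a rough illustration of the kind of outright blocking described above, the sketch below classifies requests by User-Agent and rejects suspected bots with an error response. The patterns, upstream name and status code are illustrative assumptions, not a recommended or complete detection scheme; as noted, determined bots simply spoof headers and rotate IPs around this.

```nginx
# Sketch only (inside the http{} context): naive User-Agent blocking.
map $http_user_agent $blocked_bot {
    default                             0;
    "~*(scrapy|python-requests|curl)"   1;   # example patterns, not exhaustive
}

server {
    listen 80;

    location / {
        if ($blocked_bot) {
            return 403;                  # reject suspected bots outright
        }
        proxy_pass http://backend;       # placeholder upstream
    }
}
```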

Although there is no silver bullet for thwarting bot traffic, we have taken a distinctive approach: bandwidth limiting. When network operators rate limit bots (that is, allow only a certain number of requests from a specific IP address), the bots respond by using other vectors to reach the resources they want. Rate limiting is, unfortunately, easy for bots to detect (we help mitigate that somewhat by providing users with five different rate-limiting policies to deploy). Bandwidth limiting, by contrast, is much harder to detect: bots are given very little pipe to come through, no matter how many requests they make. In this scenario, when bot traffic is detected, the offending IP address is placed into the “sin bin”, where the bot can make as many requests as it likes but receives a much slower response.

The British Library is a great example of how NGINX can be used to manage bot traffic. Serving 11 million browser requests per day and up to 7,000 search requests per hour, the British Library knew it needed a solution as spider, crawler, and other bot traffic continued to climb, eventually exceeding 10 percent of total website requests. NGINX gave it a way to mitigate the issue and reduce the impact of bot traffic on the experience of human website visitors.
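To make the contrast concrete, here is a minimal sketch of what the two controls can look like in an NGINX configuration. The zone names, rates, User-Agent patterns and the 10 KB/s “sin bin” bandwidth are assumptions for illustration, not the policies the article or the British Library actually use.

```nginx
# Sketch only (inside the http{} context).

# Rate limiting: cap requests per client IP (easier for bots to detect and evade).
limit_req_zone $binary_remote_addr zone=per_ip:10m rate=10r/s;

# Placeholder bot detection: choose a per-request bandwidth cap by User-Agent.
map $http_user_agent $bot_bandwidth {
    default                     0;        # humans: 0 = no bandwidth cap
    "~*(bot|crawler|spider)"    10240;    # suspected bots: ~10 KB/s responses
}

server {
    listen 80;

    location / {
        limit_req zone=per_ip burst=20 nodelay;   # classic rate limiting
        set $limit_rate $bot_bandwidth;           # bandwidth limiting: slow, not block
        proxy_pass http://backend;                # placeholder upstream
    }
}
```

The point of the second control is that quarantined clients still get answers, just through a very narrow pipe, which is harder for a bot to distinguish from an overloaded server.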

But that’s not the whole story when it comes to controlling bot traffic. Although bandwidth limiting is a powerful tool, it works in conjunction with two others, monitoring and synchronization, to provide a multi-layered solution. With monitoring, our users get a view into the “sin bin” through an API endpoint, showing how many requests and IP addresses have been quarantined. No more combing through logs.
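The article doesn’t show that endpoint, but as a rough sketch, exposing the NGINX Plus API on an internal port lets operators query counters (for example, the per-zone request-limit statistics) with a single HTTP call. The port, access rules and API version below are assumptions.

```nginx
# Sketch only: internal server block exposing the NGINX Plus API for monitoring.
server {
    listen 127.0.0.1:8080;

    location /api {
        api write=on;        # NGINX Plus API module
        allow 127.0.0.1;     # keep the API off the public network
        deny  all;
    }
}
```

A query such as `curl http://127.0.0.1:8080/api/6/http/limit_reqs` then returns per-zone counts of passed, delayed and rejected requests; the API version number and exact counter names depend on the NGINX Plus release in use.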

But what makes it truly powerful in the fight against bots is data synchronization. Information about quarantined bot traffic on one NGINX installation is shared with all other NGINX instances, making it a global anti-botnet and a proactive, rather than reactive, approach to bot traffic mitigation. That makes our service not only a load balancer, reverse proxy, API gateway, and all-round web serving platform, but also a distinctive answer to a common and growing bot traffic problem. Rather than scrambling to solve an unsolvable problem, DevOps and network operations teams can deploy NGINX, configure bandwidth limiting, and take a proactive approach to ensuring their human users get a high-performing and uninterrupted digital experience.
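As a hedged sketch of what that sharing can look like, NGINX Plus instances can replicate shared-memory zones across a cluster via the zone_sync module; zones declared elsewhere with the `sync` parameter (for example `limit_req_zone ... sync;`) are then kept consistent on every node, so a bot quarantined by one instance is known to all of them. The hostnames and port below are placeholders.

```nginx
# Sketch only: cluster-wide state sharing with NGINX Plus zone synchronization.
stream {
    server {
        listen 9000;                                      # sync traffic between nodes
        zone_sync;
        zone_sync_server nginx-node1.example.internal:9000;   # placeholder peers
        zone_sync_server nginx-node2.example.internal:9000;
    }
}
```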

See also: RPA is Driving New Ways to Create and Deploy Bots
