See if this situation sounds familiar. You’re going through your website analytics and see a spike in traffic one day. Great news! So you dig in to figure out why so you can replicate it. Did you post a new blog that day? Send an email? Maybe someone else linked back to your website? Then you look a little further and realize you got hit by a wave of bot traffic for an hour at 3am. The analytics look good, but the numbers aren’t real.

Other bots are more malicious. You get a flood of fake leads because bots keep filling out your online forms. Some bots even plant malicious keywords or backlinks on your site that tank your rankings and wipe out all the positive growth your website has seen through SEO.

So what can be done? Bots are hard to track, and the people who design them do their job well: they get in and out before you realize what’s happening. Read through our list below for ways you can protect your website from bot traffic.


Update your robots.txt file

The robots.txt file added to your website tells bots how to crawl it for information. It’s often used as part of SEO to tell Google and other search engines how to crawl your site, and it’s the first thing spiders check when they arrive. An incorrect or poorly configured robots.txt file can block pages that Google will simply never crawl.

Granted, spam bots or bots designed to cause havoc on websites won’t necessarily follow the rules outlined in your robots.txt file. But this can easily be your first line of defense to help you manage the bots that crawl your website. 
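
For reference, a very basic robots.txt file might look something like this (the paths and the “BadBot” name below are only placeholders for whatever applies to your own site):

    # Rules for all well-behaved crawlers
    User-agent: *
    Disallow: /admin/
    Disallow: /cart/

    # Block a specific crawler you never want on the site
    User-agent: BadBot
    Disallow: /

    Sitemap: https://www.example.com/sitemap.xml

Reputable search engines respect these rules, and the file needs to live at the root of your domain (yoursite.com/robots.txt) to be found.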


Set up a CAPTCHA or other extra step

Probably one of the best ways to protect your forms and lead gen funnels from malicious bots is a CAPTCHA system. It’s the system that makes you identify pictures of buses or type letters hidden behind lines or a pattern. Beyond feeling like an annoying extra step as a user, it’s a powerful tool for protecting your site from bots: most of them can’t identify the pictures or distorted letters, so the challenge acts as an extra layer of protection. You can add it to your website login, a lead form, or even a newsletter signup. For forms, it’s essentially your first layer of defense against these bots.
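
As a rough sketch of what happens behind the scenes, here’s how a form handler might verify a Google reCAPTCHA token on the server before accepting a submission. This assumes reCAPTCHA v2 and Python with the requests library, and the secret key shown is a placeholder:

    import requests

    RECAPTCHA_SECRET = "your-secret-key"  # placeholder; issued when you register your site with reCAPTCHA

    def captcha_passed(form_data):
        """Return True only if Google's siteverify endpoint confirms the token."""
        token = form_data.get("g-recaptcha-response", "")
        if not token:
            # The widget never ran -- likely a bot posting the form directly.
            return False
        resp = requests.post(
            "https://www.google.com/recaptcha/api/siteverify",
            data={"secret": RECAPTCHA_SECRET, "response": token},
            timeout=5,
        )
        return resp.json().get("success", False)

    # In your form handler, only save the lead or send the email if captcha_passed(...) returns True.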


Carefully monitor where your site traffic comes from

One of the best things you can do is simply stay vigilant. Get to know your website analytics intimately. You should know your typical traffic patterns. Map out the customer journey through your website. Know where visitors come in and where they leave. Know your keywords and your high-performing pages in Google and other search engines. Then, if you start to notice something unusual, it’s easier to dig in and figure out what’s going on.

If a bad bot does get through, you can usually track its progress. You can see how it moved from page to page by the spikes at different times. If you can’t stop it in its tracks for whatever reason, you can learn from its behavior and identify vulnerabilities in your website. That can help you block similar bots in the future.
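
If you want to go a step beyond your analytics dashboard, even a short script run against your raw access logs can surface these bursts. The sketch below assumes a standard Apache/Nginx-style log format and a placeholder file name; adjust both to match your own setup:

    from collections import Counter
    import re

    # Pull the date and hour out of timestamps like [10/Oct/2023:03:14:07 +0000]
    TIMESTAMP = re.compile(r"\[(\d{2}/\w{3}/\d{4}:\d{2})")

    hits_per_hour = Counter()
    with open("access.log") as log:  # placeholder path to your server's access log
        for line in log:
            match = TIMESTAMP.search(line)
            if match:
                hits_per_hour[match.group(1)] += 1

    # Flag any hour with more than five times the median traffic -- a rough sign of a bot burst.
    counts = sorted(hits_per_hour.values())
    median = counts[len(counts) // 2] if counts else 0
    for hour, hits in sorted(hits_per_hour.items()):
        if median and hits > 5 * median:
            print(f"Possible bot spike: {hits} requests during {hour}:00")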


Blacklist or limit IP addresses

An extreme way to limit or stop bot traffic is to completely block or throttle access from places where bots are common. For traffic sources known to be malicious, or IP addresses that have attacked you before, you can block them from coming back. The people who create these malicious bots are agile and can maneuver around these kinds of blocks, but it’s a start. Rate limiting can also help you curb malicious bots coming from a single IP address, which can slow an attack when a lot of requests arrive all at once.
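
Most hosts, CDNs, and firewalls let you do both of these without writing any code, but as an illustration of the idea, here is a minimal sketch using Python and Flask, with placeholder addresses and limits:

    import time
    from collections import defaultdict
    from flask import Flask, request, abort

    app = Flask(__name__)

    BLOCKED_IPS = {"203.0.113.45"}       # placeholder: addresses you've seen attacking before
    MAX_REQUESTS_PER_MINUTE = 60         # tune this to what's normal for your site

    recent_requests = defaultdict(list)  # in-memory only; real setups handle this at the server or CDN level

    @app.before_request
    def block_and_rate_limit():
        ip = request.remote_addr
        if ip in BLOCKED_IPS:
            abort(403)  # blocked outright
        now = time.time()
        window = [t for t in recent_requests[ip] if now - t < 60]
        window.append(now)
        recent_requests[ip] = window
        if len(window) > MAX_REQUESTS_PER_MINUTE:
            abort(429)  # too many requests from this address in the last minute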


Keep your plugins updated

Most websites run dozens of plugins to help customize what you want on your site. They can add functionality or features your website theme doesn’t include by default. Outside developers typically create these plugins and update them over time, and those updates often improve security as well. Hackers are smart. They look for vulnerabilities, especially in popular apps and plugins, so developers create patches and fixes and release them as updates. Stay on top of these updates to keep your site secure, so that if a bot has figured out a workaround, your site won’t be affected.



Kristine is the Director of Marketing at Boostability. She brings a decade’s worth of communications strategy work to the company. Kristine has a master’s degree in Leadership and Communications from Gonzaga University and an undergraduate degree in Broadcast Journalism from BYU. She’s worked in television news, public relations, communications strategy, and marketing for over 10 years. In addition to being a part of the marketing team, Kristine enjoys traveling, sports, and all things nerdy.