Here are a couple of things you need to know.
The message in Search Console is new, but Google’s initiative isn’t.
Google wants access to crawl EVERYTHING. The message is for those sites where Google can’t crawl everything efficiently or accurately.
The message says that this issue “can result in suboptimal rankings”.
As a ranking factor, this issue appears to carry low weight. And because the factor itself is not new, you shouldn't see a drastic decrease in your rankings.
If you've received the warning, fixing the problem is fairly easy.
If you manage your website account and are comfortable making edits to your robots.txt file, go ahead with the steps below. If you're not comfortable making changes to your website, contact your site's webmaster for an update. For clients of Boostability, your Account Manager can address any concerns you have or make any changes needed.
Steps from Search Engine Land on how to fix your robots.txt file:
Look through the robots.txt file for Disallow rules that block CSS, JavaScript, or other resource files.
If you see any such lines, remove them. They are what's blocking Googlebot from crawling the files it needs to render your site the way users see it.
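As an illustration, directives like the following are typical culprits; these are hedged examples, not necessarily the exact rules in your file:

```text
User-agent: *
Disallow: /css/
Disallow: /js/
Disallow: /*.css$
Disallow: /*.js$
```

Note that the `*` and `$` wildcard patterns are a Googlebot extension to basic robots.txt syntax. If you'd rather not remove a broader rule, one commonly suggested alternative is adding explicit `Allow` rules for these file types under a `User-agent: Googlebot` group.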
After these steps are completed, run your site through Google's Fetch and Render tool to confirm the problem is fixed. If you are still experiencing problems, the tool will provide further instructions on the changes that need to be made.
In addition to Google's Fetch and Render tool, you can use the robots.txt Tester in Search Console to identify any remaining issues blocking crawlers from your website.
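If you want to sanity-check a robots.txt file offline, Python's standard `urllib.robotparser` module can simulate whether a given user agent may fetch a URL. This is a minimal sketch; the rules and URLs below are hypothetical examples, not pulled from any real site:

```python
# Sketch: test robots.txt rules offline with Python's standard library.
from urllib.robotparser import RobotFileParser

# Hypothetical rules that block crawlers from CSS and JavaScript directories.
robots_txt = """\
User-agent: *
Disallow: /css/
Disallow: /js/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot is blocked from the stylesheet and script, but not the page itself.
print(parser.can_fetch("Googlebot", "https://example.com/css/main.css"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/js/app.js"))     # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))    # True
```

One caveat: `robotparser` treats `Disallow` paths as simple prefixes, so it will not evaluate Googlebot's wildcard extensions (such as `Disallow: /*.css$`) the way Google does. For wildcard rules, rely on the robots.txt Tester in Search Console.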