How To Respond To Google’s Latest Warning – “Googlebot Cannot Access Your Javascript and CSS Files”

You may have noticed an increase in messages from Google Search Console stating that Googlebot cannot access the CSS and JavaScript files for a given URL. The message is part of Google’s recent push for greater transparency around search ranking factors, and it has triggered warnings across many sites’ Search Console accounts. While the warning message itself is new, the underlying requirement was added to Google’s Webmaster Guidelines last October.
Below is an example of what the message looks like:
Google Search Console Warning

Here are a couple of things you need to know.

The message in Search Console is new, but Google’s initiative isn’t.

Google wants access to crawl EVERYTHING. The message is for those sites where Google can’t crawl everything efficiently or accurately.

The message says that this issue “can result in suboptimal rankings”.

This issue appears to carry low weight as a search ranking factor. Since the ranking factor itself is not new, you should not see a drastic drop in your rankings.

If the warning is received, fixing the problem is fairly easy.

If you manage your website and are comfortable making edits to your robots.txt file, go ahead with the steps below. If you’re not comfortable making changes to your website, contact your site’s webmaster for an update. For clients of Boostability, your Account Manager can address any concerns you have or changes needed.

Steps from Search Engine Land on how to fix your robots.txt file:

Look through the robots.txt file for any of the following lines of code:

Disallow: /*.js$

Disallow: /*.inc$

Disallow: /*.css$

Disallow: /*.php$

If you see any of those lines, remove them; they are what’s blocking Googlebot from crawling the files it needs to render your site the way users see it.
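Rather than only removing Disallow lines, you can also grant access explicitly, since Google’s robots.txt implementation supports Allow directives along with the * and $ pattern extensions. A minimal sketch of what that could look like (the patterns here are illustrative, not a prescribed configuration):

```
User-agent: Googlebot
Allow: /*.css$
Allow: /*.js$
```

Because more specific rules win under Google’s matching, Allow lines like these can keep scripts and stylesheets crawlable even when a broader Disallow covers their directory.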

After these steps are completed, you’ll want to run your site through Google’s Fetch and Render tool to confirm the problem is fixed. If you are still experiencing problems, the Fetch and Render results will point to the remaining changes that need to be made.

In addition to Fetch and Render, you can use the robots.txt Tester in Search Console to identify any remaining issues in crawling your website.
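If you want a quick local sanity check before reaching for Google’s tools, Python’s standard-library robots.txt parser can tell you whether a given user agent may fetch a URL. One caveat: urllib.robotparser implements the original robots.txt rules (simple path-prefix matching), not Google’s * and $ wildcard extensions, so this sketch uses plain directory prefixes and an illustrative example.com domain:

```python
# Quick local check that Googlebot can fetch your JS/CSS URLs,
# using Python's standard-library robots.txt parser.
# Note: urllib.robotparser matches simple path prefixes only;
# it does not implement Google's * and $ wildcard extensions.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /scripts/
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A JS file under the blocked /scripts/ prefix is not crawlable...
print(rp.can_fetch("Googlebot", "https://example.com/scripts/app.js"))  # False
# ...while a stylesheet outside any Disallow prefix is.
print(rp.can_fetch("Googlebot", "https://example.com/css/style.css"))   # True
```

If the parser reports False for a script or stylesheet your pages need, that is exactly the situation the Search Console warning is flagging.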


Andrew Eagar
[email protected]

The Director of SEO Strategies for Boostability, Andrew is an internet marketing enthusiast with several years of SEO experience. Helping small businesses find success online is a passion of his, as is spending time with his family and watching Parks & Recreation reruns.