What To Do When Googlebot Can't Access CSS/JS Files On Your Site

Technical SEO · 8/3/2015

If you received a notice this week from Google Webmaster Tools that Googlebot can’t access CSS/JS (JavaScript) files on your site, rest assured you’re not alone. Many website owners reported receiving this notification, as noted on sites like SERoundtable.com and SearchEngineJournal.com, among others.

Many panicked when receiving the message, perhaps assuming it meant their site had a serious issue to correct or was in trouble with Google. After receiving notifications from Google for my own sites as well as some of our clients, I decided to investigate the issue further.

This Isn’t A New Problem

As noted by seroundtable.com, Google has long recommended that webmasters not block CSS/JavaScript from Googlebot. In fact, Matt Cutts, the head of Google’s web spam team, told webmasters this back in 2012.

Additionally, Google’s webmaster guidelines clearly state: “to help Google fully understand your site's contents, allow all of your site's assets, such as CSS and JavaScript files, to be crawled.” It appears, however, that Google only just started sending out notifications this week to let webmasters know that their sites are currently blocking Googlebot from accessing CSS/JS files.

Why Google Needs To Crawl Your CSS/JavaScript Files

For years, conventional SEO wisdom held that webmasters didn’t need to let Googlebot crawl their CSS/JavaScript files. After all, these files are just site assets, not the content, images, or other SEO-relevant components of the site. As a result, many webmasters added lines to their robots.txt files telling Googlebot not to crawl CSS/JavaScript. For a while, Google wasn’t very vocal about why this was a bad idea, so many assumed it was an acceptable or even “good” practice.

However, as noted above, Google has become increasingly firm about why you need to allow Googlebot to crawl these files. Back in October 2014, Google’s Pierre Far said, "Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings."

Translation: Google needs to “see” these files in order to properly render webpages and accurately rank them. If Google can’t “see” your JavaScript/CSS files, then your site might not rank as well as it should.

My Site Got The Notice—Is My Site In Trouble?

As mentioned above, many webmasters did panic upon receiving this notice from Google. Luckily, this isn’t an issue to panic over: as of right now, Google does not appear to be actively penalizing sites that block CSS/JavaScript files from being crawled.

That’s not to say Google won’t eventually penalize sites that don’t adhere to its guideline to allow crawling of JS/CSS files. Rather, it appears that Google is simply warning webmasters that their sites aren’t compliant and that continued noncompliance may lead to lower rankings. In other words: get your site compliant BEFORE Google starts penalizing you.

How Can I Fix This?

How you resolve this issue will depend on your site’s platform, though generally speaking it will require editing your site’s robots.txt file, which usually lives in the root directory of your website. For instance, sites built on WordPress that receive this warning from Google tend to have their /wp-includes/ folder blocked by robots.txt; in that case, you’ll need to edit the robots.txt file to remove the line blocking /wp-includes/.

When examining your robots.txt file, look for any of the following lines:

Disallow: /*.js$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: /*.php$

Also watch for anything that mentions Disallow: /wp-includes/, Disallow: /wp-admin/, or similar.
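
If you’d rather check programmatically, below is a rough sketch in Python that fetches a robots.txt file and flags rules that commonly block CSS/JS assets. The example.com domain is a placeholder for your own, and the simple pattern match is only a heuristic, not a full robots.txt parser:

# Rough heuristic scan of a robots.txt file for Disallow rules that
# commonly block CSS/JS assets. Not a full robots.txt parser.
import re
from urllib.request import urlopen

ROBOTS_URL = "https://example.com/robots.txt"  # placeholder domain

# Disallow rules mentioning script/style extensions or WordPress system folders
SUSPECT = re.compile(
    r"^\s*Disallow:.*(\.js|\.css|\.inc|\.php|wp-includes|wp-admin)",
    re.IGNORECASE,
)

robots = urlopen(ROBOTS_URL).read().decode("utf-8", errors="replace")
for line in robots.splitlines():
    if SUSPECT.search(line):
        print("Possible asset-blocking rule:", line.strip())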

Once you’ve found any such lines, remove them, save the robots.txt file, and upload the new file to your site.

Another reported solution is to add the following lines to the robots.txt file:

# Googlebot
User-agent: Googlebot
Allow: *.css
Allow: *.js

# global
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

Then save the file and upload it to your site. This works because Googlebot obeys only the most specific user-agent group that matches it, so the Googlebot-specific Allow rules above take precedence over the global Disallow rules when Googlebot crawls your site.

From there, run Google’s “Fetch as Google” tool in Webmaster Tools and choose the Fetch and Render option. The tool should return no errors, which signals that Googlebot can fully access your site. If the tool does return an error, simply follow Google’s instructions for resolving it.
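
For a quick local sanity check, Python’s built-in robots.txt parser can also tell you whether a given asset URL is crawlable. One caveat: urllib.robotparser only understands classic prefix rules like Disallow: /wp-includes/ and ignores Google’s * and $ wildcard extensions, so wildcard-based blocks won’t be evaluated. The example.com URLs below are placeholders:

# Quick local check: can Googlebot fetch these asset URLs under the
# live robots.txt? Note: urllib.robotparser handles plain prefix rules
# (e.g. Disallow: /wp-includes/) but not Google's * and $ wildcard
# extensions, so wildcard rules are effectively ignored here.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")  # placeholder domain
parser.read()  # fetch and parse the live robots.txt

assets = [
    "https://example.com/wp-includes/js/jquery/jquery.js",
    "https://example.com/wp-content/themes/demo/style.css",
]
for url in assets:
    status = "crawlable" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(status, "-", url)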

It's Not Affecting You Yet, But Take Action Now

As with many Google SEO warnings, it's important to resolve the problem quickly to avoid any negative consequences—such as a drop in rankings. Get your site compliant now by editing your robots.txt file as needed. As always, though, if you’re stuck or just have questions, SimpleTiger is here to help.

Sean Smith
COO

Sean is Chief Operating Officer at SimpleTiger, responsible for operations, process creation, team utilization and growth, as well as sometimes direct client consultation.
