Don’t block CSS and JS files

In 2015, Google Search Console began actively warning webmasters not to block CSS and JS files. Back in 2014, we told you the same thing: don’t block CSS and JS files. We feel the need to repeat this message now. In this post, we’ll explain why you shouldn’t block these specific files from Googlebot.

Why you shouldn’t block CSS and JS files

You shouldn’t block CSS and JS files because doing so prevents Google from checking whether your website works properly. If you block CSS and JS files in your robots.txt file, Google can’t render your website as intended. As a result, Google won’t fully understand your website, which might even result in lower rankings. What’s more, even tools like Ahrefs have started rendering web pages and executing JavaScript. So don’t block JavaScript if you want your favorite SEO tools to work.
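To make this concrete, here’s a hypothetical robots.txt of the kind that causes the problem. The directory names are just examples of where a WordPress site typically keeps its stylesheets and scripts:

```
User-agent: *
# These rules keep Googlebot out of the directories that
# usually hold a site's CSS and JS files:
Disallow: /wp-includes/
Disallow: /wp-content/themes/
```

With rules like these in place, Googlebot can still fetch your HTML, but not the files it needs to actually render the page.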

I think this aligns perfectly with the general assumption that Google has gotten more and more ‘human’. Google simply wants to see your website like a human visitor would, so it can distinguish the main elements from the ‘extras’. Google wants to know if JavaScript is enhancing the user experience or ruining it.

Test and fix

Google guides webmasters in this, for instance in the blocked resources check in Google Search Console:

[Screenshot: Google Search Console’s Blocked Resources report]

Besides that, Google Search Console allows you to test any file against your robots.txt settings at Crawl > Robots.txt tester:

[Screenshot: Google Search Console’s robots.txt tester]

The tester will tell you which files are and aren’t allowed according to your robots.txt file. You can read more about these crawl tools in Google Search Console here.
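If you’d rather check this from the command line, Python’s standard library ships a robots.txt parser that answers the same question the tester does. The robots.txt content and URLs below are hypothetical examples:

```python
# A quick stand-in for Search Console's robots.txt tester, using
# Python's standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A stylesheet inside a disallowed directory is off-limits to Googlebot:
print(parser.can_fetch("Googlebot", "https://example.com/wp-includes/css/style.css"))  # False

# A regular page is still crawlable:
print(parser.can_fetch("Googlebot", "https://example.com/blog/post/"))  # True
```

If `can_fetch` returns False for a CSS or JS URL, that file is one of your blocked resources.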

Unblocking these blocked resources basically comes down to changing your robots.txt file. You need to set up that file so it no longer disallows Google from accessing your site’s CSS and JS files. If you’re on WordPress and use Yoast SEO, this can be done directly in our Yoast SEO plugin.
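If you can’t simply delete the Disallow rules, a more targeted fix is to explicitly allow the asset directories, since Google honors Allow directives and applies the most specific matching rule. The paths here are hypothetical examples:

```
User-agent: Googlebot
# Allow takes precedence over the broader Disallow below,
# so the asset files stay reachable:
Allow: /wp-includes/css/
Allow: /wp-includes/js/
Disallow: /wp-includes/
```

The simplest option, though, is usually to remove the Disallow rule altogether.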

WordPress and blocking CSS and JS files in robots.txt

To be honest, we don’t think you should block anything in your robots.txt file unless it’s for a very specific reason. That means you have to know what you’re doing. In WordPress, you can go without blocking anything in most cases. We frequently see /wp-admin/ disallowed in robots.txt files, but in most cases this will also prevent Google from reaching some files. There is no need to disallow that directory, as Joost explained in this post.
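In practice, that means a WordPress robots.txt can be as minimal as this (the sitemap URL is a placeholder):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap_index.xml
```

An empty Disallow line allows everything, so nothing on the site is blocked from crawling.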

We’ll say it again

We’ve said it before and we’ll say it again: don’t block Googlebot from accessing your CSS and JS files. These files allow Google to properly render your website and get an idea of what it looks like. If Google doesn’t know what your site looks like, it won’t trust it, which won’t help your rankings.

Read more: robots.txt: the ultimate guide »


4 Responses to Don’t block CSS and JS files

  1. Aditya Singh  • 6 years ago

    Well, I have read many technical blogs that show ways to block JS and CSS, but none of them mention that it can harm our blogs.

  2. Jason Belec  • 6 years ago

    Google doesn’t render your website. Your browser does. And people like me don’t care if Google can’t force us and those around us to work in a manner that primarily allows Google to dominate. This article is verbatim, the old Microsoft nonsense to attempt total control. We all know how that goes.

    • Nicholas M  • 6 years ago

      How else do you think big G knows if your site is responsive?

    • garth  • 6 years ago

      Actually Jason, as the article points out, Google does render your website.