In 2015, Google Search Console started actively warning webmasters not to block CSS and JS files. In 2014, we told you the same thing: don’t block CSS and JS files. We feel the need to repeat that message now. In this post, we’ll explain why you shouldn’t block these specific files from Googlebot.
Why you shouldn’t block CSS and JS files
You shouldn’t block CSS and JS files, because that way you’re preventing Google from checking whether your website works properly. If you block CSS and JS files in your robots.txt file, Google can’t render your pages the way a visitor would see them.
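For example, a robots.txt along these lines (the rules are hypothetical, but it’s a pattern we regularly see on WordPress sites) keeps Googlebot away from exactly the directories that hold a theme’s CSS and JS:

    # Hypothetical rules that lock Googlebot out of CSS and JS directories
    User-agent: *
    Disallow: /wp-includes/
    Disallow: /wp-content/themes/

With rules like these, Googlebot can still fetch your HTML, but not the stylesheets and scripts that HTML needs to render.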
Test and fix
Google guides webmasters in this, for instance in the blocked resources check in Google Search Console:
Besides that, Google Search Console allows you to test any file against your robots.txt settings at Crawl > Robots.txt tester:
The tester will tell you which files are and aren’t allowed according to your robots.txt file. More on these crawl tools in Google Search Console here.
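If you want to sanity-check URLs outside of Search Console too, you can do a quick local check with Python’s standard-library robots.txt parser. This is a minimal sketch, not Google’s parser: example.com and the paths are placeholders, and urllib.robotparser only implements the original robots.txt standard, so it ignores Google’s wildcard extensions:

    # Test URLs against robots.txt rules locally, in the spirit of the
    # Search Console tester. Standard library only.
    from urllib.robotparser import RobotFileParser

    rules = [
        "User-agent: *",
        "Disallow: /wp-admin/",
        "Disallow: /wp-includes/",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    # Placeholder URLs; swap in your own pages and asset files.
    for url in (
        "https://example.com/wp-includes/js/jquery/jquery.js",
        "https://example.com/hello-world/",
    ):
        verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
        print(verdict, url)

Run against the rules above, the jQuery file comes back BLOCKED while the post itself is allowed, which is exactly the kind of mismatch that breaks rendering.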
Unblocking these blocked resources basically comes down to changing your robots.txt file. You need to set that file up in such a way that it no longer disallows Google access to your site’s CSS and JS files. If you’re on WordPress and use Yoast SEO, you can do this directly in our Yoast SEO plugin.
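As a sketch of what that change looks like (reusing the hypothetical directories from earlier): either delete the offending Disallow lines altogether, or add more specific Allow rules for the asset paths. Google documents that for Googlebot, the most specific (longest) matching rule takes precedence, so the Allow lines below override the shorter Disallow:

    # Re-open asset directories under an otherwise blocked path.
    # For Googlebot, the longest matching rule takes precedence.
    User-agent: *
    Disallow: /wp-includes/
    Allow: /wp-includes/js/
    Allow: /wp-includes/css/

Deleting the Disallow lines is the simpler and safer option; the Allow approach is only worth it if you have a specific reason to keep part of a directory blocked.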
WordPress and blocking CSS and JS files in robots.txt
To be honest, we don’t think you should block anything in your robots.txt file unless you have a very specific reason to do so, which means you have to know what you’re doing. In WordPress, you can go without blocking anything in most cases. We frequently see /wp-admin/ disallowed in robots.txt files, but in most cases this also prevents Google from reaching files it needs. There is no need to disallow that directory, as Joost explained in this post.
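In practice, that means a WordPress robots.txt can be as minimal as this sketch; an empty Disallow value blocks nothing:

    # A minimal robots.txt: an empty Disallow value allows everything.
    User-agent: *
    Disallow:

If you do insist on keeping a /wp-admin/ block, keep in mind that themes and plugins often load /wp-admin/admin-ajax.php on the front end, and that’s exactly the kind of file such a rule catches.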
We’ll say it again
We’ve said it before and we’ll say it again: don’t block Googlebot from accessing your CSS and JS files. These files allow Google to render your website properly and get an idea of what it looks like. If Google doesn’t know what your site looks like, it won’t trust it, and that won’t help your rankings.