Don’t block CSS and JS files

In 2015, Google Search Console started actively warning webmasters not to block CSS and JS files. Back in 2014, we told you the same thing: don’t block CSS and JS files. We feel the need to repeat this message now. In this post, we’ll explain why you shouldn’t block these specific files from Googlebot.
Why you shouldn’t block CSS and JS files
You shouldn’t block CSS and JS files because doing so prevents Google from checking whether your website works properly. If you block CSS and JS files in your robots.txt file, Google can’t render your website as intended. As a result, Google won’t fully understand your website, which might even lead to lower rankings. What’s more, even tools like Ahrefs have started rendering web pages and executing JavaScript. So, don’t block JavaScript if you want your favorite SEO tools to work.
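To illustrate, here’s a hypothetical robots.txt that would cause exactly this problem. The directory names are made up for the example; the point is that blanket Disallow rules like these keep Googlebot away from the very files it needs to render your pages:

    User-agent: *
    # These two rules hide your styling and scripts from every crawler,
    # so Google sees an unstyled, possibly broken version of your site.
    Disallow: /assets/css/
    Disallow: /assets/js/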
I think this aligns perfectly with the general assumption that Google has gotten more and more ‘human’. Google simply wants to see your website like a human visitor would, so it can distinguish the main elements from the ‘extras’. Google wants to know if JavaScript is enhancing the user experience or ruining it.
Test and fix
Google guides webmasters in this, for instance with the blocked resources check in Google Search Console.

Besides that, Google Search Console allows you to test any file against your robots.txt settings at Crawl > Robots.txt tester.

The tester will tell you which files are and aren’t allowed according to your robots.txt file. You can read more about these crawl tools in Google Search Console here.
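If you’d rather do a quick check from your own machine, here’s a minimal sketch using only Python’s standard library. The domain and asset URLs are placeholders for your own site, and note that Python’s parser follows the original robots.txt specification and doesn’t implement all of Google’s wildcard extensions, so treat it as a sanity check that complements, rather than replaces, Google’s own tester:

    # Quick local robots.txt check using only the Python standard library.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")  # placeholder domain
    rp.read()  # fetches and parses the live robots.txt file

    # Placeholder asset URLs; swap in real CSS/JS URLs from your site.
    for url in [
        "https://example.com/assets/css/style.css",
        "https://example.com/assets/js/app.js",
    ]:
        allowed = rp.can_fetch("Googlebot", url)
        print(url, "->", "allowed" if allowed else "BLOCKED")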
Unblocking these blocked resources basically comes down to changing your robots.txt file. You need to set that file up in such a way that it no longer disallows Google from accessing your site’s CSS and JS files. If you’re on WordPress and use Yoast SEO, this can be done directly in our Yoast SEO plugin.
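If you can’t simply remove the offending Disallow rules, one common pattern is to explicitly allow stylesheets and scripts for Googlebot, which supports the * wildcard and the $ end-of-URL anchor. This is a sketch, not a drop-in fix; whether you need it depends on what your existing rules block:

    User-agent: Googlebot
    # Explicitly allow all CSS and JS files, overriding broader Disallows.
    Allow: /*.css$
    Allow: /*.js$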
WordPress and blocking CSS and JS files in robots.txt
To be honest, we don’t think you should block anything in your robots.txt file unless you have a very specific reason to do so. That means you have to know what you’re doing. In WordPress, you can go without blocking anything in most cases. We frequently see /wp-admin/ disallowed in robots.txt files, but in most cases this will also prevent Google from reaching some files it needs. There is no need to disallow that directory, as Joost explained in this post.
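For reference, the virtual robots.txt that WordPress itself generates by default looks like this. Note the Allow line: if you copy the Disallow rule into your own file without it, you can break front-end features that depend on admin-ajax.php:

    User-agent: *
    Disallow: /wp-admin/
    # WordPress adds this exception so AJAX requests from the front end
    # (which go through admin-ajax.php) keep working for crawlers too.
    Allow: /wp-admin/admin-ajax.php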
We’ll say it again
We’ve said it before and we’ll say it again: don’t block Googlebot from accessing your CSS and JS files. These files allow Google to render your website properly and get an idea of what it looks like. If Google doesn’t know what your site looks like, it can’t fully trust it, and that won’t help your rankings.
Read more: robots.txt: the ultimate guide »
Well, I have read many technical blogs that show ways to block JS and CSS, but none of them mentioned that it can harm our blogs.
Google doesn’t render your website. Your browser does. And people like me don’t care if Google can’t force us and those around us to work in a manner that primarily allows Google to dominate. This article is, verbatim, the old Microsoft nonsense attempting total control. We all know how that goes.
How else do you think big G knows if your site is responsive?
Actually, Jason, as the article points out, Google does render your website.