
Back in 2012, Google warned websites against blocking Googlebot's access to their CSS and JavaScript files. Here is Matt Cutts telling webmasters not to block CSS from Googlebot:

Although most webmasters ignored it then, Google's recent email warnings sent through Search Console could open a new discussion. This time Google is serious and is making sure the message is loud and clear, using notifications both by email and within Search Console: blocking JS and CSS files prevents Googlebot from crawling your website's CSS and JavaScript, which it needs in order to render your pages and serve better results to users, especially now that the user-friendliness of a website is considered a ranking factor.

Warnings about suboptimal rankings are nothing new, but these Search Console notifications are important to take seriously. If you don't want to lose your rankings, you have to let Googlebot access your CSS and JavaScript files.

Here is a picture of the notification as received:

Google Search Console Warnings Issued For Blocking JavaScript CSS

If you received this notification, follow the instructions in the email to diagnose the issue, and/or use the Fetch and Render tool within Search Console to see what Google sees and which resources are blocked by your robots.txt file.

As far as I can tell, the warnings went out to webmasters this morning, and the JS and CSS notification is believed to have been sent to a huge number of sites. I received the same message for 10 of my client websites where JS and CSS files were blocked.

But do not panic: there is not a lot to do on your website to fix it. Just update the robots.txt file on your site, and everything will be fine as soon as the modified rules allow Googlebot to crawl those files.

That said, you should act quickly, because the longer Googlebot is blocked, the longer your rankings can suffer. To unblock your website's JavaScript and CSS assets, update your robots.txt file with the code below:

User-Agent: Googlebot
Allow: .js
Allow: .css

Once you have this in place, you can test your setup with the Blocked Resources report in Search Console.
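If you want a quick local sanity check alongside the Search Console tools, a short Python script using the standard library's robots.txt parser can report whether a given URL looks blocked for a user agent. This is only a sketch: the domain and asset URLs below are placeholders, and Python's parser does not fully reproduce Google's wildcard and longest-match rules, so treat Search Console's robots.txt Tester and Blocked Resources report as the authoritative answer.

from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"  # placeholder domain; use your own

# Download and parse the live robots.txt file
parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()

# Placeholder asset URLs; substitute real CSS/JS paths from your own pages
assets = [
    SITE + "/wp-includes/js/jquery/jquery.js",
    SITE + "/wp-content/themes/mytheme/style.css",
]

for url in assets:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(verdict, url)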

John Mueller of Google clarified this in a comment on his own Google+ post, saying: “We’re looking for local, embedded, blocked JS & CSS. So it would be for URLs that you can ‘allow’ in your robots.txt.”

CSS and JavaScript Files Will Improve Interaction with Googlebot

The reason Google is so determined to crawl your whole website is to see it the way an average user does. When you block access to CSS and JS, Googlebot cannot see your website's layout and may never know whether it is user-friendly. As usual, Google wants to improve the quality of the content and services it provides to its users, and it wants to render and index websites that are user-friendly and have quality content. Blocking your CSS and JavaScript files hurts your website's chances of better rendering and indexing.

Identify and Fix Indexing Issues on Your Website Through the Search Console

Search Console robots.txt Tester
Googlebot is the web crawler Google uses to access a website's content. Many webmasters think crawling CSS and JavaScript is a waste of resources and block those files, and many CMSs even block their include directories by default. Letting Google render and index your website shows you how Googlebot sees your pages, which is very close to how your visitors see them, and that insight helps you improve your content's accessibility. But this can only happen when you allow Googlebot to fully access and understand your website.

Googlebot Crawling Doesn't Bog Down Your Website; It Actually Helps It

Many webmasters have doubts about Googlebot's ability to process their CSS and JavaScript files and fear that crawling them will increase bandwidth consumption and bog down their website. But Google has become much better at processing these files, and the extra crawling won't make your website sluggish.

Another common worry is that, because Google is not adept at processing CSS and JavaScript, it might misinterpret that content as something malicious and block it. Here too, Google has improved, and the fear of being penalized because of your CSS and JavaScript content is unfounded.

How It Can Impact Your Website's Ranking

If you think Google is being heavy-handed, you may need to change that perception. When Googlebot cannot access your website's resources, it cannot fully render or index your pages, and your website will eventually see suboptimal rankings. To rank better, Googlebot needs full access to your content so it can judge its quality and determine whether it is user-friendly.

Google's warning about blocked CSS and JavaScript files might open new doors for improvement. Instead of seeing it as a threat, webmasters should allow full access for better user interaction and better rankings. You can find a related forum discussion on Stack Overflow, and don't forget to leave a comment here.
