Here is a picture of the notification as received:
If you received this notification, follow the instructions in the email to diagnose the issue, and/or use the Fetch and Render tool within Search Console to see what Google sees and which resources are blocked by your robots.txt file.
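If you want a quick first pass before opening Search Console, the sketch below uses Python's standard-library robots.txt parser to check whether a Googlebot user agent may fetch a given JS or CSS URL. The domain and file paths are placeholders, and note that this parser only does simple prefix matching and does not understand Google's wildcard extensions, so treat it as a rough check; the Search Console tools remain authoritative.

# A rough local check, assuming a hypothetical domain and asset paths.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"        # placeholder: your own domain
ASSETS = [
    SITE + "/assets/app.js",            # placeholder JS file
    SITE + "/assets/style.css",         # placeholder CSS file
]

parser = RobotFileParser(SITE + "/robots.txt")
parser.read()                           # download and parse the live robots.txt

for url in ASSETS:
    verdict = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(verdict, url)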
As far as I can tell, webmasters started receiving this warning this morning. It is believed that this JS and CSS notification was sent out to a huge number of webmasters. I received the same warning for 10 of my client websites where I had blocked their JS & CSS files.
But do not panic; there is not much you need to do on your website to fix it. Just fix the robots.txt file, and everything will be fine as soon as you modify it to allow Googlebot to crawl these files:
User-Agent: Googlebot
Allow: .js
Allow: .css
Once you have made this change, you can test your setup with the Blocked Resources feature in Search Console.
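To know which resources are worth testing in the first place, it helps to list the JS and CSS files a page actually references. The sketch below, assuming a hypothetical page URL on your site, collects script and stylesheet URLs so you can check them in the Fetch and Render tool or the Blocked Resources report.

# A minimal sketch, assuming a hypothetical page URL, that lists the JS and
# CSS resources a page references.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

PAGE = "https://www.example.com/"       # placeholder: a page on your site

class AssetCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.assets = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "script" and attrs.get("src"):
            self.assets.append(urljoin(PAGE, attrs["src"]))
        elif tag == "link" and attrs.get("rel") == "stylesheet" and attrs.get("href"):
            self.assets.append(urljoin(PAGE, attrs["href"]))

html = urlopen(PAGE).read().decode("utf-8", errors="replace")
collector = AssetCollector()
collector.feed(html)

for url in collector.assets:
    print(url)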
John Mueller of Google said in a comment on his own Google Plus post: “We’re looking for local, embedded, blocked JS & CSS. So it would be for URLs that you can “allow” in your robots.txt.”
How It Can Impact Your Website Ranking
If you think Google is being a bully here, you may need to change that perception. When Googlebot cannot access your website's content, it cannot render or index your pages properly, and eventually your website will rank suboptimally. To rank better, Googlebot needs full access to your content so that it can judge its quality and see whether it is user friendly.