Historical Best Practice
First, some background on how websites tell Google how to crawl their content so that both the site and Google get the best result.
What Does this Mean?
Google is now asking webmasters to allow Googlebot access to JavaScript and CSS files so that it can fully render pages. Site owners should do the following:
- Update their robots.txt files to allow access
- Verify all sites in Google Webmaster Tools
Beyond that, most sites will be good to go.
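As a sketch, allowing access usually means making sure robots.txt does not disallow script and stylesheet paths, and optionally allowing them explicitly. The directory names below are illustrative, not from the original article:

```
# Hypothetical robots.txt: let Googlebot fetch JS and CSS
# so it can render pages the way a browser would.
User-agent: Googlebot
Allow: /*.js
Allow: /*.css
```

If an existing `Disallow` rule (for example on an `/assets/` directory) covers these files, a more specific `Allow` rule like the ones above takes precedence for Googlebot.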
One caveat is that servers need to be configured so that crawler requests for JS and CSS files do not trigger firewall rules that block or throttle access.
Another point to remember is that Google has been focusing heavily on mobile access, and it strongly prefers responsive designs because they let it index the desktop and mobile versions of a site from the same content.
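A minimal responsive setup, as a rough illustration (the class names and breakpoint here are invented for the example): the same HTML is served to every device, and CSS media queries adapt the layout, so Google only has one set of content to index.

```html
<!-- Same markup for desktop and mobile; layout adapts via CSS. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .content { width: 960px; margin: 0 auto; }  /* desktop layout */
  @media (max-width: 600px) {
    .content { width: auto; padding: 0 1em; } /* single column on phones */
  }
</style>
```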