Google Panda 4.0: Is Blocking JavaScript And CSS Hurting Rankings?


Since Google Inc (NASDAQ:GOOGL) (NASDAQ:GOOG) released its latest Panda update last month, the rankings and page views of many websites have suffered. Reports suggest that eBay Inc (NASDAQ:EBAY)’s organic search visibility plunged by 80% after the algorithm update, while PRWeb’s SEO visibility dropped by 71%. The search engine giant frequently updates its algorithm to keep returning high-quality results for users’ search queries. The Panda 4.0 update was aimed at weeding out low-quality content.

Google introduced a new feature in Webmaster Tools

Google Inc (NASDAQ:GOOGL) (NASDAQ:GOOG) has once again punished self-serving blog posts and product pages with too many links and little original content. At the same time, the Mountain View-based company’s latest tweaks put more emphasis on authoritative websites. Interestingly, Yoast creator Joost de Valk has found that many sites that were hit hard by Google’s Panda 4.0 update returned to their normal rankings within a week of unblocking their CSS and JavaScript.

Joost noticed that Google Inc (NASDAQ:GOOGL) (NASDAQ:GOOG) introduced a new option in the Fetch as Google feature of its Webmaster Tools: fetch and render. This could hardly be a coincidence, as the option arrived just a week after the search engine giant rolled out its Panda 4.0 update. If Googlebot cannot fetch a page’s CSS and JavaScript, it cannot render the page, and it therefore cannot determine where the ads on that page sit.
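For illustration, this kind of blocking typically happens in robots.txt. The paths below are hypothetical examples of the sort of rules that keep Googlebot away from a site’s stylesheets and scripts, not directives taken from any particular site:

    # Hypothetical robots.txt rules that hide layout resources from crawlers
    User-agent: *
    # Directory names are examples only; many WordPress sites blocked
    # /wp-includes/ and theme folders, which hold the site's JS and CSS
    Disallow: /wp-includes/
    Disallow: /wp-content/themes/
    Disallow: /css/
    Disallow: /js/

With rules like these in place, Google can still index the raw HTML, but it cannot reconstruct the rendered page, including the placement of its ads.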

Google recommends not blocking CSS and JavaScript

The company uses page layout algorithms to determine how many ads a page carries, especially the number of ads above the fold. If there are too many, the website’s ranking suffers. Previously, many website owners blocked CSS to sidestep this issue instead of solving the real problem. So the timing of the Panda 4.0 update and the introduction of fetch and render was no coincidence, says Joost.

Google Inc (NASDAQ:GOOGL) (NASDAQ:GOOG) has long recommended against blocking CSS and JavaScript. The company emphasized this at SMX Advanced a few weeks ago. Google advises webmasters to make sure that Googlebot can access all embedded resources that contribute to a site’s layout and visible content.
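One common fix, sketched below on the assumption that the blocking rules live in robots.txt, is to explicitly allow Googlebot to fetch stylesheets and scripts. Google’s robots.txt handling supports the Allow directive along with the * wildcard and the $ end-of-URL anchor:

    # Let Googlebot fetch all CSS and JavaScript files, wherever they live
    User-agent: Googlebot
    Allow: /*.css$
    Allow: /*.js$

After updating robots.txt, the fetch and render option in Webmaster Tools shows whether Googlebot now sees the page the way a visitor does.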

UPDATE (9:19 AM EDT, June 23): Besides Joost’s, hardly any other site has reported a similar recovery in rankings. While experts continue to discuss the Panda 4.0 issues, Google Inc employee John Mueller responded to a webmaster’s query on Google Webmaster Help. Here is his response:

“Allowing crawling of JavaScript and CSS makes it a lot easier for us to recognize your site’s content and to give your site the credit that it deserves for that content. For example, if you’re pulling in content via AJAX/JSON feeds, that would be invisible to us if you disallowed crawling of your JavaScript. Similarly, if you’re using CSS to handle a responsive design that works fantastically on smartphones, we wouldn’t be able to recognize that if the CSS were disallowed from crawling. This is why we make the recommendation to allow crawling of URLs that significantly affect the layout or content of a page. I’m not sure which JavaScript snippet you’re referring to, but it sounds like it’s not the kind that would be visible at all. If you’re seeing issues, they would be unrelated to that piece of JavaScript being blocked from crawling.

Cheers John”
