Google advises against disallowing Googlebot from crawling your site's CSS or JS files

According to the blog post titled “Deprecating our AJAX crawling scheme” on the Google Webmaster Central Blog, Googlebot is now able to render and understand web pages much like a modern browser. What does that actually mean? Honestly, I have no idea. I do know that it has something to do with page indexing, web crawling and page rank. In other words, Google has made some changes to its search engine that might affect your blog's or website's page rank. How will it affect them? Well, nobody knows for sure. We just have to wait and see.

What I do know is that things Googlebot was unable to do years ago, it can do now. Consider it an update or improvement to Google's web crawler. I suppose it won't affect you much if you focus mainly on producing good-quality content. You might run into trouble, though, if you play around with the settings that determine which areas web crawlers are allowed to access, as sketched below.
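
The “settings” in question usually live in a robots.txt file at the root of your site. Here is a minimal sketch of the kind of rule that can cause trouble now that Googlebot renders pages, along with a safer alternative; the /css/ and /js/ paths are hypothetical placeholders for wherever your theme keeps its stylesheets and scripts, not rules from any real site.

    # Risky: blocking assets keeps Googlebot from rendering pages the way visitors see them
    User-agent: *
    Disallow: /css/
    Disallow: /js/

    # Safer: drop those Disallow lines, or explicitly allow the asset folders
    User-agent: *
    Allow: /css/
    Allow: /js/

If you are not sure whether your robots.txt blocks anything important, simply removing the Disallow lines for stylesheet and script folders is the low-risk option.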

Keep an eye on your page rank just in case, and give your site some tuning if it turns out to be necessary.
