Crawling Ajax has always been a tricky act for Google to perform. The difficulty stems from the very nature of the method: JavaScript. Because Ajax uses JavaScript to talk to the web server and render the page as the user interacts with it, Googlebot cannot easily access all the HTML content.
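To see why, consider a bare-bones Ajax page. This is a minimal sketch, with a made-up /api/products endpoint and markup: the HTML the server sends is practically empty, and the content a visitor reads only exists after a script runs in the browser.

```ts
// Minimal sketch of a client-rendered Ajax page; the /api/products
// endpoint and the markup are hypothetical. The HTML the server sends
// contains only an empty <ul id="app"></ul>; everything a visitor
// actually reads arrives later, through this script.
async function renderProducts(): Promise<void> {
  // Fetch the data after the (mostly empty) page has loaded.
  const response = await fetch("/api/products");
  const products: { name: string; price: number }[] = await response.json();

  const app = document.getElementById("app");
  if (!app) return;

  // A crawler that does not execute JavaScript never sees this markup.
  app.innerHTML = products
    .map((p) => `<li>${p.name}: $${p.price}</li>`)
    .join("");
}

renderProducts();
```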

In 2009, the search giant developed a workaround: sites could serve a pre-rendered version of each page, containing the complete HTML after all the JavaScript had executed, for parsing and indexing. Historically, not everybody had success with it SEO-wise, but the practice has been the most feasible way to make Ajax pages crawlable.
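For the curious, the scheme worked roughly as follows: a crawler that saw a “hashbang” URL such as example.com/#!page=about would instead request example.com/?_escaped_fragment_=page=about, and the server answered that request with a pre-rendered HTML snapshot. Here is a simplified server-side sketch; the Express setup and the renderSnapshot helper are hypothetical stand-ins.

```ts
// Sketch of the server side of Google's 2009 Ajax crawling scheme
// (since deprecated). The Express wiring and renderSnapshot helper
// are hypothetical.
import express from "express";

const app = express();

app.get("/", (req, res) => {
  const fragment = req.query._escaped_fragment_;
  if (typeof fragment === "string") {
    // Crawler request: serve the fully rendered HTML snapshot.
    res.send(renderSnapshot(fragment));
  } else {
    // Normal visitor: serve the JavaScript-driven Ajax application.
    res.sendFile("index.html", { root: "public" });
  }
});

// Hypothetical helper. In practice the snapshot was often produced by
// a headless browser (e.g. PhantomJS) executing the page's JavaScript
// ahead of time.
function renderSnapshot(fragment: string): string {
  return `<html><body><h1>Pre-rendered content for ${fragment}</h1></body></html>`;
}

app.listen(3000);
```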

But then, on October 14, 2015, Google told the world to forget about the scheme because they are now “generally able to render and understand” web pages, as long as their spider isn’t blocked from crawling CSS or JavaScript files. Google’s Ajax crawling scheme was deprecated the next day.
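If you take Google at its word, the one concrete requirement in that announcement is that Googlebot can fetch your scripts and stylesheets. As an illustration (the /js/ and /css/ paths are made up), a robots.txt like this is exactly what voids the “generally able to render” promise:

```
# Illustrative robots.txt only; the /js/ and /css/ paths are made up.
# Disallow rules like these hide the very files Googlebot needs in
# order to render an Ajax page, so remove them if you want Google's
# "generally able to render and understand" claim to apply to you.
User-agent: Googlebot
Disallow: /js/
Disallow: /css/
```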

Does this mean the technology giant will now crawl, parse, and index such dynamic websites with no trouble? Is this a cue to take advantage of the wonders of Ajax without harming your SEO? Maybe, but not so fast.

Reading Between the Lines

Any self-respecting provider of SEO services wouldn’t treat Google’s announcement as a go-signal to start using Ajax for online marketing purposes right away. First, the statement was vague and obviously worded carefully; it included the phrase “generally able,” which doesn’t inspire absolute confidence in webmasters and site owners, given what’s at stake from an SEO point of view.

Second, Google recommends the principles of progressive enhancement as the replacement guideline. If the leading search engine now deems its own Ajax crawling proposal useless, why endorse a method built around serving plain HTML to non-JavaScript browsers?
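To be clear about what that replacement guideline actually asks of you, here’s a minimal sketch of progressive enhancement; the element ids and the /products URLs are hypothetical. The server ships real, working HTML first, and JavaScript merely upgrades the experience.

```ts
// Sketch of progressive enhancement; element ids and /products URLs
// are hypothetical. The server first ships working HTML, e.g.:
//
//   <ul id="product-list">(server-rendered items)</ul>
//   <a id="load-more" href="/products?page=2">More products</a>
//
// Without JavaScript, the link is an ordinary navigation to a
// server-rendered page two, so crawlers and old browsers still get
// everything. This script then upgrades the link into an Ajax call.
const link = document.getElementById("load-more") as HTMLAnchorElement | null;

if (link) {
  link.addEventListener("click", async (event) => {
    event.preventDefault(); // take over only when JavaScript actually runs

    // Assumes the server can answer the same URL with an HTML fragment.
    const response = await fetch(link.href);
    const fragment = await response.text();

    document
      .getElementById("product-list")
      ?.insertAdjacentHTML("beforeend", fragment);
  });
}
```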

The announcement certainly sends mixed signals if you put it under a microscope.

How Successful Is Successful?

Back in 2008, news broke that Google was having success crawling JavaScript. In the seven-plus years since, the company has made huge strides in honing its mastery of the language. But how good are they really now?

There’s proof that Google has been getting better at understanding complex forms of JavaScript, but recent history shows that such improvement didn’t always pay dividends for site owners. Some Ajax users suffered a bitter fate after assuming the search engine could crawl their sites trouble-free; it cost them the majority of their organic traffic, which took a long while to recover.

Too Big a Risk (For Now)

Snubbing the time-tested guidelines Google proposed (and recently deprecated) might turn your site into a guinea pig. Sure, Google gets smarter every day, but that doesn’t mean it has no more trouble with Ajax. Even John Mueller confirmed late last October that they’re still having issues with it.

Investing in search engine optimization in Denver, San Diego, Chicago, or any other major market is already a risky business, and trusting Google on this matter right now makes it riskier still.

Unless more evidence surfaces guaranteeing the search engine’s mastery of Ajax crawling, and more industry authorities express their validation, it’s best to play it safe for now.

The real question isn’t whether Google can crawl Ajax sites; it’s whether Google can crawl pure Ajax pages completely and consistently. Together, we’ll wait for a more convincing answer.

Bookmark our blog page now, and we’ll keep you posted on the latest SEO news.
