The best ways to incorporate the benefits of AJAX without making your site blind to search engines

The guys behind the original article posted quite a good piece showing what not to do (cloaking, DHTML layering) and what to do instead:

AJAX gives developers the ability to build dynamic web applications without the need for constant server side script parsing, enabling you to provide users with simulated “load on demand” sites. That means shorter page loading times, a sharp decrease in bandwidth consumption and more accessible information in general.

Unfortunately, developers are often so focused on creating a dynamic web page that they forget that most website traffic today comes through search engines, search engines which have a hard time crawling and indexing the JavaScript in standard AJAX sites. In plain English, that means that in Google’s eyes, many of your AJAX site’s pages are invisible when it comes to search engine ranking. Compounding the problem is the fact that standard AJAX implementations use only a single page or URL for the majority of actions, meaning that not only are your site’s pages indexed poorly, but that your site also has fewer pages in Google’s index.

To help you as you incorporate the benefits of AJAX without making your site blind to search engines, we’ve assembled a guide of the best (and the worst) solutions for creating an AJAX page that maintains the ability to be indexed by Google. Here is our how-to guide on getting Google and AJAX to play nice.

The Wrong Approach

The Little Things

AJAX allows you to incorporate a lot of innovative site design characteristics. Some designers take the AJAX craze too far, however, incorporating AJAX to a degree that hurts their site’s usability and accessibility. Here are a few of the most common problems.

* Making it too simple. Designing sites where AJAX controls everything and serves content on a single page can be a search engine ranking disaster, as your website will have only one URL for everything. Instead, be sure to offer unique sub-links and URLs for popular site features.
* Disabling browser controls. Since standard AJAX does not update your browser’s history, simple actions like hitting your browser’s back and forward buttons are rendered useless. Although you may not traverse your website via browser buttons, many users do. Make sure you’re not overdoing AJAX so much that users get lost in your website.
* Not using Google’s Webmaster Tools. These tools are a simple and reliable way to keep track of which pages of your website Google is indexing.
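
The first two problems above can both be mitigated by mirroring AJAX state in the URL fragment, so every view gets its own address and the back button works. Here is a minimal sketch; the state shape and function names are our own, not from any particular library:

```javascript
// Serialize a simple view state ({section, page}) into a URL hash
// fragment, so each AJAX view gets its own bookmarkable address.
function encodeState(state) {
  return '#' + encodeURIComponent(state.section) + '/' + (state.page || 1);
}

// Parse the hash back into a state object; fall back to a default
// view when the hash is empty or malformed.
function decodeState(hash) {
  var parts = (hash || '').replace(/^#/, '').split('/');
  if (!parts[0]) return { section: 'home', page: 1 };
  return {
    section: decodeURIComponent(parts[0]),
    page: parseInt(parts[1], 10) || 1
  };
}

// In a browser, keep the hash and the page content in sync, so the
// back/forward buttons traverse AJAX views like normal pages.
if (typeof window !== 'undefined' && window.addEventListener) {
  window.addEventListener('hashchange', function () {
    var state = decodeState(window.location.hash);
    // A render(state) call would fetch and display the matching view here.
  });
}
```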


Cloaking

If Google cannot read the JavaScript components of your site, an obvious solution is to just provide an alternative, readable version for Google, right? Wrong. Cloaking is a bad approach to making your AJAX website search engine friendly, because it is likely to get your site blacklisted and removed from Google’s index entirely. Cloaking occurs when a developer creates two distinct versions of the same website, with the second version (usually plain HTML/text) visible only to search engine spiders. Spammers have long used the technique to hide popular phrases and links in the invisible content in order to artificially rank better in search engines, and Google has responded by banning these sites. To make sure Google doesn’t confuse your site with one that’s up to no good, make sure your crawlable site is derived from the same content your visitors are seeing.

DHTML Layering

If you’ve read other articles on making AJAX sites search engine friendly, you’ve probably read that you should create DHTML layers of your content. While that solution might be effective for producing an attractive interface, it is absolutely the wrong approach to achieving better search engine rankings, as DHTML (typically) requires JavaScript to function properly. So how could you fix a JavaScript compatibility problem by throwing more JavaScript at it? You can’t. Still not convinced? Try reading Google’s Webmaster Guidelines and check out this case study.

Abandoning AJAX

Perhaps the easiest (read: laziest) solution is to abandon AJAX altogether in favor of server side scripting languages such as PHP or Perl. While these languages are powerful enough to perform virtually any operation, they can’t reproduce the interactive feel of an AJAX website. We’re not saying you should avoid server side scripting languages entirely; however, for front-end pages that involve a lot of simple interaction with the user, we’re all for AJAX.

The Right Approach

Getting Started

Here are a few general tips you should follow when developing a search engine friendly AJAX website.

* Design your site with degradable AJAX, so that users with JavaScript disabled see a working version of your website, just as JavaScript-enabled visitors do.
* Once you’ve established a working non-AJAX version of your website, go back and layer AJAX enhancements on top wherever you want them.
* When designing, check your website with JavaScript disabled, as well as through the eyes of a text-only browser such as Lynx or SEO-Browser.
* Perform a browser check to make sure the user has JavaScript enabled, so that you’re only serving AJAX pages to users who can view them.

It is extremely important to follow this methodology for widely used site features such as navigation bars. After all, if the user is unable to traverse the pages of your website, what is the point of offering content?
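
The browser-check tip above boils down to feature detection: test for the objects AJAX actually needs rather than sniffing user-agent strings. A minimal sketch (the function name and the js-enabled class are our own conventions):

```javascript
// Feature-detect AJAX support on a window-like object rather than
// sniffing user-agent strings; only enhance pages when this is true.
function canUseAjax(global) {
  return !!(global && (global.XMLHttpRequest || global.ActiveXObject));
}

// In a browser, mark the document as script-capable so stylesheets
// and later scripts can key the AJAX-enhanced behavior off this class.
if (typeof window !== 'undefined' && canUseAjax(window)) {
  document.documentElement.className += ' js-enabled';
}
```

Because the check runs as the page loads, visitors without JavaScript simply never get the class, and your navigation keeps working as plain links.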


Hijax

Hijax is potentially the most reliable and resourceful solution for developing an AJAX website that renders properly for every user (JavaScript enabled or not) and is also friendly to Google’s spiders. To employ the Hijax model, start by building a standard website equipped with basic links and server side form submissions. From there, use JavaScript to intercept those links, submissions and other user interactions and pass the information via XMLHttpRequest. Through this process, your website can selectively update individual parts of the page in “real time” without constant reloading. Confused yet? We were too at first, but after following this easy guide everything suddenly became clearer.
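
The interception step can be sketched as follows. The a[data-hijax] selector, the content element ID, and the .fragment.html naming convention are our own assumptions for illustration, not part of any standard:

```javascript
// Map a plain page URL to the (hypothetical) server endpoint that
// returns only the updated fragment of that page.
function fragmentUrl(href) {
  return href.replace(/\.html$/, '') + '.fragment.html';
}

// Hijax: start from working links, then intercept clicks and load the
// corresponding fragment via XMLHttpRequest instead of a full reload.
if (typeof document !== 'undefined' && document.addEventListener) {
  document.addEventListener('click', function (event) {
    var link = event.target.closest && event.target.closest('a[data-hijax]');
    if (!link) return; // not one of our enhanced links: let it navigate
    event.preventDefault();
    var xhr = new XMLHttpRequest();
    xhr.open('GET', fragmentUrl(link.getAttribute('href')));
    xhr.onload = function () {
      document.getElementById('content').innerHTML = xhr.responseText;
    };
    xhr.send();
  });
}
```

Because the href attributes are real URLs, spiders and script-less browsers follow them as ordinary links, while enhanced browsers get the in-place update.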

Excerpt from Progressive Enhancement with Ajax (the original code samples did not survive extraction; these snippets reconstruct the well-known progression the labels refer to):

JavaScript pseudo-protocol – AWFUL!

```html
<a href="javascript:showHelp()">contextual help</a>
```

Pointless link – BAD!

```html
<a href="#" onclick="showHelp(); return false;">contextual help</a>
```

Using the DOM – BETTER.

```html
<a href="help.html" onclick="showHelp(); return false;">contextual help</a>
```

No inline JavaScript – BEST!

```html
<a href="help.html" class="help">contextual help</a>
```

In the last version, a script finds links with the help class and attaches the behavior itself, so the markup remains a working link for spiders and script-less visitors alike.


The Noscript Method

You can generate a degree of search engine recognition for your AJAX web pages by using the noscript element, which “allows authors to provide alternate content when a script is not executed.” For more information on implementing the noscript technique, check out this informative article.
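
As a sketch of the idea (the element ID and filename here are made up for illustration):

```html
<!-- The div is filled in by script for AJAX-capable visitors; the
     noscript block gives crawlers and script-less browsers a plain
     link to the same content. -->
<div id="latest-news"></div>
<noscript>
  <a href="news.html">Read the latest news</a>
</noscript>
```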


Sitemaps

Now that you have incorporated some of the search engine friendly AJAX design recommendations, you can get the ball rolling for your website by submitting a Sitemap index file to Google. Sitemaps allow a “webmaster to inform search engines about URLs on a website that are available for crawling.” Keep in mind that simply submitting a sitemap isn’t enough; you’ll need to engage in a combination of the previously mentioned techniques for the best results. For more information, check out the Sitemaps article on Wikipedia.
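
A minimal sitemap file follows the sitemaps.org protocol and looks like this (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- List every crawlable URL, including the unique URLs you
       expose for AJAX-driven views. -->
  <url>
    <loc>http://www.example.com/</loc>
    <changefreq>daily</changefreq>
  </url>
  <url>
    <loc>http://www.example.com/news.html</loc>
  </url>
</urlset>
```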

At this point you should have enough information and resources to build an AJAX enhanced, search engine friendly website. From basic design principles such as incorporating degradable AJAX into your site to utilizing Noscript techniques, you can create an AJAX website that is not only accessible to both JavaScript enabled and disabled users, but also one that will be well-indexed by search engines.