Search engine cloaking: it’s the intent that matters

Back at the end of 2005 I wrote a short post titled Cloaking, no need to be ashamed, and now in 2007 even more big sites are practicing some form of crawler-targeted cloaking. Yet most SEOs will still give you a blanket answer and tell you to avoid cloaking so you don’t get delisted. I take a more pragmatic view, and experience has taught me that certain forms of cloaking can be good!

Cloaking comes down to intent, and to that end I’d like to illustrate a few legitimate (in my opinion) forms of cloaking in the real world:

1. Amazon: set your user agent to Googlebot and you won’t see a search box anymore. Crawlers don’t fill out forms, and Amazon saves some bandwidth.
2. Yahoo: almost every link is rewritten to go through a click-tracking redirect, but not for crawlers. Yahoo doesn’t count clicks from crawlers, which is good for their metrics, and the sites they link to get non-redirected inbound links.
3. Salon: crawlers get the real home page without having to sit through a full-page ad first. I think I’ll start surfing Salon as Googlebot from now on 😉
4. MSN and Myspace don’t serve ad-serving code to crawlers, saving bandwidth.
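All four examples boil down to the same mechanism: inspect the User-Agent header and branch on whether it looks like a crawler. Here’s a minimal sketch of example 2 in Python; the crawler tokens and the tracking-redirect URL are my own illustrative assumptions, not what any of these sites actually run.

```python
# Common crawler substrings circa 2007 (Googlebot, Yahoo Slurp, msnbot).
# Illustrative only -- real sites maintain much longer lists.
CRAWLER_TOKENS = ("googlebot", "slurp", "msnbot")

def is_crawler(user_agent: str) -> bool:
    """Crude check: does the User-Agent header name a known crawler?"""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def rewrite_link(href: str, user_agent: str) -> str:
    """Send human visitors through a click-tracking redirect, but give
    crawlers the direct URL so link targets get clean inbound links."""
    if is_crawler(user_agent):
        return href
    # Hypothetical tracking endpoint, for illustration only.
    return "http://example.com/track?url=" + href

print(rewrite_link("http://example.org/", "Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(rewrite_link("http://example.org/", "Mozilla/4.0 (compatible; MSIE 6.0)"))
```

The same `is_crawler` branch covers the other examples too: skip rendering the search form, skip the interstitial ad, or skip the ad-serving script when it returns true. Note that matching on User-Agent alone is trivially spoofable, which is exactly why I can surf Salon as Googlebot.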

In all of the above cases these sites are serving adjusted content targeted at search engine user agents, and generally it’s for the good. In the end, the decision to cloak or not comes down to good judgment and intent. If it helps crawlers avoid duplicate content or saves bandwidth, then we shouldn’t be so quick to write off cloaking just because it has such a negative connotation. That said, if there’s any doubt in your mind, be very wary and seek expert opinions, because getting delisted for bad cloaking can result in a lot of lost revenue!

This entry was posted in Search Engine Optimization, Web.