Cloaking, the practice of showing a user one view of your page and a
search engine crawler another, is (I believe) fairly common. It's been
hotly debated in the past and is discouraged by the search engines.
In reality I've seen sites use cloaking to strip navigation elements,
hide form data, disable multi-page results, and so on, increasing search
engine ranking while at the same time decreasing the page size served to
crawlers and the number of pages that need to be crawled. Given that
crawlers represent over a third of page views for many high-traffic sites,
cloaking has its economic advantages as well.
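The kind of stripping described above is simple to implement; here's a minimal sketch of user-agent-based cloaking, where the crawler signatures and page fragments are illustrative placeholders of my own, not any real site's code:

```python
# Minimal sketch of user-agent-based cloaking. The crawler signatures
# and markup fragments are hypothetical placeholders.
CRAWLER_SIGNATURES = ("googlebot", "slurp", "msnbot", "teoma")

def is_crawler(user_agent):
    """Return True if the user agent looks like a known search crawler."""
    ua = user_agent.lower()
    return any(sig in ua for sig in CRAWLER_SIGNATURES)

def render_page(user_agent, content):
    """Serve a stripped-down page to crawlers and the full page to users."""
    if is_crawler(user_agent):
        # No navigation, no search form: smaller page, fewer links to crawl.
        return f"<html><body>{content}</body></html>"
    nav = "<div id='nav'><form action='/search'>...</form></div>"
    return f"<html><body>{nav}{content}</body></html>"
```

Note that the actual content is identical for both audiences; only the chrome around it changes, which is the sort of cloaking I'm defending here.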
In my opinion there is nothing wrong with cloaking, and it can be
used effectively as long as it's not abused and the basic underlying
content and concept of a page stay the same. It's a bad thing when used
to artificially inflate keyword density or misrepresent a page's content
simply to get page views.
I've always been a little hesitant to talk about effective cloaking
techniques with colleagues at other companies, though. Cloaking
seems to be a dirty secret that most people feel they shouldn't talk
about. Just observe how most people who work for a website that
economically depends on search engine traffic will publicly deny doing it.
Anyhow, I was surfing around with my user agent
set to Googlebot to see which other sites are engaging in a little
cloakatude. Lo and behold,
Amazon behaves differently when you send it Googlebot's user agent.
No search box in the top nav… makes sense… crawlers don't use
forms anyhow, so why spend money on bandwidth serving them up a form
they'll just ignore?
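This sort of check is easy to reproduce with a few lines of Python using only the standard library; the user-agent strings below are assumptions for illustration, not the exact ones I used:

```python
# Fetch the same URL while impersonating different user agents and
# compare the responses. The user-agent strings are examples.
import urllib.request

def fetch_as(url, user_agent):
    """Fetch url with the given User-Agent header and return the body."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Example (requires network access):
# browser = fetch_as("http://www.amazon.com/", "Mozilla/5.0")
# crawler = fetch_as("http://www.amazon.com/",
#                    "Googlebot/2.1 (+http://www.google.com/bot.html)")
# A cloaking site will return noticeably different bodies for the two.
```

A browser extension that switches the user agent does the same job interactively, which is how I stumbled onto the Amazon behavior.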
I found other major sites doing similar cloaking as well: not
bothering to serve ad code to known crawlers, removing unnecessary
content, and so on. Anyhow, it just goes to show that some degree of
cloaking seems to be happening at many major websites. I for one think
it's fine. Why spend tens of thousands of dollars in bandwidth serving up
extra pages or content that isn't relevant to crawlers anyhow?