
Cloaking and IP Delivery (Black Hat SEO)

IP Delivery for Search Engines



This is gray-hat or black-hat SEO. Using IP delivery to serve search engine spiders different content than what is presented to actual users is called "cloaking", and it is against most search engines' webmaster guidelines.

The result can be a complete ban of the site that uses this method (getting "burned to the ground" or "torched" by the search engines, as it is called in SEO circles) when it is detected. And it is not really a question of if, but when; the only open question is what the consequences will be once it is discovered.

There are a few instances where this type of IP delivery is acceptable and tolerated by the search engines (without them admitting it).

There are practical, real-world business scenarios where this is the only option available. As long as it is not used in any way to deceive the user or the search engines, most will probably not even notice, and the few that do are unlikely to take action against your site if no harm was done.

Once you push the limits, chances are it will be noticed and your site or your client's site will get into trouble. The temptation is great to push things a little bit at a time until you eventually cross the threshold from what is tolerated into what is simply "spam" (no matter how you try to spin it).

IP Delivery Solutions from Beyond Engineering (paid subscriptions) is the most respected provider of up-to-date IP address information for search engine spiders. Their reputation is that they know about a new spider with a new IP address within minutes of its launch. I cannot attest to that myself, because I do not have a subscription with them.
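To illustrate the mechanics, here is a minimal sketch in Python of how IP delivery could work on the server side. The spider network range, the file names, and the sample addresses are hypothetical placeholders; a real setup would check against a maintained spider IP database like the subscription service mentioned above.

import ipaddress

# Hypothetical spider ranges; a real list would come from a maintained
# IP database and needs constant updating.
SPIDER_NETWORKS = [ipaddress.ip_network("66.249.64.0/19")]  # example crawler range

def is_search_engine_spider(remote_addr: str) -> bool:
    """True if the visitor's IP address falls inside a known spider range."""
    ip = ipaddress.ip_address(remote_addr)
    return any(ip in network for network in SPIDER_NETWORKS)

def select_page(remote_addr: str) -> str:
    # IP delivery: spiders get one page variant, everybody else another.
    if is_search_engine_spider(remote_addr):
        return "spider_version.html"
    return "visitor_version.html"

print(select_page("66.249.66.1"))   # inside the example range -> spider_version.html
print(select_page("203.0.113.7"))   # outside the range -> visitor_version.html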



User-Agent based Cloaking



"Poor Mans Cloaking" is done by checking the user-agent of the visitor and then serve different content to visitors with the user-agent of a known search engine spider than to anybody else. See the spiders resources page for resources about user-agents used by search engine spiders.

An example of a possible user-agent value for the Google bot (crawler/spider):

Googlebot/2.1 (+http://www.googlebot.com/bot.html)
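A minimal sketch of the technique in Python, assuming a simple substring match against a hypothetical list of spider tokens (the tokens and file names are placeholders, not an authoritative list):

# Hypothetical spider tokens to look for in the User-Agent header.
SPIDER_TOKENS = ("Googlebot", "Slurp", "msnbot")

def select_page(user_agent: str) -> str:
    # "Poor man's cloaking": branch purely on the self-reported user-agent.
    if any(token in user_agent for token in SPIDER_TOKENS):
        return "spider_version.html"
    return "visitor_version.html"

print(select_page("Googlebot/2.1 (+http://www.googlebot.com/bot.html)"))  # spider_version.html
print(select_page("Mozilla/5.0 (Windows; U; en) Firefox"))                # visitor_version.html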

The problem is that user-agents can easily be spoofed by anybody with some programming knowledge. There are even Firefox browser plugins that let you change your user-agent without any programming skills at all, and web-based tools that can detect whether a page uses user-agent cloaking. This means that a competitor can easily find out whether you engage in cloaking and report your site to the search engines in an attempt to get it banned.

Cloaking Detecter is a free online tool that can detect whether a website is using user-agent cloaking. It cannot detect cloaking via IP delivery.
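Such a detector essentially fetches the same URL with different user-agents and compares the responses. A rough sketch of the approach in Python (assuming a static page; dynamic content like timestamps or session IDs would need a fuzzier comparison):

import urllib.request

def fetch(url: str, user_agent: str) -> bytes:
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request) as response:
        return response.read()

def looks_cloaked(url: str) -> bool:
    # Request the page once as a regular browser and once as Googlebot;
    # a differing response suggests user-agent cloaking.
    as_browser = fetch(url, "Mozilla/5.0 (Windows; U; en) Firefox")
    as_spider = fetch(url, "Googlebot/2.1 (+http://www.googlebot.com/bot.html)")
    return as_browser != as_spider

print(looks_cloaked("http://www.example.com/"))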

If you want to deliver different content to actual human visitors based on location (country, region/state, city, metro code/ZIP code), organization, ISP, or other criteria, check the resources on IP databases and geo targeting for search marketers and webmasters.
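Geo targeting follows the same delivery pattern, only keyed on the visitor's location instead of a spider list. A small sketch with a hard-coded, purely hypothetical lookup table standing in for a real IP-to-location database:

# Hypothetical IP-to-country mapping; a real site would query one of the
# IP databases from the resource page mentioned above.
GEO_TABLE = {"203.0.113.7": "DE", "198.51.100.2": "US"}

def select_page_by_country(remote_addr: str) -> str:
    country = GEO_TABLE.get(remote_addr, "unknown")
    # Serve a localized variant where one exists, a default page otherwise.
    localized = {"DE": "index_de.html", "US": "index_us.html"}
    return localized.get(country, "index.html")

print(select_page_by_country("203.0.113.7"))   # -> index_de.html
print(select_page_by_country("192.0.2.55"))    # -> index.html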

As I said earlier, there are a few legitimate instances where cloaking should be okay and not trigger a penalty by the search engines if they discover it. I would still suggest keeping a low profile and not making it public, to avoid any possible issues or debates that could arise from it. I also recommend making sure that cloaked pages will not be cached by the search engines. This is done via META elements and ensures that the page cloaked for the search engines cannot be seen by users who click the "cache" link for that page in the Google search results pages.

For example:

<META NAME="GOOGLEBOT" CONTENT="NOSNIPPET"> 
<META NAME="GOOGLEBOT" CONTENT="NOARCHIVE">

This prevents Google from archiving the page (see the post at the Google Blog) and also tells Google not to create a snippet based on the page content. NOSNIPPET automatically triggers NOARCHIVE for Google, but it does not hurt to include the NOARCHIVE tag anyway. As a result, no "cache" link will be shown for your page in the Google search results pages.
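To tie this back to the delivery sketches above: the META elements belong in the variant served to the spiders. A hypothetical sketch of embedding them there:

# The no-cache directives from the example above, as one string.
NOCACHE_META = ('<META NAME="GOOGLEBOT" CONTENT="NOSNIPPET">\n'
                '<META NAME="GOOGLEBOT" CONTENT="NOARCHIVE">')

def build_spider_page(body: str) -> str:
    # Embed the directives only in the page variant served to spiders, so
    # the cloaked content never shows up in the search engine's cache.
    return f"<html><head>{NOCACHE_META}</head><body>{body}</body></html>"

print(build_spider_page("Content for the spiders."))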

