View Cloaked Content on a Hacked Website

Updated: September 25, 2012

Hackers can use cloaking to hide content, usually spam links or spam pages, that they have inserted into your web pages. When a page is served to a web browser the spam content is left out, but when a search engine's crawler requests the page the content is included. In most cases the hacker checks the user agent sent with the request to decide which version of the page to serve. A more advanced method instead checks the IP address the request comes from. There are several tools that you can use to see the cloaked content.
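As a quick first check for the user-agent based version of this, the sketch below (in Python, using a placeholder URL and representative user agent strings you would replace with your own) requests the same page twice, once with a browser user agent and once with Googlebot's, and reports whether the responses differ. It can only reveal user-agent based cloaking; IP-based cloaking will still return the clean page, which is where the tools below come in.

# Minimal sketch: fetch a page with a browser user agent and with
# Googlebot's user agent, then compare the two responses. Only
# user-agent based cloaking will show up here; a hacker checking the
# requesting IP address will still serve this script the clean page.
import urllib.request

URL = "http://example.com/"  # replace with the page you want to check

BROWSER_UA = ("Mozilla/5.0 (Windows NT 6.1; rv:15.0) "
              "Gecko/20100101 Firefox/15.0")
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch(url, user_agent):
    # Request the page with the given User-Agent header and return the body.
    request = urllib.request.Request(url, headers={"User-Agent": user_agent})
    with urllib.request.urlopen(request) as response:
        return response.read()

browser_page = fetch(URL, BROWSER_UA)
googlebot_page = fetch(URL, GOOGLEBOT_UA)

if browser_page == googlebot_page:
    print("The server returned identical content to both user agents.")
else:
    print("The responses differ -- the page may be cloaking content "
          "based on the user agent.")

Keep in mind that dynamic pages can legitimately differ between requests (rotating ads, timestamps, session tokens), so a difference is a reason to look closer, not proof of cloaking.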

You can compare what is being served to Google's crawler, Googlebot, with what is being served to a web browser using a tool we have created.

The User Agent Switcher add-on for the Firefox web browser allows you to change the browser's user agent to that of Google's, Yahoo's, or Microsoft's crawler. After adding the add-on to Firefox, you can change the user agent from the Default User Agent item in the browser's Tools menu. You can then see the rendered version of the web page that is served to the search engine's crawler.

To view exactly what your website is serving to Google, you can use the Fetch as Google feature in Google's Webmaster Tools. If you have not already signed up with Webmaster Tools, you will need to create an account and verify your website. In Webmaster Tools, select your website, click the Fetch as Googlebot link in the Health menu, enter the address of the page you wish to see, and then click Fetch. After a few minutes you will be able to view the page by clicking the Success link.
