How to Use Robots.txt for Your Proxy Websites

If you are operating a free web proxy and do not use a robots.txt file, you may find trouble coming your way from angry webmasters claiming that you have stolen their web content. If you do not understand why, then at least remember the term "proxy hijacking" well. You see, when a proxy user employs your free web proxy to retrieve another website's contents, those contents are rewritten by the proxy script and appear to be hosted on your proxy website automatically. What used to be on other websites becomes your content after proxy users visit those third-party websites through your proxy.
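To see why the rewritten pages look like your own content, consider the URL form that Glype-style scripts use by default: the third-party address is URL-encoded into a query parameter of the proxy's browse.php, so every fetched page appears to live under the proxy's own domain. Here is a minimal Python sketch of that URL shape (the host proxy.example.com and the exact parameter name are illustrative assumptions, not taken from any particular script):

```python
from urllib.parse import parse_qs, quote, urlparse

def proxify(target_url, proxy_host="proxy.example.com"):
    """Build a Glype-style proxy URL: the third-party address is
    URL-encoded into a query string, so the resulting page appears
    to be hosted on the proxy's own domain (hypothetical host/param)."""
    return "http://%s/browse.php?u=%s" % (proxy_host, quote(target_url, safe=""))

url = proxify("http://www.example.org/articles/page.html")
# A crawler sees a page on proxy.example.com, not on example.org:
host = urlparse(url).netloc
# ...even though the real source is still recoverable from the query string:
original = parse_qs(urlparse(url).query)["u"][0]
```

Because every third-party page is served from a path such as /browse.php, a single Disallow rule on that one path is enough to keep crawlers away from all of the rewritten content at once.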

Next, search engine bots from Google, Yahoo, MSN, and others crawl your proxy website, indexing this automatically generated (or so-called stolen) content and associating it with your proxy site. When the real owners and authors of that content search on the engines and find their content listed on your web proxy (and not on their own websites), they get angry and start sending abuse emails to your hosting provider and to the search engines. Your proxy site ends up being removed from the search engine results, which may mean a great loss of web traffic and profits for you.

Some hosting companies will also suspend your hosting account, although this is unlikely with specialized proxy hosting providers, which are used to handling such complaints and know the real cause of the claimed abuse. If you are using AdSense or any other advertising network to monetize your web proxy, these complainers may even go as far as trying to get your AdSense account banned by reporting you as a spammer serving duplicate content.

If you do not know which web proxy script you are using but you know you got it for free, then you are most likely running one of the three big proxy scripts: CGIProxy, PHProxy, or Glype. For convenience, here is a sample robots.txt that works with their default installations:

User-agent: *
Disallow: /browse.php
Disallow: /nph-proxy.pl/
Disallow: /nph-proxy.cgi/
Disallow: /index.php?q*

Copy the above lines into a file named robots.txt and upload it to the root directory of every proxy website. Creating proper robots.txt files is an often forgotten but crucial step for many proxy owners, especially those who run large proxy networks consisting of hundreds of web proxies.
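You can sanity-check the rules before uploading them. Python's standard urllib.robotparser module applies the same prefix-matching logic most crawlers use; here we feed it the sample rules directly rather than fetching them from a live site (the tested paths are just examples):

```python
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /browse.php
Disallow: /nph-proxy.pl/
Disallow: /nph-proxy.cgi/
Disallow: /index.php?q*
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Proxified pages are off limits to all user agents...
blocked = rp.can_fetch("*", "/browse.php?u=http%3A%2F%2Fexample.org")  # False
# ...while the proxy's own front page stays crawlable.
front = rp.can_fetch("*", "/index.php")  # True
```

Note that the Disallow lines must sit directly under the User-agent line with no blank line in between; a blank line ends a record group in robots.txt, leaving the rules orphaned.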

We are sharing all the little things we picked up while running a profitable proxy network of 800+ web proxy servers. Click over to our small free proxy sites to read more and join our ventures. We have nothing to sell, though you may get a headache as we unload tons of insider information. It probably means more work for you, but it will help grow your proxy business.