Use robots.txt Disallow directive to forbid spiders and search engine robots
Just like .htaccess, robots.txt resides at the document root of your domain. It's a plain-text configuration file containing directives, or rules, that any well-behaved web spider or search engine robot should respect. While you can use .htaccess to forcibly prohibit any visits (including those of human visitors) to a certain part of your site, robots.txt merely asks compliant crawlers to stay away; it cannot actually enforce anything.
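For example, a minimal robots.txt that asks every crawler to keep out of a hypothetical /private/ directory (the path here is just for illustration) might look like this:

    User-agent: *
    Disallow: /private/

The User-agent line names which robots the rule applies to (* matches all of them), and each Disallow line gives a URL path prefix those robots should not fetch. Keep in mind this is purely advisory: a misbehaving bot can ignore it entirely, which is why access control that actually matters belongs in .htaccess or your server configuration instead.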