Block Entire Subdomain with Robots.txt

Remove an Entire Subdomain from the Search Index

You might need to do this if you use your subdomain as a test site, where leaving it unblocked would create duplicate content in the index. It’s really an easy fix: the robots.txt file needs to go only on the root of the SUBDOMAIN. Sorry for yelling, but placing it on the root of your primary domain instead would be a neutron bomb, blocking crawlers from your entire site.

You’ll need to create a robots.txt file and place it in the root of the subdomain, then add the code to direct the bots to stay away from the entire subdomain’s content. Like this:

User-agent: *
Disallow: /

And that’s all there is to it. If your duplicate pages have already been indexed, they’ll drop out eventually.
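If you want to sanity-check the rules before (or after) uploading the file, Python’s standard-library `urllib.robotparser` can parse the same two lines and tell you whether a given URL is fetchable. A minimal sketch, using `test.example.com` as a placeholder for your subdomain:

```python
from urllib import robotparser

# The same two rules from the robots.txt above.
rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Every path on the subdomain is now blocked for every user agent.
print(rp.can_fetch("Googlebot", "https://test.example.com/any/page"))  # False
```

Once the file is live, you can point `rp.set_url("https://test.example.com/robots.txt")` followed by `rp.read()` at the real subdomain instead of parsing the lines by hand.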

2 replies
  1. alice says:

    “this would be a neutron bomb on your primary domain.”
    Does it mean it will also restrict search engines from crawling the main domain?
  2. Vinith M says:

    This trick works only if we have access to the subdomain’s code and hosting. What if my domain points to an external IP that is not in my control?
