Robots.txt blocking user sites


Hi All,

I have a question regarding the use of the robots.txt file. I want to
allow access to all folders and files of all the users except one.
Every user has a site, and I want to block
all robots from the site that belongs to a user called test.
How should my robots.txt be written?

Could this work:

User-agent: *
Disallow: /home/test/public_html

Or maybe:

User-agent: *
Disallow: /~test/

Any ideas?
Thank you all,

Re: Robots.txt blocking user sites


The latter is correct: robots.txt rules are matched against URL paths as
visitors see them (/~test/), not against filesystem paths on the server
(/home/test/public_html). And since all pages are allowed by default, you
don't need any extra rules for the other users' sites.

So you can reduce your robots.txt to just this:
   User-agent: *
   Disallow: /~test/
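
As a quick sanity check, you can feed those two lines to Python's
standard-library robots.txt parser (a sketch; the host name example.com
and the bot name SomeBot are just placeholders):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt suggested above, as a string.
rules = """\
User-agent: *
Disallow: /~test/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Pages under /~test/ are disallowed for every crawler...
print(rp.can_fetch("SomeBot", "http://example.com/~test/index.html"))  # False
# ...while the other users' sites remain crawlable.
print(rp.can_fetch("SomeBot", "http://example.com/~alice/"))  # True
```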

As usual when someone mentions robots.txt, I feel like I need to address
a common misconception (just in case): robots.txt *prohibits* entry but
doesn't *prevent* entry. Analogously, a "NO ADMITTANCE" sign on a door
prohibits entry, but you'll need to put a lock on the door to actually
prevent people from getting in.
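
If you genuinely need to keep automated clients out of /~test/, the
"lock" has to live on the server. One possible sketch, assuming Apache
2.4 with overrides enabled, is an .htaccess file in that directory
(misbehaved robots ignore robots.txt, but they can't ignore a 403):

```apache
# Hypothetical .htaccess placed in /~test/ — refuses every request,
# which, unlike robots.txt, actually prevents access.
Require all denied
```

Real setups would usually combine this with authentication rather than
denying everyone, but the point stands: robots.txt is advisory only.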

Have you read the robots.txt Web site? It's short and can tell you what
you need to know.


Philip /
Whole-site HTML validation, link checking and more

Re: Robots.txt blocking user sites


Thanks for the reply! I'll check the website you mention, too.


