Question about testing for page blocking


Hi all,
When I request a link exchange from a site, I normally check its
robots.txt to see if it is playing fair with its links pages.
What other tests can I do to see if a site is blocking pages, on
purpose or otherwise?
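That robots.txt check can be automated. Below is a minimal sketch using Python's standard `urllib.robotparser`; the robots.txt text and the example.com URLs are made up for illustration, and "Googlebot" is just one crawler you might test against.

```python
from urllib.robotparser import RobotFileParser

def is_blocked(robots_txt, page_url, agent="Googlebot"):
    """Return True if the given robots.txt text disallows page_url for agent."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return not parser.can_fetch(agent, page_url)

# A hypothetical robots.txt that hides the links page from all crawlers
sample = """User-agent: *
Disallow: /links.html
"""

print(is_blocked(sample, "http://example.com/links.html"))  # True
print(is_blocked(sample, "http://example.com/index.html"))  # False
```

In practice you would fetch the site's real robots.txt (e.g. with `parser.set_url(...)` and `parser.read()`) rather than pasting its text in, but parsing the text directly makes the check easy to try offline.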

What things should I look out for?

Some links pages have zero PR, which could just mean the page is new,
and checking the cache usually shows no cached copy either.

I just want to be able to check a site so I know whether the pages I
get links from will show up in the search engines.

Thank you for any help on this.

Re: Question about testing for page blocking


Always check the source code, and copy the link into your
browser rather than clicking it.
Have a look at this for an example, /
Hover over the links and they look genuine enough;
copy and paste them and you will see what they are
really doing.
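One way to sketch that source-code check is to pull out each anchor's actual href and flag anything that is not a plain link, such as javascript: URLs, onclick handlers, or links routed through a redirect script. This is only a rough heuristic using Python's standard `html.parser`; the sample HTML and the "redirect" substring test are made-up illustrations, not a complete detector.

```python
from html.parser import HTMLParser

class LinkAuditor(HTMLParser):
    """Collect hrefs, separating plain links from ones that hide their target."""
    def __init__(self):
        super().__init__()
        self.plain = []
        self.suspicious = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href", "")
        # javascript: URLs, onclick handlers, and redirect scripts all mean the
        # visible link is not where the click really goes
        if href.startswith("javascript:") or "onclick" in attrs or "redirect" in href:
            self.suspicious.append(href)
        else:
            self.plain.append(href)

# Made-up links page: one honest link, two that disguise their destination
html = '''<a href="http://example.com/">ok</a>
<a href="javascript:go(42)">looks genuine</a>
<a href="/redirect.php?to=http://example.com/">via redirect</a>'''

auditor = LinkAuditor()
auditor.feed(html)
print(auditor.plain)       # ['http://example.com/']
print(auditor.suspicious)  # ['javascript:go(42)', '/redirect.php?to=http://example.com/']
```

Suspicious links are not automatically bad (plenty of legitimate sites use redirect scripts), but they are exactly the ones worth pasting into the browser by hand.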

Re: Question about testing for page blocking

That's exactly my problem: knowing what to look for. I can see what the
above is doing, but I have no idea if it will show up in the search engines.
Some "funny" URLs do show up in search engines, others don't,
and not all PHP pages show. So it is not a clear-cut test.

In Google you do see PHP files listed, so I am really at a loss to
know exactly how to test for it.
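Besides robots.txt, another thing worth checking in the page source is a robots meta tag with "noindex", which keeps a page out of the index even when robots.txt allows it. A crude sketch of that check (the regex is a simplification and the sample page is made up; a real page might also send the same directive in an X-Robots-Tag HTTP header):

```python
import re

def has_noindex(html):
    """Return True if the page carries a robots meta tag containing 'noindex'."""
    # crude check: a robots meta tag whose content mentions noindex,
    # case-insensitive; assumes name comes before content in the tag
    pattern = re.compile(
        r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
        re.IGNORECASE)
    return bool(pattern.search(html))

page = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'
print(has_noindex(page))                          # True
print(has_noindex('<html><head></head></html>'))  # False
```

A page with this tag can look perfectly normal in the browser and still never appear in the search engines, which is exactly the kind of blocking that is easy to miss.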

Are there any online tools that test for this?

I did save off one links page from another site and opened it up
in Dreamweaver, and all the links were missing. Then I looked and found
that all the code for the links was in a JavaScript file. Naughty them!
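That JavaScript trick can be spotted without opening an editor: fetch the raw HTML and see whether it contains any anchors at all, or only a script reference. A quick sketch (the sample page is a made-up stand-in for a links page whose anchors only exist after JavaScript runs):

```python
from html.parser import HTMLParser

class RawLinkCheck(HTMLParser):
    """Count anchors and external scripts in the raw, un-executed HTML."""
    def __init__(self):
        super().__init__()
        self.anchors = 0
        self.scripts = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and dict(attrs).get("href"):
            self.anchors += 1
        elif tag == "script" and dict(attrs).get("src"):
            self.scripts += 1

# A "links page" whose links are written by JavaScript looks empty in the source
page = '<html><body><script src="links.js"></script></body></html>'
check = RawLinkCheck()
check.feed(page)
# No anchors in the source plus an external script: crawlers likely see no links
print(check.anchors, check.scripts)  # 0 1
```

If the raw source has zero anchors but the page shows links in the browser, the links are being injected by script, and a crawler reading the source will not find them.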

I just don't want to waste my time getting links from sites that will
never show up in the search engines.

I know my own links pages don't show up yet, as they are only a week
or two old (some only days).

Thank you for your help.
Kind regards
