User tracking - abuse?


I have a web site that seems to be getting hammered by lots of bots.

Some are genuine, some are questionable, and some are just users who seem to be  
downloading the site en masse.  I have started using PHP and MySQL to log the  
IP, host, browser, and proxy of every visitor, and I am trying to gauge how many  
pages to serve within one hour before restricting access.
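For what it's worth, the "pages per hour before restricting access" idea is essentially a sliding-window rate limit per IP.  A minimal sketch of that logic (in Python rather than the PHP/MySQL setup described above, and with a made-up threshold of 300 requests per hour purely for illustration):

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 3600   # one-hour window, as in the post
MAX_REQUESTS = 300      # hypothetical threshold -- tune per site

# request timestamps per client IP (in a real setup this would be the MySQL table)
hits = defaultdict(deque)

def allow_request(ip, now=None):
    """Return True if this IP is still under the hourly page limit."""
    now = time.time() if now is None else now
    q = hits[ip]
    # drop timestamps that have aged out of the one-hour window
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= MAX_REQUESTS:
        return False          # over the limit: restrict access
    q.append(now)
    return True
```

The same pruning-and-counting could be done in SQL with a timestamp column and a `DELETE ... WHERE ts < NOW() - INTERVAL 1 HOUR` plus a `COUNT(*)` per IP.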

What do you consider to be a reasonable number of visits per hour?  Do many  
bots rush through a site?

I know this is a vague question, but at least some of you may have  
experience in this area.  Before you say it: I have tried Googling for  
this, with no luck at all.

Is there a better way to block abuse of a site?


Re: User tracking - abuse?

Nel wrote:


Hi Nel,

Why do you want to restrict access to your site?
If your site is popular, I expect the extra bandwidth bill from your ISP  
is a luxury problem.

I have also used bots myself to download a site onto my local hard drive for  
offline browsing (this was before ADSL was cheap).

So why do you want to restrict your visitors?

Just my 2 cents.

Erwin Moller

Re: User tracking - abuse?

"Erwin Moller"  

It was mainly for bots that refuse to abide by the robots.txt file.  
LinkWalker is one that springs to mind - it just keeps on taking.
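For reference, the polite way to turn a crawler away is a robots.txt rule like the one below - but since the whole complaint here is that bots like LinkWalker ignore the file, this only works for well-behaved crawlers, and anything else has to be blocked server-side (by IP or User-Agent):

```text
# robots.txt -- honoured only by well-behaved crawlers
User-agent: LinkWalker
Disallow: /
```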

