October 4, 2005, 5:42 am
It has been reported that only about 1/3 of the entire web is indexed, so the traditional approach of crawling the web with a single fleet of commercial crawlers doesn't really address the task. Grub, the distributed crawler, also hasn't delivered what I expected. Is there a better approach that could cover the web more efficiently?

Also, the revisiting strategy seems to rely only on probability analysis. Is there a better mechanism for addressing that task? Any suggestions would be appreciated.
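For context, by "probability analysis" I mean the usual approach of modelling page changes as a Poisson process: estimate each page's change rate from past revisits, then pick a revisit interval that keeps the cached copy fresh with some target probability. A minimal sketch of that idea (the function names and the 0.8 freshness target are my own, just for illustration):

```python
import math

def estimate_change_rate(observations, interval):
    """Estimate a page's change rate (lambda) under a Poisson model.

    observations: list of booleans, True if the page had changed when
    we revisited it; interval: hours between the revisits.
    Uses the estimator lambda = -ln(1 - X/n) / interval, where X is
    the number of visits that found a change out of n visits.
    """
    n = len(observations)
    x = sum(observations)
    if x == n:
        # Page changed on every visit: the estimate would be infinite,
        # so apply a crude correction to keep it finite.
        x = n - 0.5
    return -math.log(1 - x / n) / interval

def next_revisit_interval(lam, target_freshness=0.8):
    """Choose a revisit interval t so that the probability the cached
    copy is still fresh at the next visit meets the target:
    P(no change within t) = exp(-lambda * t) >= target_freshness.
    """
    return -math.log(target_freshness) / lam

# Example: a page that changed on 1 of 4 daily revisits.
lam = estimate_change_rate([True, False, False, False], 24.0)
print(next_revisit_interval(lam))  # hours until the next visit
```

The limitation I am asking about is visible here: everything hinges on the estimated rate, which is poor for pages we have rarely visited, and the model assumes changes arrive independently at a constant rate.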