fetch many URLs
Posted on February 10, 2005, 3:11 am
I use

$tmp = get('http://example.com/');

in a loop to fetch several webpages.
When they are all downloaded I parse the data and display some results.
This kinda sucks right now because it waits for one URL to finish loading
before starting the next one.
Can someone point me to a tutorial or something on how to use a callback
function with a timeout option to do this?
Any suggestions would be good. All I really want is a way to load all the
URLs at once and know when they are done loading, or maybe wait X seconds
or so and then move on.
Remember, I am really new to Perl.
Thanks for your help
Re: fetch many URLs
> This kinda sucks right now because it waits for one URL to finish
> loading before starting the next one.
So it doesn't suck :-D
For sucking in parallel see LWP::Parallel::UserAgent.
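A rough sketch of what that looks like, assuming LWP::Parallel is installed
from CPAN (it is not a core module); the URLs and the timeout values are
placeholders, not anything from the original post:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use LWP::Parallel::UserAgent;
use HTTP::Request;

my @urls = (
    'http://example.com/',   # placeholder URLs
    'http://example.org/',
);

my $pua = LWP::Parallel::UserAgent->new();
$pua->timeout(10);    # per-connection timeout in seconds
$pua->redirect(1);    # follow redirects

for my $url (@urls) {
    # register() queues a request; it returns an error response
    # object only if the request could not be registered
    if (my $res = $pua->register(HTTP::Request->new(GET => $url))) {
        print STDERR $res->error_as_HTML;
    }
}

# wait() blocks until all registered requests have finished, or
# until the optional global timeout (seconds) expires
my $entries = $pua->wait(15);

for my $key (keys %$entries) {
    my $res = $entries->{$key}->response;
    printf "%s: %s (%d bytes)\n",
        $res->request->url, $res->code, length $res->content;
}
```

All the requests run at once, and the `wait(15)` call gives you the
"wait X seconds or so then move on" behaviour asked about above.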
(go to CPAN: http://www.cpan.org/ e.g. http://search.cpan.org/search?
John Small Perl scripts: http://johnbokma.com/perl/
Perl programmer available: http://castleamber.com/
Happy Customers: http://castleamber.com/testimonials.html