scripting web 2.0 sites to download howto


It is getting harder and harder to script downloads from websites
using tools like wget or LWP, because many sites no longer serve
static content.

On many sites, even after you fill in a field and the site 'returns'
the data (such as links to files, or worse, links to pages containing
links to files), the data is not 'visible' in the sense that you
cannot save the page source and extract the links from it with a
script. Instead you have to highlight each link, copy its location by
hand, and only then write a script to do the downloading.
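For contrast, the classic static approach looks roughly like this (a
sketch using WWW::Mechanize; the URL and the .zip link pattern are
made up). It only works when the links are already present in the
served HTML:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use WWW::Mechanize;

    # Classic static scrape: fetch the page and pull links out of the
    # raw HTML. Anything generated by JavaScript after the page loads
    # is invisible to this approach.
    my $mech = WWW::Mechanize->new();
    $mech->get('http://example.com/downloads');    # hypothetical URL

    for my $link ( $mech->find_all_links( url_regex => qr/\.zip$/ ) ) {
        my $url = $link->url_abs->as_string;
        ( my $file = $url ) =~ s{.*/}{};           # basename for local save
        print "fetching $url -> $file\n";
        $mech->mirror( $url, $file );              # download the file
    }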

I see that there are JavaScript tools for this that can be
integrated into the browser, such as Greasemonkey and Chickenfoot.

Are there any similar Perl tools?


Re: scripting web 2.0 sites to download howto


Perl supports a Selenium client interface:
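A minimal sketch using the Selenium::Remote::Driver module from CPAN;
the URL, the form-field and button locators, and a Selenium server
listening on localhost:4444 are all assumptions:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use Selenium::Remote::Driver;

    # Drive a real browser through a Selenium server so the page's
    # JavaScript actually runs and the generated links end up in the DOM.
    my $driver = Selenium::Remote::Driver->new(
        remote_server_addr => 'localhost',    # assumes a Selenium server here
        port               => 4444,
        browser_name       => 'firefox',
    );
    $driver->set_implicit_wait_timeout(5000); # wait up to 5s for elements

    $driver->get('http://example.com/downloads');              # hypothetical URL
    $driver->find_element('q', 'name')->send_keys('report');   # hypothetical form field
    $driver->find_element('submit', 'id')->click();            # hypothetical button

    # Collect the links now present in the live, post-JavaScript DOM.
    my $links = $driver->find_elements('//a[contains(@href, ".zip")]', 'xpath');
    print $_->get_attribute('href'), "\n" for @$links;

    $driver->quit;

The collected URLs can then be fed to wget or LWP as before; only the
link discovery needs the browser.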

Charles DeRykus
