- scripting web 2.0 sites to download howto
- Posted on November 14, 2010, 3:28 pm
I can't script downloads from web 2.0 sites using tools like wget or LWP because the sites don't have static content. On many web sites, even after you fill in a field and the site 'returns' the data (such as links to files, or even worse, links to pages with links to files), the data is not 'visible' in the sense that you cannot save the content of the screen and then reach the links on the page from a script. You must highlight each link, copy its location by hand, and then write a script to do your downloading.
I see that there are JavaScript tools that can be integrated into the browser to work with this, such as Greasemonkey and Chickenfoot. Are there any similar Perl tools?
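For contrast, here is the static-content case that does script easily: once you have the saved HTML of a page, harvesting its links takes only a few lines of Perl. This is a minimal, core-only sketch using a naive regex (real code would use a proper parser such as HTML::LinkExtor, and the URLs below are placeholders); it is exactly this approach that breaks when the links are generated by JavaScript after the page loads.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Naive link extractor for *static* pages: pull href targets out of
# saved HTML. On a "web 2.0" site the saved source contains no such
# links, because they are built client-side by JavaScript.
sub extract_links {
    my ($html) = @_;
    # Crude regex match; good enough for simple, well-formed markup.
    my @links = $html =~ /<a\s[^>]*href\s*=\s*["']([^"']+)["']/gi;
    return @links;
}

# Placeholder HTML standing in for a saved page.
my $html = <<'HTML';
<html><body>
<a href="http://example.com/file1.zip">file 1</a>
<a href="http://example.com/file2.zip">file 2</a>
</body></html>
HTML

print "$_\n" for extract_links($html);
```

The extracted URLs could then be fed straight to wget or to LWP::UserAgent for downloading, which is the workflow that dynamic sites defeat.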