March 31, 2008, 1:38 pm
I am trying to run 1000 searches against a site, driven by a keyword
file, and I want to automate them. I have copied the site's search
form and modified the POST part of the URL so I can make the
necessary changes for the automation.
// structure of the form
<form method="post" action="http://www.XXXXXXXX.com/processform.php">
<select> <!-- a select box with 100 options -->
The program will be roughly as follows:
While not end of file
1: read a keyword from the file and assign it to $value
For each option $op in the select box
2: select $op
3: enter $value into the input box
4: submit the form
5: save the result to a file, $result
6: parse $result and save it to the database
There is no problem until step 5. When I fill in and submit the
form, a page is returned from the remote host containing a table:
data1 -- data2 -- data3
data1 and data2 are text and data3 is a link, and I want to save
data1, data2, and the data3 link to a database. But once control
passes to the form processor on the remote host, I have no idea how
to complete step 5, that is, how to regain control, save the result
to a file, and parse it.
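For step 6, once the result HTML is in a string, the parsing could be sketched with PHP's DOMDocument. The row layout below (three cells, link in the third) is an assumption based on the description above, not the site's real markup:

```php
<?php
// Parse result HTML of the assumed shape:
//   <tr><td>data1</td><td>data2</td><td><a href="...">data3</a></td></tr>
// Returns one array per row with the two text cells and the link URL.
function parse_result($html) {
    $dom = new DOMDocument();
    @$dom->loadHTML($html); // @ suppresses warnings from sloppy HTML
    $rows = array();
    foreach ($dom->getElementsByTagName('tr') as $tr) {
        $tds = $tr->getElementsByTagName('td');
        if ($tds->length < 3) continue;     // skip header/short rows
        $a = $tds->item(2)->getElementsByTagName('a')->item(0);
        $rows[] = array(
            'data1' => trim($tds->item(0)->textContent),
            'data2' => trim($tds->item(1)->textContent),
            'link'  => $a ? $a->getAttribute('href') : null,
        );
    }
    return $rows;
}
```

Each returned array can then be written to the database with a prepared INSERT statement.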
Also, this is the only approach I can think of, but it is not
necessarily the most feasible one. If you can think of other options,
I am all ears :)
Thank you very much for your kind response.
Re: web crawling program
You will need to use cURL or similar to submit the form so you can get
the information back.
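A minimal sketch with PHP's cURL extension, assuming the form fields are named `keyword` and `option` (substitute the actual name attributes from the site's form):

```php
<?php
// Submit one search via POST and return the result page as a string,
// instead of letting the browser display it.
// The field names 'keyword' and 'option' are assumptions.
function submit_search($keyword, $option) {
    $ch = curl_init('http://www.XXXXXXXX.com/processform.php');
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
        'keyword' => $keyword,
        'option'  => $option,
    )));
    // Return the HTML from curl_exec() rather than printing it.
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $result = curl_exec($ch);
    curl_close($ch);
    return $result;
}
```

In the loop from the pseudocode you would call `submit_search($value, $op)` once per keyword/option pair, save the returned string to a file, and then parse it.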
And BTW - do you have permission to do this? If I saw someone doing
this on one of my sites, they'd be blocked immediately - if not sooner.
JDS Computer Training Corp.