Reading and saving URL content
Posted on January 3, 2006, 11:40 am
parse the data. If I were using a web browser, I would have to do the
following:
Step 1: specify the state to display, and it will redirect to
Step 2: www.foo.com/list.asp?city=sanjose&page=2
Step 3: www.foo.com/list.asp?city=sanjose&page=3
and so on...
I can use fopen($url, 'r') and fgets() to get the contents and write
all the data to a file. That's the easy part. But the data returned at
Step 1 is not the correct data: it returns something like the site's
custom page-not-found error. I guess the redirection is causing that.
If I then read the content as in Step 2, I get an empty page: it seems
the 'state' session variable was never set in Step 1.
Could someone help me with this problem? Thanks,
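The symptoms above are what you see when the session cookie is dropped between requests: the browser keeps the cookie that Step 1 sets, while separate fopen() calls each start a fresh connection with no cookie, so the 'state' session variable is never there for Steps 2 and 3. A minimal sketch (in Python rather than PHP, and using a local stand-in server instead of the real www.foo.com) showing how a cookie-preserving client succeeds where a cookieless one gets the empty page:

```python
# Demo: why a cookieless fetch gets an empty page while a cookie-aware
# client sees the data. The server below is a stand-in for the real site:
# the first request sets a session cookie that later pages depend on.
import http.cookiejar
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        q = parse_qs(urlparse(self.path).query)
        has_session = "session=1" in (self.headers.get("Cookie") or "")
        self.send_response(200)
        if "city" in q:
            # Step 1: selecting the city establishes the session.
            self.send_header("Set-Cookie", "session=1")
            body = b"city selected"
        elif has_session:
            body = b"listing data"   # Steps 2, 3, ... work with the cookie
        else:
            body = b""               # empty page without the session
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):    # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

# Cookieless client (like separate fopen() calls): gets the empty page.
bare = urllib.request.urlopen(f"{base}/list.asp?page=2").read()

# Cookie-aware client: do Step 1 first, then fetch the listing pages.
jar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor(jar))
opener.open(f"{base}/list.asp?city=sanjose")        # establishes the session
page2 = opener.open(f"{base}/list.asp?page=2").read()

print(bare)    # b''
print(page2)   # b'listing data'
server.shutdown()
```

In PHP the equivalent fix is to keep the session cookie from the Step 1 response and send it back on the later requests, rather than opening each URL cold.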
Re: Reading and saving URL content
You could use an external tool for fetching the pages in question. wget has
been around for many years and has proven to be an excellent tool for this;
it also allows you to fake headers in case the site you are fetching from
requires a referer ("reference page") before it will serve some of the pages.
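If you would rather stay in code than shell out to wget, the same header faking can be done directly. A hedged sketch using Python's urllib (the URLs and referer value here are placeholders, not the site's real pages):

```python
import urllib.request

# Placeholder URLs: substitute the real listing page and the page the
# site expects you to have come from.
req = urllib.request.Request(
    "http://www.foo.com/list.asp?city=sanjose&page=2",
    headers={
        "Referer": "http://www.foo.com/list.asp?city=sanjose",
        "User-Agent": "Mozilla/5.0",   # some sites also check the user agent
    },
)
print(req.get_header("Referer"))
# urllib.request.urlopen(req) would then fetch the page with these headers.
```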