Reading and saving URL content


In my current PHP project, I have to read pages from a website and
parse the data. If I were using a web browser, I would do the
following steps:

Step 1: specify the state to display by using:
            and it will redirect to

Step 2.
Step 3.
and so on...

I can use fopen($url, 'r') and fgets() to get the contents and save
all the data to a file. That's the easy part. But the data returned
at Step 1 is not correct: it returns something like the site's custom
page-not-found error. I guess the redirection is causing that.
If I then continue to read the content for Step 2, I get an empty page;
it seems the 'state' session variable was never set in Step 1.
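For reference, the fopen()/fgets() read loop described above looks roughly like this (the URL and file paths are placeholders, not the real site):

```php
<?php
// Read a stream line by line with fgets() and copy it to another
// stream; returns the number of lines copied.
function save_stream($in, $out): int {
    $lines = 0;
    while (($line = fgets($in)) !== false) {
        fwrite($out, $line);
        $lines++;
    }
    return $lines;
}

// Hypothetical usage against the site (placeholder URL and path):
// $in  = fopen('http://example.com/step1', 'r');
// $out = fopen('/tmp/page.html', 'w');
// save_stream($in, $out);
```

This fetches the raw page, but it sends no cookies, which is why the session state from Step 1 never reaches Step 2.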

Could someone help me with this problem? Thanks,

Re: Reading and saving URL content

You could use an external tool to fetch the pages in question. wget has
been around for many years and has proved to be an excellent tool
for this; it lets you fake headers in case the site you are fetching
from requires a "referer" page in order to serve some of the pages.
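One way to do that from PHP is to build a wget command line with a cookie jar (so the session from Step 1 carries over) and an optional faked Referer header. This is only a sketch; it assumes wget is installed and on the PATH, and all URLs and paths are placeholders:

```php
<?php
// Build a wget command that prints the page to stdout, persists
// session cookies in $cookie_jar, and optionally fakes the Referer.
function wget_cmd(string $url, string $cookie_jar, ?string $referer = null): string {
    $cmd = 'wget -q -O - '
         . '--save-cookies ' . escapeshellarg($cookie_jar) . ' '
         . '--load-cookies ' . escapeshellarg($cookie_jar) . ' '
         . '--keep-session-cookies ';
    if ($referer !== null) {
        $cmd .= '--referer ' . escapeshellarg($referer) . ' ';
    }
    return $cmd . escapeshellarg($url);
}

// Hypothetical usage: Step 1 establishes the session cookie,
// Step 2 reuses it and fakes the referring page.
// $jar   = '/tmp/jar.txt';
// $html1 = shell_exec(wget_cmd('http://example.com/state?st=CA', $jar));
// $html2 = shell_exec(wget_cmd('http://example.com/step2', $jar,
//                              'http://example.com/state?st=CA'));
```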


Re: Reading and saving URL content

On Tue, 03 Jan 2006 03:40:18 -0800, kinh wrote:


See the cURL functions: and on your favorite mirror, of course.
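With PHP's cURL extension (assuming it is installed), the redirect and the session cookie can both be handled in one place: CURLOPT_FOLLOWLOCATION follows the Step 1 redirect, and a shared cookie jar carries the session into Step 2. A minimal sketch, with placeholder URLs:

```php
<?php
// Fetch a URL with cURL, following redirects and persisting
// cookies in $cookie_jar so later requests share the session.
// Returns the page body, or false on failure.
function fetch(string $url, string $cookie_jar) {
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,        // return the page instead of printing it
        CURLOPT_FOLLOWLOCATION => true,        // follow the Step 1 redirect
        CURLOPT_COOKIEJAR      => $cookie_jar, // write cookies here on close...
        CURLOPT_COOKIEFILE     => $cookie_jar, // ...and send them back on later calls
    ]);
    $html = curl_exec($ch);
    curl_close($ch);
    return $html;
}

// Hypothetical usage (placeholder URLs):
// $jar   = tempnam(sys_get_temp_dir(), 'cookies');
// $page1 = fetch('http://example.com/state?st=CA', $jar); // Step 1 sets the session
// $page2 = fetch('http://example.com/step2', $jar);       // Step 2 reuses it
```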
