How to properly handle lengthy operations?

I'm developing a photo album web application for use on a web site, and
I'm running into a problem with lengthy operations. My application
allows the user to import a number of images at once into a photo
album. For each image that gets imported, I create two thumbnail images
(small and medium) and insert some data into a database. The thumbnail
generation process takes some time and, for a relatively large number of
photos, the application apparently times out. For example, if I import
40 pictures at once, 15 or so get imported successfully, but the app
then seems to time out, and I have to go back and import the images
that were left out.

How can I properly handle such a lengthy operation? Ideally, I would
like to let the user know what's going on (and how much progress has
been made). And I certainly want all of the photos (no matter how many)
to be imported at once.

Are there programming paradigms that work well for situations like
this? What tips do people here have for such a situation? Any help
would be greatly appreciated! Thanks.

Re: How to properly handle lengthy operations?

I haven't needed to use this yet, but I was considering what I would do
if I hit a problem like this just a few weeks ago. What I came up with
was some AJAX feedback.  During the execution of your script, you would
write to a file exactly what you would like to appear as feedback to
the user.  You would then write some JS to grab the contents of that
file and display it on the page.
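
Something along these lines is what I had in mind. Completely untested,
and the progress file path and the $images list are just placeholders:

<?php
// Untested sketch: as each image is processed, overwrite a progress file
// that some JS on the page can poll (via a tiny "read the file" script).
// The path and the $images list are placeholders.
$progressFile = '/tmp/import_progress.txt';
$images = glob('/path/to/uploads/*.jpg');
$total  = count($images);

foreach ($images as $i => $image) {
    // ... create the small and medium thumbnails, insert the DB row ...

    file_put_contents($progressFile,
        sprintf('Finished picture %d of %d', $i + 1, $total));
}
?>

The JS would then just hit a second, tiny script every second or two
that echoes back the contents of that file.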

Haven't really tried it, but in theory it's a sound idea as far as I can
tell.

As for timing out, I'm sure you've heard of set_time_limit()?

I'd like to read other people's ideas.


Re: How to properly handle lengthy operations?



You should have a look at script timeout.
This can be modified in php.ini or via ini_set().

The PHP manual has more info on all of the settings. Some relevant ones:
- max_execution_time: the time your script may take to finish.
- post_max_size: the amount of data a user can post (this includes the
  files you want to upload).
- memory_limit: the amount of memory PHP may use.
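
For example, at the top of the import script (the numbers are only
illustrations, pick values that suit your site):

<?php
// Raise the limits for this request only.
ini_set('max_execution_time', '300');   // seconds the script may run
ini_set('memory_limit', '128M');        // memory PHP may use
// Note: post_max_size (and upload_max_filesize) cannot be changed at
// runtime with ini_set(); set those in php.ini or the server config.
?>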

If you want to report back to the user how far the processing has gone,
simply use output buffering and flush a line to the client browser every
so often, e.g. "finished picture 5 of 36".

Check for ob_start() and ob_flush().
(I am unsure if you actually need ob_start when using ob_flush, but it  
doesn't hurt.)
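
A minimal sketch of that (the $images list is just a stand-in for
however you track the uploaded files):

<?php
// Push a status line to the browser after every picture.
$images = glob('/path/to/uploads/*.jpg');   // placeholder
$total  = count($images);

ob_start();
foreach ($images as $i => $image) {
    // ... generate the thumbnails, insert into the database ...

    printf("Finished picture %d of %d<br>\n", $i + 1, $total);
    ob_flush();   // flush PHP's output buffer...
    flush();      // ...and ask the web server to send it to the client
}
?>

Depending on the web server's own buffering, the lines may not appear
immediately, but that is the general idea.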

Good luck.

Erwin Moller

Re: How to properly handle lengthy operations?


If this is a process that takes long enough to exceed the default
script execution limit, then it is probably way too long for an
end-user to sit and wait for, so simply increasing the timeout should
NOT be an option.  Yes, some AJAX could give the user some visual
feedback that the server didn't just choke on their request and that
something is actually happening, but I know that I, personally,
wouldn't want to sit and watch a page, waiting for some progress bar to
fill, if I didn't have to.

If this is the final step in the process (i.e. after the files are
uploaded, no more user input is required), then consider letting
another PHP script handle the long operation.  You have a couple of
options here.  You could have a script scan the system for new files to
process and run it periodically via cron, or you could use the system
[1] function to call another script immediately after the files are
successfully uploaded.  (There is a comment about halfway down that
page giving an example of how to send the command as a background
process, so it doesn't make your script wait.)  Both of
these methods have the benefit that the heavy processing time takes
place outside of the browser request, meaning that you can do your
processing while allowing the user to continue browsing the site.
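
As a rough sketch of the second option (the worker script name, the PHP
binary path, and the album id are all invented for the example):

<?php
// After the upload has finished: hand the heavy work to a separate
// script.  Redirecting output and appending & makes the command return
// right away, so the user's request isn't held up while the thumbnails
// are built.  Adjust the path to the php binary for your system.
$albumId = 42;  // whatever identifies the batch that was just uploaded
$cmd = sprintf('/usr/bin/php /path/to/process_thumbs.php %d > /dev/null 2>&1 &',
               $albumId);
system($cmd);
// ...then tell the user their photos will appear shortly...
?>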

If you need some kind of visual feedback to the user, you could have
the processing script update a field in a table giving the current
status, and then query that for the user.  Or, after the files are
uploaded, just give the user a message saying something like, "Your
files should appear in your photo album shortly."
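
A sketch of the status-table idea (the table, column names, and
connection details are made up; use whatever DB layer you already have):

<?php
$db = new mysqli('localhost', 'user', 'pass', 'photos');

// In the worker script, after each picture:
$done = 5; $total = 36; $jobId = 42;   // example values
$db->query(sprintf('UPDATE import_jobs SET done = %d, total = %d WHERE id = %d',
                   $done, $total, $jobId));

// In the page the user sees (refresh it, or poll it with a bit of AJAX):
$result = $db->query('SELECT done, total FROM import_jobs WHERE id = 42');
$row = $result->fetch_assoc();
echo "Processed {$row['done']} of {$row['total']} pictures.";
?>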

[1] - the PHP manual page for system()

Re: How to properly handle lengthy operations?

I totally agree with you, mootmail. I'd rather not override the script
timeout value, because this script is all but guaranteed to take longer
than that anyway.

This is indeed the final step in the importing process, so I really
like the idea of having another PHP script handle the long operation.
But how do I get *that* script to avoid timeouts? Is sending it to the
background sufficient? Won't it still be susceptible to the PHP timeout?

Additionally, how would I pass parameters to such a script?

Many thanks!
-- Jonah

Re: How to properly handle lengthy operations?


As far as I know, scripts executed in command-line mode do not have a
timeout value (I have some scripts that take 2+ hours to run and don't
time out).  I'm not sure whether starting a script as a background
process from a web script carries the default timeout with it; I've
never tried it that way.

Either way, you could call set_time_limit(0) just to be safe.  After
all, once it's in the background, it doesn't really matter how long it
takes.

As for passing parameters, I've never needed to, but I suspect a good
place to start would be the PHP manual's section on using PHP from the
command line.  A little ways down, it starts talking about how to pass
arguments to a script from the command line.  It looks like you can pass
arguments to the script and then access them via $argv in your code.
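
For example, in the worker script (I'm calling it process_thumbs.php to
match the earlier example; the name and the album-id argument are made up):

<?php
// process_thumbs.php: read command-line arguments via $argv.
// $argv[0] is the script's own name; real arguments start at index 1.
if ($argc < 2) {
    fwrite(STDERR, "Usage: php process_thumbs.php <album-id>\n");
    exit(1);
}
$albumId = (int) $argv[1];
set_time_limit(0);   // belt and braces; CLI runs normally have no limit
// ... look up the uploaded files for $albumId and build the thumbnails ...
?>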

If that doesn't work, and you decide to go the periodic cron route,
then you could always keep a 'queue' table holding all the parameters
that should be passed, and have your script read it row by row.
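
Something like this, run from cron every few minutes (the table, column
names, and connection details are invented for the example):

<?php
// Work through whatever is waiting in the queue table, one row at a time.
$db = new mysqli('localhost', 'user', 'pass', 'photos');
$result = $db->query('SELECT id, path, album_id FROM import_queue');

while ($row = $result->fetch_assoc()) {
    // ... create the thumbnails for $row['path'], insert the album data ...

    // Remove the row once it has been handled.
    $db->query('DELETE FROM import_queue WHERE id = ' . (int) $row['id']);
}
?>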
