Retrying handled errors

I just have a question about trapping and retrying errors, especially
file locking, database locks, or duplicate key errors.

Is there a way, after you trap an error, to retry the same line that
caused the error?

In many other languages you have the option of retrying certain errors.
In effect, it's like a return to the exact same line. You can then retry
a certain number of times and then produce an error if it keeps failing.

Re: Retrying handled errors

ImOk wrote:


All I can think of is to use a loop:

$retries = 0;
while ($result === null && $retries < 3) {
    // ...do stuff that sets $result on success...
    $retries++;
}

or you can kill the script on error and then reload the page.

meh.. there is a way.. somehow.


Re: Retrying handled errors

flamer wrote:

PHP 5 doesn't have that feature, though I'd love to have it for some
situations too.

I think C# has this feature, not sure though.

The best you could do is to nest a try/catch block within a loop and
use a control flag.

$retry = 0;
while ($retry < 5) {
    try { /* ... */ break; } catch (Exception $e) { $retry++; /* ... */ }
}

Re: Retrying handled errors


I've done something like that, although there is no direct language support
for it; you just have to build it yourself. Anyway, I've got a database
interface class which I use for all queries, and it has a built-in error
reporting mechanism which sends errors to my email address. Now, when the
error message contains the word "deadlock" I try to rerun the query, keeping
track of how many times the rerun has been attempted. Before another rerun
there's a random-length sleep period of 0.5 to 1.0 seconds, and then it's
retried. If it still fails after ten reruns, I get the error report, but in
most cases it runs successfully after 3 or 4. It's really handy. We always
get those pesky deadlocks once a day when a major task is performed during
which the server hits a performance peak, but thanks to the deadlock solver,
the queries eventually get run, even though it takes a little longer then.
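
Something like this minimal sketch of the idea, assuming the old mysql_*
functions; query_with_deadlock_retry() and report_error_by_email() are
made-up names, not part of any real API:

function query_with_deadlock_retry($sql, $maxReruns = 10) {
    for ($attempt = 0; $attempt <= $maxReruns; $attempt++) {
        $result = mysql_query($sql);
        if ($result !== false) {
            return $result;                  // query succeeded
        }
        $error = mysql_error();
        if (stripos($error, 'deadlock') === false) {
            break;                           // not a deadlock, don't rerun
        }
        // random-length sleep of 0.5 ... 1.0 seconds before the rerun
        usleep(mt_rand(500000, 1000000));
    }
    report_error_by_email($error, $sql);     // assumed email reporting hook
    return false;
}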

"ohjelmoija on organismi joka muuttaa kofeiinia koodiksi" -lpk | Gedoon-S @ IRCnet | rot13(xvzzb@bhgbyrzcv.arg)  

Re: Retrying handled errors

ImOk wrote:

If you use set_error_handler to collect the errors and debug_backtrace
to see what caused the error, it is possible to retry *some* errors.

I could only get it working if the error occurred inside a function. If
you check the error message given to the error handler then you may be
able to handle other errors; not sure if it can be done, though.

Anyhoo, the code below will try 3 times when an error occurs in any and
all functions.



function retryErrorHandler($errNo, $errMsg) {

    static $attempts = 1;
    $maxAttempts = 3;

    $backTrace = debug_backtrace();

    print "<br><br>Attempt $attempts<br>$errMsg<br>";

    if ( $attempts >= $maxAttempts ) {
        $attempts = 1;
    } else if ( isset($backTrace[1]) &&
                isset($backTrace[1]['function']) && isset($backTrace[1]['args']) ) {

        // $backTrace[0] is info about the call to retryErrorHandler;
        // we need info about the previous function that triggered the error,
        // so $backTrace[1] is used if a function name + its arguments are there

        $func = $backTrace[1]['function'];
        $args = $backTrace[1]['args'];

        $attempts++;

        print "Retry function $func <br>";

        // Hmm.. the error handler is only called the second time round if
        // the handler is reset
        set_error_handler('retryErrorHandler');
        call_user_func_array( $func, $args );

    } else {
        print "Can't retry error: $errMsg<br>";
        $attempts = 1;
    }
}


$old_error_handler = set_error_handler('retryErrorHandler');

// trigger a few errors...

// will not retry - not enough info to try include() errors again; the
// dodgy filename isn't in the backtrace details. Assume it's because
// include is a language construct, not a function.
include('is not a file.php');

// will retry three times
fopen( 'no file', 'r' );

// will retry three times
mysql_connect( 'mysql wont connect' );

// will retry user functions as well as internal ones

function testFunc() {
    $x = 3 / 0;   // division by zero raises a warning
}
testFunc();


Re: Retrying handled errors

This looks like a very good idea. I will have to study it.

But you only need retries for cases where there is a chance of
recovery. E.g. syntax errors or missing includes will never recover;
neither will divide by zero.

The main purpose of retries, as far as I'm concerned, has to do with
file and resource connections and locks.

Also, using an error handler for unique key checking is better. You
don't have to perform a SELECT first to see if the key already exists.
Just INSERT, see if you get an error, and retry with a different key,
as in the sketch below.
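
A minimal sketch of that insert-then-retry idea, using the mysql_*
functions from elsewhere in the thread; make_candidate_key() is a
made-up helper that produces the next key to try:

function insert_with_unique_key($table, $maxTries = 5) {
    for ($try = 0; $try < $maxTries; $try++) {
        $key = make_candidate_key();             // assumed key generator
        $sql = sprintf("INSERT INTO %s (id) VALUES ('%s')",
                       $table, mysql_real_escape_string($key));
        if (mysql_query($sql)) {
            return $key;                         // insert succeeded
        }
        if (mysql_errno() != 1062) {             // 1062 = ER_DUP_ENTRY in MySQL
            break;                               // some other error - give up
        }
        // duplicate key: loop round and retry with a different key
    }
    return false;
}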


Re: Retrying handled errors

ImOk wrote:

Syntax errors are fatal errors which can't be caught with a user
function (normally, anyway).


This could also be done using INSERT ... ON DUPLICATE KEY...
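
For illustration (the counters table and columns here are made up),
the MySQL syntax looks like this:

// MySQL-specific: if the unique key already exists, update the row instead
$sql = "INSERT INTO counters (name, hits) VALUES ('home', 1)
        ON DUPLICATE KEY UPDATE hits = hits + 1";
mysql_query($sql);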


Re: Retrying handled errors

True, the compiler in theory should catch them up front. But this is
PHP. Anything can happen.


Useful if you want to update a record if it exists; not useful if you
mean to insert a new record with a unique key. Also, is this standard SQL?


Re: Retrying handled errors

ImOk wrote:

The specific instance I was thinking of was reading about being able to
catch fatal errors by registering an error handler and a shutdown
function. The general idea (IIRC) being that in case of a fatal error,
the error message is in the output buffer, which you then parse in your
shutdown function and do what you want.
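
Roughly this kind of thing, as a sketch of the technique being described
(the function name and the exact string matched are assumptions):

// Buffer all output so a fatal error message ends up in the buffer
ob_start();

function checkForFatal() {
    $output = ob_get_contents();
    if (strpos($output, 'Fatal error') !== false) {
        // do what you want with it - log it, mail it, show a nicer page...
        error_log('Caught a fatal: ' . $output);
    }
    ob_end_flush();
}
register_shutdown_function('checkForFatal');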


That's because PHP is magical ;)


(It's in MySQL, for those who don't know.)

I don't know, but I can't find any evidence of it being standard; there
is also update...if exists, for which this is probably true as well. I
wouldn't be surprised if other databases had a similar clause, though.

Personally, when it comes to duplicates, I try to pass that all down to
the database as much as possible using some sort of sequences approach.
Just insert NULL and let it automagically make the unique value (not so
easy in MySQL when you need two auto increments, though - another
reason I like Postgres). A quick sketch of what I mean is below.
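
As a small illustration of that (the orders table is made up), using the
mysql_* style from earlier in the thread:

// Insert NULL into an AUTO_INCREMENT primary key; the database picks the value
mysql_query("INSERT INTO orders (id, customer) VALUES (NULL, 'Bob')");
$newId = mysql_insert_id();   // the unique value the database just generated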

