Signals and local $@ - Page 2

Re: it hurts when I press here

In the sense that it has three times as much 'voodoo coding' but
(hark!) nobody needs to look at the mess. Apart from that, these
opinion statements regarding 'proper error handling' would be more
convincing if they were expressed as opinion statements and came with
a reason, e.g., "I'm generally of the opinion that exceptions
shouldn't be used for error handling because ...".

I'm using eval and die extensively, both for aborting some sequence of
processing steps from nested subroutines and for actual exception
handling and this works very nicely. Some remarks about Try::Tiny:

| finally (&;$)
| [...]
| This allows you to locate cleanup code which cannot be
| done via local() e.g. closing a file handle.

This would be more aptly described as 'This construct is necessary to
work around deficiencies of a sixty-year-old concept of "automatic
memory management" which is limited to managing memory, while all
other resources which need to be allocated and freed have to be
managed with explicitly written code.' That's not a problem for Perl 5,
which uses a more modern approach to automatic resource management
with support for stack unwinding, deterministic finalization and
automatic management of file handles, *BUT* it will become a problem
with Perl 6, should that ever evolve beyond being an abandoned Haskell
project.
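
As an illustration of the deterministic finalization mentioned above
(a minimal sketch; the file name is invented for the example): a
lexical file handle is flushed and closed as soon as it goes out of
scope, with no explicitly written cleanup code.

```perl
use strict;
use warnings;

my $path = "finalization_demo_$$.tmp";   # throwaway file, name is illustrative

{
    open my $fh, '>', $path or die "open: $!";
    print $fh "written before scope exit\n";
    # no close(): the handle is flushed and closed deterministically
    # here, when $fh goes out of scope
}

# the data is already on disk, proving the handle above was closed
open my $in, '<', $path or die "reopen: $!";
my $line = <$in>;
close $in;
unlink $path or warn "unlink: $!";

print $line;
```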

| There are a number of issues with eval.

| Clobbering $@
| When you run an eval block and it succeeds, $@ will be cleared,
| potentially clobbering an error that is currently being caught.

That's an inherent limitation of the idea of using 'a global variable'
as 'the exception location' and part of the documented functionality of
eval. The solution is that someone who wants to use $@ for exception
handling has to save the value in some other location before starting
any more complicated unrelated computation.
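
The saving discipline described here, in its minimal form: copy $@
into a private variable immediately after the eval, before any
unrelated code gets a chance to run another eval and clear it.

```perl
use strict;
use warnings;

eval { die "original error\n" };
my $saved_error = $@;    # save immediately after the eval

# unrelated computation which also uses eval and succeeds,
# clearing the global $@ as documented
eval { my $unused = 2 + 2 };

# $@ is now "", but the saved copy survives
print "caught: $saved_error" if $saved_error;
```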

| Localizing $@ silently masks errors
| Inside an eval block, die behaves sort of like:
| sub die {
|            $@ = $_[0];
|            return_undef_from_eval();
| }
| This means that if you were polite and localized $@ you can't die in
| that scope, or your error will be discarded (printing "Something's
| wrong" instead).

'Localizing $@' does not 'silently mask errors', it localizes $@, that
is, it creates a dynamically scoped binding for a global
variable. Because of this, changes to $@ while this binding is in
scope will not affect 'the outside world' in any way. Since $@ is used
for exception propagation, this means that code which localizes $@
without dealing with this possibility 'might silently mask an error',
aka 'is buggy'.
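
A small sketch of that confinement (the sub name is invented for the
example): a helper which localizes $@ can run its own failing eval
without disturbing an error its caller is still holding in the global
$@.

```perl
use strict;
use warnings;

sub probe {
    local $@;                   # changes to $@ stay within this scope
    eval { die "inner error\n" };
    return $@;                  # report the result via the return value
}

eval { die "outer error\n" };   # the caller now has a pending error in $@
my $inner = probe();

# probe()'s eval did not clobber the caller's $@: the local binding
# was discarded when probe() returned
print "inner: $inner";
print "outer: $@";
```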

| $@ might not be a true value
| This code is wrong:
|         if ( $@ ) {
|                 ...
|         }
| because due to the previous caveats it
| may have been unset.

This example is incomplete: The code supposed to set $@ is
missing. Also, there were no general 'previous caveats': The first
situation roughly amounts to the following:

eval {
    # some code which fails, setting $@
};

eval {
    # some unrelated code which succeeds, clearing $@ again
};

if ($@) {
    # ...
}

plus the expectation that the value of $@ would reflect something
which happened during the first eval which it doesn't: This is a
coding error and needs to be avoided. Localizing $@ without dealing
with its 'special magic' is also a coding error.

| The classic failure mode is:
| [...]
| In this case since Object::DESTROY is not localizing $@ but still uses
| eval, it will set $@ to "".

That's the sole valid concern so far: Object destructors are executed
automatically during stack unwinding. Because of this, it is prudent
to write them such that they don't modify any kind of 'state
information' visible to unrelated parts of the program. That's the
downside of any 'convenience mechanism' which might lead to the
execution of subroutines in places where this isn't obvious when
looking at the code causing these invocations. 'Operator overloading'
suffers from the same problem. This is not related to eval/$@ in any
particular way.
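
The failure mode being conceded here can be made concrete (the class
name is invented; note that perl 5.14 changed eval to assign the
exception to $@ only after unwinding, so on modern perls the in-flight
error survives this particular destructor):

```perl
use strict;
use warnings;

package Noisy;
our $destroyed = 0;

sub new { return bless {}, shift }

sub DESTROY {
    $destroyed = 1;
    # a destructor which uses eval without localizing $@: on perls
    # before 5.14 this clears $@ while an exception is still being
    # propagated
    eval { 1 };
}

package main;

eval {
    my $obj = Noisy->new();
    die "real error\n";        # $obj's DESTROY runs during unwinding
};
printf "destructor ran: %d, \$\@: '%s'\n", $Noisy::destroyed, $@;
```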

NB: This text is an opinion statement itself and there might well be
valid counterarguments for anything contained in it.

Re: it hurts when I press here

On 5/25/2013 12:44 PM, Rainer Weikusat wrote:

Good objection. Admittedly, I was mostly parroting the consensus about  
Try::Tiny. Still, IMO most are better off with T::T than rolling their  
own eval block. It's easy to forget the subtleties of Perl's exception  
handling... or remain blissfully unaware of them in the first place. For  
the latter reason, I can't comment in any depth.

But, if 'sleep10' had been using T::T -- instead of mangling $@ as it  
apparently did and which is easy to do -- the code would have arguably  
been cleaner and chances of mayhem reduced. Even visually, I'd prefer:
try{} catch {}  rather than: local $@;  eval;

That's not a big edge but with eval, you also have to remember to  
localize $@ and to ensure the block returns true. Plus there can be  
concerns, as you cite, of a DESTROY block which zaps $@. IIUC, T::T  
handles these problems for you.
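
Spelled out by hand, the discipline in question looks roughly like
this (the helper name is hypothetical); this is the boilerplate a
try/catch wrapper packages up:

```perl
use strict;
use warnings;

# Hypothetical helper: run a block, return (success flag, error).
sub attempt {
    my ($code) = @_;
    local $@;                        # don't disturb a pending error outside
    my $ok = eval { $code->(); 1 };  # the '1' forces a true value on success
    return $ok ? (1, undef) : (0, $@);
}

my ($ok, $err) = attempt(sub { die "boom\n" });
print $ok ? "succeeded\n" : "failed: $err";
```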

Both approaches involve discipline of course. Even with T::T, a global  
$_ is in play if errors occur. But at least there's not the potential  
of collateral damage to the exception model itself because of $@.

Charles DeRykus

Re: it hurts when I press here

Consensus of whom? Obviously, all people who use Try::Tiny do so
because they're convinced that it is useful for them. But this doesn't
necessarily mean that it actually is. As someone wrote in a completely
different context: science is not a democracy, because 'truth' is not
'what most people believe in'.

The Try::Tiny documentation lists three 'concerns' about 'Perl exception
handling' (I consider two of them invalid for reasons stated
elsewhere) but it contains no less than seven 'caveats' supposed to
apply to using it (and some of the open bugs -- apparently, bugs in
Try::Tiny are usually neither fixed nor closed -- are about people who
still didn't understand how to use it). Memorizing more than twice as
many 'subtleties' about using Try::Tiny in order to avoid learning
about 'the subtleties of Perl exception handling' doesn't seem like a
sensible tradeoff to me.

Not to mention that most details of the Try::Tiny code are quoted (in
simplified form) in the documentation and the author himself
consistently refers to them as 'ugly hacks' (I agree with this
assessment :->).

And I prefer to use a facility for dealing with 'exceptional events'
(which change the 'normally linear' control flow of something) for
doing actual 'exception throwing and handling'. Otherwise, I'd use
some 'special return value' error signalling convention without the
overhead of the former. Mixing both in this way in order to work
around hypothetical bugs in other code (instead of fixing these) is IMHO
just totally bizarre: The point of having 'exceptions' in the first
place is that return values can be used for returning 'useful
information' and exceptions for 'exceptional events'.

I also don't quite understand why one would localize $@ and want to
propagate errors out of the scope of the local at the same time. The
easy way to accomplish that would be not localizing $@ to begin with.

Re: it hurts when I press here


I'd like to add that I've meanwhile found a reason for doing this: I
tend to do 'stuff' from DESTROY methods, e.g., using a class to
represent some kind of 'externally visible event' and 'sending' that
'event' (usually in form of a message sent via some socket) from the
class destructor. Because of this, code running from a destructor
might run into supposed-to-be-fatal runtime errors caused by something
which is external to perl, e.g., the kernel. This implies that $@
should be localized in such a destructor to avoid clobbering an
exception in the process of being thrown, and that any 'other
exception' caused by code running from the destructor should also
affect the program in case execution continues. It normally wouldn't,
because perl will eat any runtime error occurring during destructor
execution. My present idea for dealing with this looks like this:

    local $@;
    eval { $_[0]->XDESTROY() };
    x_push($@) if $@;

this being a 'DESTROY' method which classes using this facility can
import. It will then invoke the actual class destructor (somewhat
uncreatively named XDESTROY) and push 'an exception which happened
while doing that' onto an array maintained by the 'run loop'
module. Should processing again end up in the top-level event loop,
the most recently added exception in this array will be thrown and so
forth, until the program either terminates or the exception array is
again empty.
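
A sketch of how that could fit together (x_push and XDESTROY are the
names from the post; the array, the importable-DESTROY helper's name
and the rethrow helper are illustrative):

```perl
use strict;
use warnings;

# Array of exceptions recorded while destructors were running,
# maintained by the 'run loop' module in the description above.
my @pending_exceptions;

sub x_push {
    push @pending_exceptions, $_[0];
}

# The importable DESTROY: run the real destructor (XDESTROY) with $@
# localized, recording any error instead of letting perl eat it.
sub import_destroy {
    my ($self) = @_;
    local $@;
    eval { $self->XDESTROY() };
    x_push($@) if $@;
}

# Called from the top-level event loop: rethrow the most recently
# recorded exception, repeating on each pass until the array is empty.
sub rethrow_pending {
    die pop @pending_exceptions if @pending_exceptions;
}
```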
