DBI (with Oracle) 'out of memory' error

I am using perl 5.6 on FreeBSD (this is perl, v5.6.1 built for i386-freebsd), with DBI version 1.48.

I am running a select command which returns 355,000+ rows. While
reading it into a hash, I should have 280,000+ hash-keys when I am
done constructing the hash.

But the script is aborting with message: Out of memory!

What might be wrong? If I can keep the hash-size down (deal with a
key, complete all calculations associated with that key, then undef
it), will it help?

$$rh_result{'field01'} = 2 ;
# do calculations for this hash-key
undef $$rh_result{'field01'} ;   # I do not know the syntax to undef, but will find it out.

Please advise.

Re: DBI (with Oracle) 'out of memory' error

dn.perl@gmail.com wrote:
> $$rh_result{'field01'} = 2 ;

$rh_result->{ 'field01' } is more readable.

> undef $$rh_result{'field01'} ;   # I do not know the syntax to undef, but will find it out.

perldoc -f undef
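
(Note that for a hash entry, undef and delete do different things: undef
clears the value but leaves the key in place, while delete removes the
entry entirely so its memory can be reclaimed. A minimal sketch:)

my %h = ( field01 => 2, field02 => 3 );

undef $h{field01};             # value cleared, but the key still exists
print exists $h{field01};      # prints 1

delete $h{field02};            # entry removed entirely
print exists $h{field02};      # prints nothing (false)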

Please show us the relevant code.

In short, you should probably be doing:
Prepare SQL
Bind columns
Iterate over each row using while and fetch.
    Using the bound variables, do calculations, store results

If you're doing that and it still runs out of memory, then we really,
really need to see your code.
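
(For reference, a minimal sketch of that prepare / bind_columns / fetch
pattern with DBD::Oracle; the connection details, table and column names
here are invented for illustration:)

use strict;
use warnings;
use DBI;

# Placeholder connection details.
my $dbh = DBI->connect( 'dbi:Oracle:mydb', 'user', 'pass',
                        { RaiseError => 1 } );

# Hypothetical table and columns, for illustration only.
my $sth = $dbh->prepare(
    'SELECT from_queue, entry_id FROM entries ORDER BY from_queue' );
$sth->execute;

# Bind Perl variables to the result columns; each fetch() then
# refreshes them in place, so only one row is in memory at a time.
my ( $queue, $entry );
$sth->bind_columns( \$queue, \$entry );

while ( $sth->fetch ) {
    # do calculations with $queue and $entry, store results
}

$dbh->disconnect;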

Re: DBI (with Oracle) 'out of memory' error

> I am using perl 5.6 on FreeBSD.

Perl 5.6 is really old. 5.10 is the current release.

> While reading it into a hash, I should have 280,000+ hash-keys when I am
> done constructing the hash.

Since your perl is very old, I assume your computer is rather old, too.

Hashes need a lot of memory. With 280k hash keys, I estimate that your
hash needs at least 60 MB. If your hash has multiple levels, it can be a
lot more.
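
(If you want to measure rather than estimate, the CPAN module Devel::Size
can report a structure's actual footprint; a rough sketch:)

use Devel::Size qw(total_size);

# Roughly the shape described: 280,000 keys, each with a sub-level.
my %h = map { $_ => { value => $_ } } 1 .. 280_000;
printf "hash uses about %.0f MB\n", total_size(\%h) / 2**20;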

Also, you don't say how you run the select command. If you use one of
the selectall_* methods you'll get a nested data structure with 355,000+
times the number of columns elements - probably a few million elements in
total, which would use another few hundred MB ...
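
(That difference is the crux: selectall_arrayref materializes every row
at once, while fetching row by row holds only the current row. The
statement text below is just a placeholder:)

# Builds one nested structure holding all 355,000+ rows at once:
my $all = $dbh->selectall_arrayref('SELECT ... FROM ...');

# Streams instead - only the current row is held in memory:
my $sth = $dbh->prepare('SELECT ... FROM ...');
$sth->execute;
while ( my @row = $sth->fetchrow_array ) {
    # process @row here
}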

How much memory does your script use just before it runs out of memory?
Is it near the available virtual memory? Do you have any resource limits set?
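
(One crude way to check from inside the script is to ask ps about the
current process; a sketch, assuming a stock FreeBSD ps:)

# Resident set size and virtual size of this process, in KB.
my ( $rss, $vsz ) = split ' ', qx(ps -o rss=,vsz= -p $$);
warn "rss=${rss}KB vsz=${vsz}KB\n";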

> If I can keep the hash-size down (deal with a key, complete all
> calculations associated with that key, then undef it), will it help?

Yes, that will help. But if you can process your data one key at a time,
why store them in a hash in the first place?


Re: DBI (with Oracle) 'out of memory' error

> How much memory does your script use just before it runs out of memory?

I do not know how much memory my script was using before crashing, and
do not even know how to find that out. But your answer helped, because I
was using a hash with 300,000 keys and each key had sub-levels:
$$rh_result{...}{...} = 4 ;
 ... to ...
$$rh_result{...}{...} = 10 ;
Now I am deleting each key once I am done with it, and the script is not
crashing.

> But if you can process your data one key at a time, why store them in a
> hash in the first place?

Because each from_queue may have more than one entry assigned to it, and
I can do calculations for a queue only after all of its entries have
been read. And I just prefer to store the data in a hash:

my $last_queue = "" ;
while ( my @row = $sth->fetchrow_array ) {
    my $current_queue = $row[0] ;
    # populate $$rh_result{$current_queue} from @row
    if ( $current_queue ne $last_queue and $last_queue ne "" ) {
        # do-stuff about $$rh_result{$last_queue}
        delete $$rh_result{$last_queue} ;   # NOW added: drop the finished queue
    }
    $last_queue = $current_queue ;
}
# do-stuff about $$rh_result{$last_queue} if $last_queue ne "" ;

The script had been working without a hitch for 2-3 months before the
recent crash. But such problems help in better understanding how the
whole thing works.

Thanks for the responses.
