speeding up reading opendir...



I have a system which reads in an entire directory tree - all files. As
of now, there are 1312 folders to read.

The 1st time it takes 53-60 seconds to read. The data is somehow cached,
because the 2nd time it takes only 2-3 seconds.

Is there a way to "cache" the data beforehand? Like "preparing" the
cache in advance?
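For reference, the kind of recursive read being described might look roughly like this (the traversal logic is a minimal sketch; the actual code wasn't posted):

```php
<?php
// Recursively collect every file under $dir using opendir()/readdir().
function readTree($dir) {
    $files = [];
    $dh = opendir($dir);
    if ($dh === false) {
        return $files;              // unreadable directory: skip it
    }
    while (($entry = readdir($dh)) !== false) {
        if ($entry === '.' || $entry === '..') {
            continue;
        }
        $path = $dir . DIRECTORY_SEPARATOR . $entry;
        if (is_dir($path)) {
            $files = array_merge($files, readTree($path));
        } else {
            $files[] = $path;
        }
    }
    closedir($dh);
    return $files;
}

// Example: read everything under the current directory.
$files = readTree('.');
echo count($files), " files\n";
```

Each `readdir()`/`is_dir()` call hits the filesystem, which is why a cold run over 1312 folders is slow and a cached run is fast.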


Re: speeding up reading opendir...

jodleren wrote:

Before what? Before you read them?

Bit of a logical impasse there ;-)

If it's *nix, you might execute a cron script every few minutes and read
the whole directory structure, which will bring it into the disk file cache.

Of course under heavy I/O load that caching may get flushed again..
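A cache-warming script along those lines could be as simple as the following (the script name and cron schedule are just examples):

```php
<?php
// warm_cache.php -- walk the tree so its directory metadata lands in the
// OS file cache. Run it from cron, e.g.:
//     */5 * * * * php /path/to/warm_cache.php /path/to/tree
// Merely iterating stats every entry, which is what pulls it into cache.
$root = $argv[1] ?? '.';    // directory to warm; defaults to cwd

$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator($root, FilesystemIterator::SKIP_DOTS)
);

$count = 0;
foreach ($it as $info) {
    $count++;               // each iteration step stats one entry
}

fwrite(STDERR, "warmed $count entries under $root\n");
```

The subsequent "real" read then hits warm cache, which is exactly the 2-3 second behaviour the original poster saw on the second run.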


Re: speeding up reading opendir...


You might simply direct the output of the dir
command into a file (or string or array depending
on which exec type of command you use) and then
parse that yourself.  It should be FAR faster.
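In PHP terms, that approach might look like this (the command strings are illustrative; `dir /s /b` emits one full path per line on Windows, and `find` is a rough *nix equivalent):

```php
<?php
// Let a single external process produce the whole listing, then parse the
// result in PHP, instead of making one filesystem call per entry.
$cmd = stripos(PHP_OS, 'WIN') === 0
    ? 'dir /s /b .'     // Windows: bare recursive listing of cwd
    : 'find . -print';  // *nix equivalent

exec($cmd, $lines, $status);   // $lines gets one array element per output line

if ($status === 0) {
    echo count($lines), " entries\n";
}
```

One process doing the whole walk avoids the per-call overhead of thousands of `opendir()`/`readdir()` round trips from PHP.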

For reference, it took my Win XP system about 315 seconds to do:
C:\>dir /s > delme.dir
with the entire C: drive - about 140,000 files totaling about
44 gigabytes in 32,000 directories. The resultant file was about
10 megabytes.

Csaba Gabor from Vienna

Re: speeding up reading opendir...


If you're running this on Windows, you're probably seeing the Windows
cache coming into play, which is why it runs faster the second time.

As previously suggested, try using exec() to output a directory to a
file, and then parse that instead.
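A *nix sketch of that file-based variant (the listing path and use of `find` are assumptions; the write step could equally live in the cron job suggested earlier):

```php
<?php
// Write the listing to a file once, then parse the cached file instead of
// re-walking the tree on every request.
$listing = sys_get_temp_dir() . '/tree.listing';   // assumed cache location

// Step 1: produce the listing (could run periodically from cron).
exec('find . -print > ' . escapeshellarg($listing));

// Step 2: parse it -- one path per array element, newlines stripped.
$paths = file($listing, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

echo count($paths), " cached entries\n";
```

Reading one flat file is cheap regardless of whether the directory metadata is still in the OS cache.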

