I use Perl's readdir to get a list of files from a directory that contains more than 250,000 files. Reading the directory takes a long time (more than 4 minutes) and uses more than 80 MB of RAM. Since this is meant to be a repetitive job running every 5 minutes, that delay is not acceptable.
Additional information: a separate job, which runs once a day, populates the directory. This Perl script is only responsible for processing the files. A file limit is set for each run of the script, currently 1000. So the script should run every 5 minutes and process (if applicable) up to 1000 files. The limit exists to avoid overwhelming downstream processing, since the script loads the data into a database that triggers a complex workflow.
Is there another way to get file names from a directory, ideally limited to 1000 (set by a variable), that would significantly speed up this script?
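One sketch of the idea: calling readdir in scalar context returns one entry per call, so the script can stop after it has collected enough names instead of slurping the entire 250,000-entry listing into a Perl array. The subroutine name, directory path, and limit below are placeholders, not anything from the original setup; whether this also cuts the 4-minute wall time depends on the filesystem, but it should avoid holding the full list in memory.

```perl
use strict;
use warnings;

# read_some_files: return up to $limit entry names from $dir without
# building the full directory listing in Perl memory.
# (Name, path, and limit are illustrative assumptions.)
sub read_some_files {
    my ($dir, $limit) = @_;
    opendir(my $dh, $dir) or die "Cannot open $dir: $!";
    my @files;
    # In scalar context, readdir returns one entry per call, so we can
    # stop as soon as we have enough rather than reading everything.
    while (defined(my $entry = readdir $dh)) {
        next if $entry eq '.' || $entry eq '..';
        push @files, $entry;
        last if @files >= $limit;
    }
    closedir $dh;
    return @files;
}

# Example usage (the path is a placeholder):
# my @batch = read_some_files('/path/to/spool', 1000);
```

Note that the entries come back in directory order, not sorted, which matters if the 1000 files per run need to be processed oldest-first.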