How to set a limit of files when processing a directory

Started by Mauricio Villablanca, March 31, 2011, 08:38:11 PM

Previous topic - Next topic

Mauricio Villablanca

Is there a parameter to set a maximum number of files to be read when processing a folder? Processing thousands of images in one call is CPU-intensive, and I wonder how I can go about processing a maximum of, say, 100 images.

I wrote a PHP script that reads the JSON generated by exiftool (it reads a folder non-recursively), parses it, adds the metadata to a database, then moves the files to a different location to prevent exiftool from reading them again. That still leaves me with the problem of handling a huge number of images if a user places them in the folder.
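For illustration, here is roughly what that pipeline looks like (a minimal sketch in Python rather than the actual PHP script; the folder paths, database file, and table name are all hypothetical):

import json, shutil, sqlite3, subprocess
from pathlib import Path

INCOMING = Path("c:/myFolder")            # folder exiftool reads (hypothetical)
PROCESSED = Path("c:/processedFolder")    # files are moved here (hypothetical)

# Run exiftool non-recursively and capture its JSON output
out = subprocess.run(
    ["exiftool", "-j", "-ext", "JPG", str(INCOMING)],
    capture_output=True, text=True,
).stdout
records = json.loads(out) if out.strip() else []

db = sqlite3.connect("metadata.db")       # hypothetical database
db.execute("CREATE TABLE IF NOT EXISTS images (source TEXT, meta TEXT)")
for rec in records:
    src = rec["SourceFile"]               # exiftool always includes SourceFile
    db.execute("INSERT INTO images VALUES (?, ?)", (src, json.dumps(rec)))
    shutil.move(src, PROCESSED / Path(src).name)  # keep it out of the next run
db.commit()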

Any tip, Phil?

Phil Harvey

#1
I think the best idea would be to modify your script to process 100 files at a time.  However, there is a VERY sneaky way that you could force exiftool to exit after 100 files using the -if option:

exiftool -if '$$::myCount++ < 100 or exit' ...

(of course, use double quotes instead of single quotes if you are in Windows)
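So on Windows the same command would read:

exiftool -if "$$::myCount++ < 100 or exit" ...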

The only problem with this (aside from the extreme sneakiness) is that the terminating array bracket won't be printed if you are using the JSON output, but you could fix this with:

exiftool -json -if '$$::myCount++ < 100 or print("]\n"), exit' ...

(here you will have fun with the quoting if you are in Windows)
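One untested way around the Windows quoting trouble is to avoid the inner double quotes entirely by writing the string with Perl's qq() operator, which still interpolates the \n:

exiftool -json -if "$$::myCount++ < 100 or print(qq(]\n)), exit" ...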

- Phil

Mauricio Villablanca

#2
The problem with modifying my script is that I'm using exiftool to retrieve whatever images matching *e7*.jpg the directory contains:

exiftool.exe -j -d %Y-%m-%d_%H:%M:%S -ext JPG -c %.8f -if "$filename=~/e7/i" "c:\myFolder"

So if I force my script to read X images, exiftool might still return many more than that, making the script inefficient.
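Presumably the filename test and the counter could even share one -if, something like this (untested; the parentheses keep the exit tied to the counter rather than to the filename match, so non-matching files are simply skipped):

exiftool.exe -j -d %Y-%m-%d_%H:%M:%S -ext JPG -c %.8f -if "$filename=~/e7/i and ($$::myCount++ < 100 or exit)" "c:\myFolder"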

The workaround shows promise. I'll give it a shot.

UPDATE: it worked fine.

Thank you again.