User defined tag reading external file

Started by StarGeek, March 02, 2015, 05:33:46 PM


StarGeek

I'm creating a user defined tag that will read some data in from an external file and I'm looking for some advice on handling some things.

First of all, most Perl examples I see are usually something like open(FILE, "$File") || die "Can't open File";, but it seems to me that a die command would be very bad here, so I've changed it to open(FILE, "$File") || return undef;.  Does this sound like a proper way to handle such an error, or is there a better way to go about it?
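For context, here's a stripped-down sketch of the kind of tag I mean, with the return undef in place (the tag name, path, and overall layout are placeholders patterned after ExifTool's sample config, not my actual code):

%Image::ExifTool::UserDefined = (
    'Image::ExifTool::Composite' => {
        MyExternalData => {
            Require => 'FileName',
            ValueConv => sub {
                my $file = '/path/to/external.txt';   # placeholder path
                # fail quietly rather than aborting the whole run
                open(my $fh, '<', $file) or return undef;
                my $line = <$fh>;    # use the first line as the value
                close $fh;
                chomp $line if defined $line;
                return $line;
            },
        },
    },
);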

Currently I'm reading the external file in line by line rather than reading the whole thing at once, in case the external file happens to be large.  The largest file I have that might get read is 531 KB.  Is there any possibility of problems if I switched to reading the whole file in at once?
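For reference, the switch I'm considering would be from the line-by-line loop to a slurp, something like this (variable names are just for illustration):

open(my $fh, '<', $file) or return undef;
local $/;             # undef the record separator...
my $data = <$fh>;     # ...so this single read grabs the whole file
close $fh;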

I'm assuming that this external file would be opened every time ExifTool processes a file, so that if I process a directory with 1,000 files, the external file would be opened and read 1,000 times.  Would that be correct?  If so, is there a workaround?



"It didn't work" isn't helpful. What was the exact command used and the output.
Read FAQ #3 and use that cmd
Please use the Code button for exiftool output

Please include your OS/Exiftool version/filetype

Phil Harvey

Hi StarGeek,

In a ValueConv expression, returning undef is the preferred way to exit with an error (but be sure to use a unique tag name, otherwise this will suppress an existing tag unless -a is used).  That said, ExifTool will catch a die if the ValueConv is a string expression (but not if it is a reference to a subroutine).
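To illustrate the distinction with two hypothetical one-liners (not from an actual config): the string form below is eval'ed by ExifTool, so its die is trapped, while the subroutine form is called directly, so it should return undef instead:

# die here is caught (string expressions are eval'ed by ExifTool):
ValueConv => '$val =~ /(\d+)/ ? $1 : die "no number\n"',

# die here would NOT be caught -- return undef for a soft failure:
ValueConv => sub {
    my ($val, $self) = @_;
    return undef unless defined $val;   # preferred error exit
    return $val;
},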

On Windows, there is a limit to the amount of memory you can allocate, and I think you run into trouble around 200-400 MB.  On other platforms this isn't an issue.  If your memory requirements are below that, you could keep what you need in a static variable and read the file only once.  Something like this:

unless ($Image::ExifTool::myData) {
    # open in binary mode, returning undef on failure instead of dying
    open(FILE, "myfile.txt") and binmode(FILE) or return undef;
    # read up to 100 MB into a static variable (this runs only once)
    my $num = read(FILE, $Image::ExifTool::myData, 100000000);
    close(FILE);
    warn "oops, file was too big\n" if $num == 100000000;
}
...


But probably some more thought should be put into the names to avoid conflicts between FILE, myData, and ExifTool's own variables.
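For example, one way to reduce that risk (the names here are made up) is a lexical filehandle plus a variable parked in the UserDefined namespace:

unless ($Image::ExifTool::UserDefined::myData) {
    open(my $fh, '<', 'myfile.txt') or return undef;
    binmode($fh);
    local $/;    # slurp the whole file in one read
    $Image::ExifTool::UserDefined::myData = <$fh>;
    close $fh;   # a lexical $fh can't clash with other FILE handles
}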

- Phil

StarGeek

Quote from: Phil Harvey on March 02, 2015, 07:47:15 PM
On Windows, there is a limit to the amount of memory you can allocate, and I think you run into trouble around 200-400 MB.

This is what I was faintly remembering.  I can't see needing that much memory.  What I'm reading in is a .picasa.ini file, and the aforementioned 531 KB file is in a directory with about 5,000 files in it.  So there would need to be about a million files in a single directory to get near the hundred-megabyte level.
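Since there's exactly one .picasa.ini per directory, I'll probably cache it by directory, along these lines (a rough, untested sketch; the hash and sub names are placeholders):

my %picasaCache;   # directory name => raw .picasa.ini contents

sub load_picasa_ini {
    my $dir = shift;
    unless (exists $picasaCache{$dir}) {
        $picasaCache{$dir} = undef;   # cache misses too, so each dir is tried once
        if (open(my $fh, '<', "$dir/.picasa.ini")) {
            local $/;                 # slurp the whole file
            $picasaCache{$dir} = <$fh>;
            close $fh;
        }
    }
    return $picasaCache{$dir};
}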
"It didn't work" isn't helpful. What was the exact command used and the output.
Read FAQ #3 and use that cmd
Please use the Code button for exiftool output

Please include your OS/Exiftool version/filetype