I'm still at the stage of weaning myself off dealing with image files using the standard Linux toolkit and I'm changing my method for downloading from my camera to take advantage of the features of exiftool. The CF card has 32 GB capacity and I don't delete images until it's more than half full. This command works fine for my current camera but does produce error messages about files that are already on my computer.
exiftool -r -o . '-Directory</home/anthony/Pictures/t/${model;tr/ /_/;s/__+/_/g}/$CreateDate' -d %Y/%m/%d DCIM
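The ${model;...} part runs a little Perl over the Model value before it goes into the path. Roughly this, with a made-up model string that has a doubled space so both steps do something:

#!/usr/bin/env perl
# Roughly what the ${model;tr/ /_/;s/__+/_/g} expression does.
use strict;
use warnings;

my $model = 'Canon EOS  5D';      # made-up value with a doubled space
(my $clean = $model) =~ tr/ /_/;  # spaces -> underscores: Canon_EOS__5D
$clean =~ s/__+/_/g;              # collapse runs: Canon_EOS_5D
print "$clean\n";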
Obviously, this isn't a major problem because of the directory structure (unless I download twice in a day), but I was wondering how to work around it. This is not a minor error, so exiftool's -m (ignore minor errors) option won't suppress it, but it's an error that I don't care about in this case. The solution I came up with is a step backwards in terms of portability, but it might be generally useful.
(exiftool -r -o . '-Directory</home/anthony/Pictures/t/${model;tr/ /_/;s/__+/_/g}/$CreateDate' -d %Y/%m/%d DCIM) \
2>&1 | grep -v "already exists"
It's ugly but it works. I suppose that, if I wanted it to be portable, I'd replace grep with a tiny Perl script.
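Something along these lines would do the same filtering without grep (the script name is my own):

#!/usr/bin/env perl
# no-exists.pl: copy stdin to stdout, dropping the harmless
# "already exists" messages.
use strict;
use warnings;

while (my $line = <STDIN>) {
    print $line unless $line =~ /already exists/;
}

Or, even tinier, as a one-liner in place of the grep stage: 2>&1 | perl -ne 'print unless /already exists/'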
What you have done is fine except for the possibility of two original images having the same CreateDate (you'll miss one of them). I see a similar error in my workflow, but I use ShutterCount so the file names are unique, and I just ignore the errors.
- Phil
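For anyone wanting to try the naming approach Phil describes, a sketch (assuming a camera that actually reports ShutterCount; check one of your files with exiftool -ShutterCount first, and substitute your own directory for the path here):

exiftool -r '-FileName<${shuttercount}.%e' /home/anthony/Pictures/t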