Using Folder Names for DateTimeOriginal

Started by GeoFreesoul, September 02, 2019, 10:34:05 AM


GeoFreesoul

Hi:

I've managed to cobble together various leads from Stack Overflow and this forum to get my Google Takeout folder ready for importing into Photos (a task I would not recommend to anyone during Labor Day weekend).

My first import (without any exiftool), just dragging the Google Takeout folder into Photos, resulted in thousands of photos from the past showing up as the day I imported them, among other problems.  Obviously this was not acceptable, so I found exiftool, looked at the sidecar .json files, and realized I could snatch the DateTimeOriginal bit and reimport.

Here's my cobbled-together code from another website to drill through the thousands of directories and build a list of files with no original dates.

>exiftool -p '$directory/$filename' -r -if '(not $datetimeoriginal) and $filetype eq "JPEG"' . > nodates.txt
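
If it helps to sanity-check that list before going further, a couple of plain shell commands (nothing exiftool-specific, just a quick look at the output file) can confirm it looks reasonable:

# Rough check of the generated list: how many files lack DateTimeOriginal,
# and what a few of the paths look like.
wc -l nodates.txt
head -n 5 nodates.txt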

Next we cycle through the list and fix said offending images.  It applies the "photo taken timestamp" to the original date:

>while read filename; do exiftool -tagsfromfile ${filename}.json "-DateTimeOriginal<PhotoTakenTimeTimestamp" -d %s  ${filename}; done < nodates.txt

I'm sure both those commands could be combined into one, forgoing creating a list, but that's beyond my expertise.  Regardless, half of the photos returned errors because their filenames had spaces.  I then used the detox command to fix that and repeated my steps.  The new files imported to their correct times - voilà!
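
In hindsight, the space problem can probably also be avoided without renaming anything, by quoting the variable and using read -r; a rough sketch of what that safer loop might look like (same logic as the loop above, just quoted; untested on Takeout data):

# Space-safe variant of the earlier loop (a sketch).
# IFS= and -r keep leading whitespace and backslashes intact, and the quotes
# stop bash from word-splitting paths that contain spaces.
while IFS= read -r filename; do
  exiftool -tagsfromfile "${filename}.json" "-DateTimeOriginal<PhotoTakenTimeTimestamp" -d %s "${filename}"
done < nodates.txt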

Now, it seems I have reached an impasse with renegade files that have no sidecar .json.  My investigation revealed these files live inside a folder named after the date(s) of the file, so imageexample.jpg, for instance, is living inside the 2002-03-01 folder.  To make matters worse, some folder names are like 2002-02-01 #4, or some version of the original date with another set of numbers tacked on.

My final mission is to use the folder names to create the DateTimeOriginal metadata for these renegade files, ignoring any numbers past the date.

I perused the forums, and this post https://exiftool.org/forum/index.php/topic,9707.msg50435.html offered some clues.  However, it probably won't work for folder names that have extra bits after the date.  Any help appreciated on doing this efficiently or putting it all together in one beautiful bash script ;)

Hayo Baan

For setting the DateTimeOriginal from the directory names, try this:

exiftool "-DateTimeOriginal<${directory;$_ = /^(\d{4})[-:]?(\d{2})[-:]?(\d{2}).*/ ? qq($1:$2:$3) : undef} 00:00:00" -ext jpg -r DIR
Hayo Baan – Photography
Web: www.hayobaan.nl

StarGeek

Quote from: GeoFreesoul on September 02, 2019, 10:34:05 AM
Next we cycle through the list and fix said offending images.  It applies the "photo taken timestamp" to the original date:

>while read filename; do exiftool -tagsfromfile ${filename}.json "-DateTimeOriginal<PhotoTakenTimeTimestamp" -d %s  ${filename}; done < nodates.txt

This is Common Mistake #3.  Exiftool's biggest performance hit is its startup time, and looping through each and every file will significantly increase the time the command takes to run, especially since you already created a text file with a list of all the files.  Try this instead:
exiftool -ext jpg -TagsFromFile %d%f.json "-DateTimeOriginal<PhotoTakenTimeTimestamp" -d %s -@ nodates.txt

The -@ (Argfile) option will take your "nodates.txt" file as a list of files to process.  The %d%f.json will take the directory and base filename of the file being processed from that list, add the .json extension, and use that as the source file to copy from.  You'll find that this is significantly faster than using a loop.  Also, using the -ext (extension) option to process only JPEGs is better than doing a $filetype comparison, as the latter still takes time to process the json files, doubling the time spent processing files.
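
As a side note, if you want to confirm that each file in the list actually has a matching sidecar before copying tags, something like this rough sketch would flag the strays (it checks both naming patterns mentioned in this thread, file.json and file.jpg.json):

# Sketch: report entries in nodates.txt with no sidecar .json next to them.
while IFS= read -r f; do
  [ -e "${f%.*}.json" ] || [ -e "${f}.json" ] || echo "missing sidecar: $f"
done < nodates.txt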

Another variation on Hayo's command:
exiftool -if "not $DateTimeOriginal" -ext jpg "-DateTimeOriginal<${directory;$_ = /(\d{4}-\d\d-\d\d)/ ? $1 : undef} 00:00:00" <DIR>

I added an -if check to avoid overwriting files that had already gotten data from previous processing with the json files.
"It didn't work" isn't helpful. What was the exact command used and the output.
Read FAQ #3 and use that cmd
Please use the Code button for exiftool output

Please include your OS/Exiftool version/filetype

GeoFreesoul

Quote from: StarGeek on September 02, 2019, 12:50:03 PM
Quote from: GeoFreesoul on September 02, 2019, 10:34:05 AM
Next we cycle through the list and fix said offending images.  It applies the "photo taken timestamp" to the original date:

>while read filename; do exiftool -tagsfromfile ${filename}.json "-DateTimeOriginal<PhotoTakenTimeTimestamp" -d %s  ${filename}; done < nodates.txt

This is Common Mistake #3.  Exiftool's biggest performance hit is its startup time, and looping through each and every file will significantly increase the time the command takes to run, especially since you already created a text file with a list of all the files.  Try this instead:
exiftool -ext jpg -TagsFromFile %d%f.json "-DateTimeOriginal<PhotoTakenTimeTimestamp" -d %s -@ nodates.txt

The -@ (Argfile) option will take your "nodates.txt" file as a list of files to process.  The %d%f.json will take the directory and base filename of the file being processed from that list, add the .json extension, and use that as the source file to copy from.  You'll find that this is significantly faster than using a loop.  Also, using the -ext (extension) option to process only JPEGs is better than doing a $filetype comparison, as the latter still takes time to process the json files, doubling the time spent processing files.

Thanks, that significantly decreased the time: what was taking around 2 hours now finishes in under one minute.  Common mistake noted!

Quote from: StarGeek on September 02, 2019, 12:50:03 PM
Another variation on Hayo's command:
exiftool -if "not $DateTimeOriginal" -ext jpg "-DateTimeOriginal<${directory;$_ = /(\d{4}-\d\d-\d\d)/ ? $1 : undef} 00:00:00" <DIR>

I added an -if check to avoid overwriting files that had already gotten data from previous processing with the json files.

I'm using Terminal on OS X and got this error when running that command (and Hayo Baan's original as well):
-bash: -DateTimeOriginal<${directory;$_ = /(\d{4}-\d\d-\d\d)/ ? $1 : undef} 00:00:00: bad substitution

GeoFreesoul

Quote from: Hayo Baan on September 02, 2019, 12:18:29 PM
For setting the DateTimeOriginal from the directory names, try this:

exiftool "-DateTimeOriginal<${directory;$_ = /^(\d{4})[-:]?(\d{2})[-:]?(\d{2}).*/ ? qq($1:$2:$3) : undef} 00:00:00" -ext jpg -r DIR

Thanks, but I received the error below.  Could this be a result of using bash in the OS X Terminal instead of Linux?
-bash: -DateTimeOriginal<${directory;$_ = /(\d{4}-\d\d-\d\d)/ ? $1 : undef} 00:00:00: bad substitution

StarGeek

Sorry, I wasn't paying close enough attention to realize you were using bash.  Swap the single and double quotes to keep bash from trying to interpret $Directory as a bash variable.
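
In other words, roughly this (my variation from above with the quotes swapped for bash; DIR is your target directory, and you can add -r to recurse as in Hayo's original command):

exiftool -if 'not $DateTimeOriginal' -ext jpg '-DateTimeOriginal<${directory;$_ = /(\d{4}-\d\d-\d\d)/ ? $1 : undef} 00:00:00' DIR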
"It didn't work" isn't helpful. What was the exact command used and the output.
Read FAQ #3 and use that cmd
Please use the Code button for exiftool output

Please include your OS/Exiftool version/filetype