Batch GeoLocate From a Web Source Using ExifTool

Started by Citadel, July 29, 2015, 05:56:13 PM


Citadel

Hi

Newb here. Is it possible to batch-search a page for all JPGs, then run a script that finds the ones with geotags and dumps that data into a map, a CSV, or an HTML list? I understand I'm asking for three things there.

Any tools or a place to start this for myself would be really helpful.

Thanks
B

Phil Harvey

ExifTool will search a directory.  If you want to search images on a web page on the internet, you will need a front end to load the images and pass them to ExifTool.  ExifTool can output whatever you want from the metadata, based on any condition that you want.
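
For example, something along these lines (the tag selection and output file name are just illustrative) would list the coordinates of every image in a folder that has a GPS position and write them to a CSV:

exiftool -r -if "$gpslatitude" -gpslatitude -gpslongitude -csv DIR > out.csv

Swapping -csv for -h would give an HTML table instead.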

...where DIR is the name of a directory/folder containing the images.  On Mac/Linux/PowerShell, use single quotes (') instead of double quotes (") around arguments containing a dollar sign ($).

- Phil

StarGeek

At the very simplest, you can do something like this to extract info from an image on the web:
wget -O- URL | exiftool -TAG -
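
For instance, to pull just the GPS coordinates from a single image URL (the URL here is only a placeholder):

wget -qO- "https://example.com/photo.jpg" | exiftool -gpslatitude -gpslongitude -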

To go a step further, you could do something like what's shown in this StackExchange post: get a list of all the image links on a page and then pass each item on that list to the above command (a rough sketch of that is below).

Most of the commands mentioned in that post aren't standard on Windows, so if that's what you use, you'd have to download them from somewhere.
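
As a rough sketch on a Unix-like shell (the page URL is a placeholder, and the grep pattern only catches absolute links ending in .jpg/.jpeg), that pipeline could look like:

# grab the page, pull out absolute JPG links, de-duplicate, then run each
# one through the wget | exiftool command above
wget -qO- "https://example.com/gallery.html" \
  | grep -oiE 'https?://[^"]+\.jpe?g' \
  | sort -u \
  | while read -r img; do
      wget -qO- "$img" | exiftool -gpslatitude -gpslongitude -
    done

Relative links won't match that pattern, so you'd have to prepend the site's base URL for those.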