Node.js -- reading from a stream

Started by benw, September 28, 2015, 07:59:04 PM


benw

I'm experimenting with using exiftool in conjunction with Nathan Peck's Node.js wrapper, and AWS Lambda.

The basic implementation I have works great, but I am encountering difficulties with larger files. Some of this has to do with Lambda's built-in limitations, but I think I could work around those if I could pass the S3 data into exiftool as a stream instead of needing the whole file to be there. However, I think I've read that this is not possible, or at least not easy: "(because ExifTool listens on the same input stream for processing commands and a terminating -execute sequence, it can't also listen for image byte[] data)."
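(For context, the quoted limitation refers to exiftool's -stay_open mode, where each command is written to stdin one argument per line and terminated by -execute, so stdin is already spoken for. A minimal sketch of that command framing; `photo.jpg` is a placeholder file name:)

```javascript
// Build one command block for exiftool's -stay_open protocol: arguments
// one per line, terminated by -execute. (-stay_open, -@ and -execute are
// real exiftool options; photo.jpg is a placeholder.)
function commandBlock(args) {
  return args.join('\n') + '\n-execute\n';
}

// This string would be written to the stdin of a process started with
// `exiftool -stay_open True -@ -`, which is why stdin can't also carry
// raw image bytes at the same time.
console.log(commandBlock(['-json', 'photo.jpg']));
```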

Is there another approach I could try?

Phil Harvey

You could read the command-line arguments from a disk file, which would free up stdin/stdout for the file I/O.

- Phil

benw

I think the npm module actually does that already -- it uses `stdin.write()` at any rate -- and it seems to accept chunked data from a stream.

However, the implementation waits for a `close()` before trying to parse the metadata, so I'm back to the same problem: I have to download, or hold in memory, an entire file in order to extract the metadata.

On the assumption that the data I need is towards the beginning of the file, I've experimented with requesting a limited range of the file, but if I specify some arbitrary number of bytes, the truncated data isn't something exiftool can recognize and work with.
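(One way the range-request idea might be made workable, assuming the metadata really does sit near the start of the file: fetch only a byte range from S3 and run exiftool with `-fast`, a real option that stops exiftool from scanning to the end of the file. exiftool may still warn about truncated data. The SDK call shown in comments uses placeholder names.)

```javascript
// Build an HTTP Range header for a partial S3 GET. HTTP byte ranges are
// inclusive, so the first 64 KiB is "bytes=0-65535".
function rangeHeader(start, length) {
  return `bytes=${start}-${start + length - 1}`;
}

// With the AWS SDK this would look roughly like (not executed here;
// bucket, key and exiftool process are placeholders):
// s3.getObject({ Bucket: bucket, Key: key, Range: rangeHeader(0, 64 * 1024) })
//   .createReadStream()
//   .pipe(et.stdin);
console.log(rangeHeader(0, 64 * 1024)); // → bytes=0-65535
```

If exiftool reports the data is truncated, the range could be grown and the request retried, which still avoids downloading the whole object in the common case.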