I have found a lot of information on how to pump or pipe data from a read stream to a write stream in Node. The newest version even pauses and resumes automatically for you. However, I have a different need and would like some help.
I am writing a video file using ffmpeg (to a local file, not a writable stream), and I would like to create a read stream that reads the data as it gets written. The read stream will obviously consume data faster than ffmpeg can encode it. What happens when the read stream reaches the end of the data before ffmpeg has finished writing the file? I assume the read stream will simply end before the file is fully encoded.
Does anyone have suggestions for the best way to pause/resume the read stream so that it doesn't reach the end of the file being encoded until the encoding is 100% complete?
In summary:
This is what people normally do: readStream --> writeStream (using .pipe)
This is what I want to do: local file (in slow creation process) --> readStream
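For reference, the "normal" pattern from the first line of that summary looks roughly like the sketch below (the file names are hypothetical placeholders):

    var fs = require('fs');

    // Pipe a read stream into a write stream; pipe() handles the
    // pause/resume (backpressure) between the two automatically.
    var readStream = fs.createReadStream('input.mp4');
    var writeStream = fs.createWriteStream('copy.mp4');

    readStream.pipe(writeStream);

    writeStream.on('close', function () {
      console.log('copy complete');
    });

The problem above is the opposite situation: the source file is still growing while it is being read.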
As always, thanks to the Stack Overflow community.
The growing-file module is what you want.
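A minimal sketch of how that might look, assuming the API I recall (GrowingFile.open(path, options) returning a readable stream, with timeout and interval options); check the module's README for the exact option names in the version you install:

    var fs = require('fs');
    var GrowingFile = require('growing-file');

    // Open the file ffmpeg is still writing (path is a hypothetical placeholder).
    // timeout:  give up with an error if the file hasn't grown for this many ms.
    // interval: how often to poll the file for newly written data.
    var file = GrowingFile.open('/tmp/output.mp4', {
      timeout: 10000,
      interval: 100
    });

    // The returned object is a readable stream, so it can be piped like any other;
    // a local copy is used here as a stand-in for whatever you actually stream to.
    file.pipe(fs.createWriteStream('/tmp/copy.mp4'));

    file.on('error', function (err) {
      console.error('growing-file error:', err);
    });

The read side keeps emitting data as ffmpeg appends to the file and only ends once the file stops growing for longer than the timeout, which should give the pause/resume behaviour asked about without you managing it by hand.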