Multiple consumption of a single stream


Question:

I want to know if it's possible for multiple functions to consume a single stream in Node.js. If so, how can this be done? Is it possible to pipe to multiple destinations?

I want to use the stream in two different functions that run in parallel. I am handling the parallel flow with the async module. So is it possible to, say, issue the pipe() call inside each of these functions?

Thanks in advance.

Answer 1:

Yes, it's possible, easy and common. The example below reads from a single source and writes to multiple destinations. A single anonymous 'data' callback is registered on the readable stream; inside it, the two writable streams do the actual write work:

var fs  = require('fs');

var rs1 = fs.createReadStream('input1.txt');   // single readable source
var ws1 = fs.createWriteStream('output1.txt'); // first destination
var ws2 = fs.createWriteStream('output2.txt'); // second destination

// One 'data' listener receives each chunk and writes it to both destinations.
rs1.on('data', function (data) {
  console.log(data.toString('utf8'));
  ws1.write('1: ' + data);
  ws2.write('2: ' + data);
});

An easier way is to use the .pipe() method.

var fs  = require('fs');

var rs1 = fs.createReadStream('input1.txt');
var ws1 = fs.createWriteStream('output1.txt');
var ws2 = fs.createWriteStream('output2.txt');

// Piping the same readable stream to two writables sends every chunk to both.
rs1.pipe(ws1);
rs1.pipe(ws2);
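
To answer the parallel part of the question: since the same readable stream can be piped more than once, each pipe() call can also be issued from a separate task. Below is a minimal sketch assuming the async module is installed; the file names and the 'finish'/'error' wiring are illustrative, not part of the original answer.

var fs    = require('fs');
var async = require('async');

var rs1 = fs.createReadStream('input1.txt');

async.parallel([
  function (done) {
    var ws1 = fs.createWriteStream('output1.txt');
    rs1.pipe(ws1);
    ws1.on('finish', done);  // task completes when this destination is fully written
    ws1.on('error', done);
  },
  function (done) {
    var ws2 = fs.createWriteStream('output2.txt');
    rs1.pipe(ws2);
    ws2.on('finish', done);
    ws2.on('error', done);
  }
], function (err) {
  if (err) return console.error(err);
  console.log('both destinations written');
});

Both tasks start in the same tick, so both pipes are attached before any data begins to flow from the readable stream.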

The .pipe() method also lets you do nifty things like pipeline chaining, very similar to the Unix shell concept of du . | sort -rn | less, where the data flows through multiple handlers in turn.
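
For example, here is a minimal sketch of such a chain, using the built-in zlib module to compress the data before writing it out (the file names are illustrative):

var fs   = require('fs');
var zlib = require('zlib');

fs.createReadStream('input1.txt')
  .pipe(zlib.createGzip())                      // transform stream: gzip-compress the data
  .pipe(fs.createWriteStream('input1.txt.gz')); // final destination

Each .pipe() returns the destination stream, which is what makes this kind of chaining possible.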