Reading multiple files in parallel and writing the results to new files

Posted 2019-08-30 03:24

I'm trying to handle an asynchronous task that reads multiple files from a folder at the same time and writes new ones to a different folder. The files I read come in pairs: one file is the data template and the other one holds the data. According to the template, I process the data from the related data file. All the information I extract from both files goes into an object that I then write as JSON to a new file. The code below works perfectly if there is only one pair of these files (1 template and 1 data file):

// fs, folderFiles, baseTemplate_dir, baseData_dir and baseOut_dir are set up earlier
for(var i = 0; i < folderFiles.length; i++)
{
    var objectToWrite = new ObjectToWrite();
    var templatefileName = folderFiles[i].toString();
    //Reading the Template File
    fs.readFile(baseTemplate_dir + folderFiles[i], { encoding: "utf8" }, function(err, data)
    {
      if(err) throw err;
      //Here I'm manipulating the template data

      //Now I want to read the data according to the template read above
      fs.readFile(baseData_dir + folderFiles[i], { encoding: "utf8" }, function(err, data)
      {
        if(err) throw err;
        //Here I'm manipulating the data
        //Once I've got the template data and the data into my object objectToWrite, I write it as JSON to a file
        fs.writeFile(baseOut_dir + folderFiles[i], JSON.stringify(objectToWrite), { encoding: 'utf8' }, function(err)
        {
            if(err) throw err;
            console.log("File written and saved!");
        });
      });
    });
}

Since I actually have four files (two template files and two related data files), it crashes, so I believe I have a problem with the callbacks. Maybe someone could help me figure it out! Thanks in advance!

1 Answer

ゆ 、 Hurt° · 2019-08-30 03:46

It happens because readFile is asynchronous: the for loop does not wait for the callback and moves straight on to the next iteration, finishing all iterations almost immediately. By the time a readFile callback actually runs, the loop has already finished, so every callback sees the final value of i and they all operate on the same stale index into folderFiles. The fix is to move the read/write logic into a separate function outside the loop, so that each call closes over its own file name.
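The same effect can be reproduced in isolation with a minimal, purely illustrative snippet (not part of the original code):

for (var i = 0; i < 3; i++) {
  setTimeout(function () {
    // prints 3 three times: every callback sees the final value of i
    console.log(i);
  }, 0);
}

Wrapping the body in a function (or declaring the counter with let) gives each callback its own copy of the value. Applied to the code from the question: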

function combineFiles(objectToWrite, templatefileName) {
  //Reading the Template File
  fs.readFile(baseTemplate_dir + templatefileName, { encoding: "utf8" }, function(err, data)
  {
    if(err) throw err;
    //Here I'm manipulating the template data

    //Now I want to read the data according to the template read above
    fs.readFile(baseData_dir + templatefileName, { encoding: "utf8" }, function(err, data)
    {
      if(err) throw err;
      //Here I'm manipulating the data
      //Once I've got the template data and the data into my object objectToWrite, I write it as JSON to a file
      fs.writeFile(baseOut_dir + templatefileName, JSON.stringify(objectToWrite), { encoding: 'utf8' }, function(err)
      {
          if(err) throw err;
          console.log("File written and saved!");
      });
    });
  });
}

for(var i = 0; i < folderFiles.length; i++)
{
    var objectToWrite = new ObjectToWrite();
    var templatefileName = folderFiles[i].toString();

    combineFiles(objectToWrite, templatefileName);
}
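
As a side note, on Node.js 10 or later the same flow can also be written with the promise-based fs.promises API and a block-scoped let counter, which avoids the closure problem entirely. This is only a sketch and assumes the same folderFiles, directory variables and ObjectToWrite constructor as above:

const fsp = require('fs').promises;

async function combineAll() {
  for (let i = 0; i < folderFiles.length; i++) {
    // let gives each iteration its own binding of i
    const templatefileName = folderFiles[i].toString();
    const objectToWrite = new ObjectToWrite();

    const templateData = await fsp.readFile(baseTemplate_dir + templatefileName, "utf8");
    // ...manipulate the template data and fill objectToWrite...

    const data = await fsp.readFile(baseData_dir + templatefileName, "utf8");
    // ...manipulate the data and fill objectToWrite...

    await fsp.writeFile(baseOut_dir + templatefileName, JSON.stringify(objectToWrite), "utf8");
    console.log("File written and saved!");
  }
}

combineAll().catch(function (err) { throw err; });

Note that this version processes the file pairs one after another, whereas the callback version above starts all reads in parallel.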