I am trying to read a large file one line at a time. I found a question on Quora that dealt with the subject but I'm missing some connections to make the whole thing fit together.
```javascript
var Lazy = require("lazy");

new Lazy(process.stdin)
  .lines
  .forEach(function (line) {
    console.log(line.toString());
  });
process.stdin.resume();
```
The bit that I'd like to figure out is how I might read one line at a time from a file instead of STDIN as in this sample.
I tried:
```javascript
var fs = require('fs');

fs.open('./VeryBigFile.csv', 'r', '0666', Process);

function Process(err, fd) {
  if (err) throw err;
  // DO lazy read
}
```
but it's not working. I know that in a pinch I could fall back to using something like PHP, but I would like to figure this out.
I don't think the other answer would work as the file is much larger than the server I'm running it on has memory for.
With the carrier module:
I use this: call it on a stream and listen for the 'line' events that it will emit.
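A minimal sketch of that usage. The file name is carried over from the question; `carrier.carry()` wraps a readable stream and emits a 'line' event per line:

```javascript
var carrier = require('carrier');
var fs = require('fs');

// carry() returns an EventEmitter that fires 'line' for each line read
carrier.carry(fs.createReadStream('./VeryBigFile.csv'))
  .on('line', function (line) {
    console.log(line);
  });
```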
There is a very nice module for reading a file line by line; it's called line-reader.
With it you simply write:
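A sketch of the simple form (file name assumed; `eachLine` invokes the callback once per line):

```javascript
var lineReader = require('line-reader');

lineReader.eachLine('./VeryBigFile.csv', function (line, last) {
  console.log(line);
  // 'last' is true on the final line; return false to stop early
});
```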
You can even iterate over the file with a "java-style" interface, if you need more control:
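A sketch of that interface, where you pull lines one at a time (the exact callback signatures have varied across line-reader versions, so treat this as an approximation):

```javascript
var lineReader = require('line-reader');

lineReader.open('./VeryBigFile.csv', function (err, reader) {
  if (err) throw err;
  if (reader.hasNextLine()) {
    // nextLine() hands you exactly one line, on demand
    reader.nextLine(function (err, line) {
      if (err) throw err;
      console.log(line);
    });
  }
});
```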
You don't have to `open` the file; instead, you create a ReadStream with `fs.createReadStream`, then pass that stream to `Lazy`.
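Putting that together with the question's own Lazy snippet (reusing the file name from the question), it would look roughly like:

```javascript
var Lazy = require('lazy');
var fs = require('fs');

// Stream the file instead of stdin; Lazy splits it into lines
new Lazy(fs.createReadStream('./VeryBigFile.csv'))
  .lines
  .forEach(function (line) {
    console.log(line.toString());
  });
```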
I wanted to tackle this same problem: basically, what in Perl would be `while (<>) { ... }`.
My use case was just a standalone script, not a server, so a synchronous approach was fine.
This is a project for me to get a feel for low-level scripting type code in node.js and decide how viable it is as a replacement for other scripting languages like Perl.
After a surprising amount of effort and a couple of false starts, this is the code I came up with (fork it on GitHub). It's pretty fast but less trivial than I would've expected.
It could probably be cleaned up further; it was the result of trial and error.
Old topic, but this works:
Simple. No need for an external module.