How do I do random access reads from (large) files

Published 2019-04-09 12:44

Question:

Am I missing something or does node.js's standard file I/O module lack analogs of the usual file random access methods?

  • seek() / fseek()
  • tell() / ftell()

How does one read random fixed-size records from large files in node without these?

Answer 1:

There is no tell equivalent, but it is pretty rare not to already know your position in a file, or not to have a way to keep track of it yourself.

seek is exposed indirectly via the position argument of fs.read and fs.write. When position is given, the operation starts at that byte offset, as if you had seeked there first; when it is null, the read or write continues from wherever the previous operation left off.



Answer 2:

node doesn't have these built in; the closest you can get is to use fs.createReadStream with a start option to begin reading from an offset (pass in an existing fd to avoid re-opening the file).

http://nodejs.org/api/fs.html#fs_fs_createreadstream_path_options



Answer 3:

I suppose that createReadStream opens a new file descriptor over and over. I prefer a sync version that opens the file once:

function FileBuffer(path) {
    const fd = fs.openSync(path, 'r');

    function slice(start, end) {
        const chunkSize = end - start;
        // Buffer.alloc replaces the deprecated new Buffer(size)
        const buffer = Buffer.alloc(chunkSize);

        fs.readSync(fd, buffer, 0, chunkSize, start);

        return buffer;
    }

    function close() {
        // closeSync, to match the sync style (fs.close expects a callback)
        fs.closeSync(fd);
    }

    return {
        slice,
        close
    };
}



Answer 4:

Use this:

fs.open(path, flags[, mode], callback)

Then this:

fs.read(fd, buffer, offset, length, position, callback)

Read this for details:

https://nodejs.org/api/fs.html#fs_fs_read_fd_buffer_offset_length_position_callback