Parsing a large (~500 MB) XML file with Node.js

Published: 2019-03-14 09:03

Question:

I am using isaacs' sax-js to parse a huge XML file, as also recommended by La Gentz.

The process uses about 650 MB of memory. How can I reduce this, or allow Node to use even more? It currently fails with:

FATAL ERROR: CALL_AND_RETRY_0 Allocation failed - process out of memory

My XML file is larger than 300 MB, and it could grow to 1 GB.

Answer 1:

You should stream the file into the parser; that is the whole point of a streaming parser, after all. Rather than reading the entire document into memory, pipe it through in chunks:

var parser = require('sax').createStream(strict, options);
fs.createReadStream(file).pipe(parser);


Tags: xml node.js sax