How can I make a really long string using IndexedDB?

Posted 2019-01-23 01:36

I'm writing a web app that generates a potentially large text file that the user will download, and all the processing is done in the browser. So far I'm able to read a file over 1 GB in small chunks, process each chunk, generate a large output file incrementally, and store the growing output in IndexedDB. My more naïve attempt which kept all the results in memory and then serialized them to a file at the very end was causing all browsers to crash.
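For reference, a minimal sketch of that chunked-read loop, assuming a File object named inputFile and a processChunk callback (both names are just for illustration, not my actual code):

    // Read a large File in fixed-size slices so only one chunk is in memory at a time.
    var CHUNK_SIZE = 1024 * 1024; // 1 MB per slice

    function readInChunks(inputFile, processChunk, done)
    {
        var offset = 0;
        var reader = new FileReader();

        reader.onload = function()
        {
            processChunk(reader.result, offset); // handle this chunk's text
            offset += CHUNK_SIZE;
            if (offset < inputFile.size)
                readNext();                      // keep going until the file is exhausted
            else
                done();
        };

        function readNext()
        {
            reader.readAsText(inputFile.slice(offset, offset + CHUNK_SIZE));
        }

        readNext();
    }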

My question is two-fold:

  1. Can I append to an entry in IndexedDB (either a string or an array) without reading the whole thing into memory first? Right now, this:

    task.dbInputWriteQueue.push(output);
    var transaction = db.transaction("files", "readwrite");
    var objectStore = transaction.objectStore("files");
    var request = objectStore.get(file.id);
    request.onsuccess = function()
    {
        // IDBRequest exposes the read value as .result (not .results);
        // write the grown string back under the same key
        objectStore.put(request.result + nextPartOfOutput, file.id);
    };
    

    is causing crashes after the output starts to get big. I could just write a bunch of small entries into the database, but then I'd have to read them all into memory later anyway to concatenate them. See part 2 of my question...

  2. Can I make a data object URL to reference a value in IndexedDB without loading that value into memory? For small strings I can do:

    var url = window.URL.createObjectURL(new Blob([myString], {type: 'text/plain'}));
    

    But for large strings this doesn't work well. In fact, it crashes before the string is even loaded. It seems that big reads using get() from IndexedDB cause Chrome, at least, to crash (even the developer tools crash). (A cursor-based read is sketched after this list.)
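If the output were stored as many small records instead of one growing string, it could also be read back piece by piece with a cursor rather than one huge get(). The following is only an illustrative sketch, not code from my app; the "chunks" store and the [file.id, index] key layout are assumptions, and whether it actually keeps memory bounded depends on how the chunks are represented:

    // Walk all chunk records for one file with a cursor, so no single
    // request has to materialize the entire output at once.
    var tx = db.transaction("chunks", "readonly");
    var chunkStore = tx.objectStore("chunks");
    var range = IDBKeyRange.bound([file.id, 0], [file.id, Infinity]);
    var parts = [];

    chunkStore.openCursor(range).onsuccess = function(event)
    {
        var cursor = event.target.result;
        if (cursor)
        {
            parts.push(cursor.value); // each value is one chunk (string or Blob)
            cursor.continue();
        }
        else
        {
            // All chunks visited; let the Blob constructor combine them
            // instead of concatenating one giant string ourselves.
            var blob = new Blob(parts, {type: 'text/plain'});
            var url = window.URL.createObjectURL(blob);
        }
    };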

Would it be faster if I were using Blobs instead of strings? Is that conversion cheap?

Basically I need a way, with JavaScript, to write a really big file to disk without loading the whole thing into memory at any one point. I know that you can give createObjectURL a File, but that doesn't work in my case since I'm generating a new file from one the user provides.

2 Answers
对你真心纯属浪费
#2 · 2019-01-23 02:05

Storing a Blob uses a lot less space and fewer resources, since there is no need to convert the data to base64. You can even store "text/plain" content as a Blob:

var blob = new Blob(['blob object'], {type: 'text/plain'});
var store = db.transaction(['entries'], 'readwrite').objectStore('entries');

// Store the object  
var req = store.put(blob, 'blob');
req.onerror = function(e) {
    console.log(e);
};
req.onsuccess = function(event) {
    console.log('Successfully stored a blob as Blob.');
};

You can see more info here: https://hacks.mozilla.org/2012/02/storing-images-and-files-in-indexeddb/

Chrome has only supported this since the summer of 2014 (http://updates.html5rocks.com/2014/07/Blob-support-for-IndexedDB-landed-on-Chrome-Dev), so you cannot rely on it in older versions of Chrome.
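To round this out, the stored Blob can later be read back and turned into a download link without ever converting it to a string. A sketch, assuming the same 'entries' store and 'blob' key as above:

var readStore = db.transaction(['entries'], 'readonly').objectStore('entries');
var readReq = readStore.get('blob');

readReq.onsuccess = function() {
    var blob = readReq.result;                  // still a Blob, no base64 involved
    var url = window.URL.createObjectURL(blob); // no need to read the contents into a string

    // For example, point an <a download> element at it (names here are illustrative)
    var a = document.createElement('a');
    a.href = url;
    a.download = 'output.txt';
    a.textContent = 'Download result';
    document.body.appendChild(a);
};
readReq.onerror = function(e) {
    console.log(e);
};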

老娘就宠你
#3 · 2019-01-23 02:12

I just reopened the Chrome bug that I submitted two years ago and filed another bug with the Firefox team, both related to the browser crashing when creating a large Blob. Generating large files shouldn't be an issue for browsers.
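Until those are fixed, one way to keep memory pressure down is to accumulate the output as many small Blobs and only combine them at the end, since the Blob constructor accepts other Blobs as parts. This is just a sketch of the idea, and how much it helps depends on the browser's Blob implementation:

// Collect output as small Blobs, then combine them once at the end.
var parts = [];

function addChunk(text) {
    parts.push(new Blob([text], {type: 'text/plain'}));
}

function finish() {
    // Blob-of-Blobs: no giant string concatenation in JavaScript.
    var whole = new Blob(parts, {type: 'text/plain'});
    return window.URL.createObjectURL(whole);
}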
