I'm using d3.js to plot the contents of an 80,000 row .tsv onto a chart.
The problem I'm having is that since there is so much data, the page becomes unresponsive for approximately 5 seconds while the entire dataset is processed at once.
Is there an easy way to process the data progressively, spread over a longer period of time? Ideally the page would remain responsive, and the data would be plotted as it became available, instead of in one big hit at the end.
I think you'll have to chunk your data and display it in groups using setInterval or setTimeout. This gives the UI a chance to repaint and handle events between chunks.

The basic approach is:

1. Chunk the data set
2. Render each chunk separately
3. Keep track of each rendered group
Here's an example:
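A minimal sketch of the idea (the function names `chunkArray`, `renderProgressively`, and `renderChunk` are illustrative, not from the fiddle; the actual d3 drawing code would live inside the callback):

```javascript
// Split the full dataset into fixed-size chunks.
function chunkArray(data, size) {
  var chunks = [];
  for (var i = 0; i < data.length; i += size) {
    chunks.push(data.slice(i, i + size));
  }
  return chunks;
}

// Render one chunk per timeout tick so the browser can
// repaint and handle input between chunks.
function renderProgressively(data, size, renderChunk) {
  var chunks = chunkArray(data, size);
  var index = 0;
  function next() {
    if (index >= chunks.length) return;
    // e.g. append a new <g> and bind only this chunk's data to it
    renderChunk(chunks[index], index);
    index += 1;
    setTimeout(next, 0); // yield to the event loop
  }
  next();
}
```

With 80,000 rows, a chunk size of around 500–1,000 keeps each tick well under a frame's worth of work on most hardware, though the right number depends on how expensive each mark is to draw.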
You can see a demo fiddle of this -- done before I had coffee -- here:
http://jsfiddle.net/thudfactor/R42uQ/
Note that I'm making a new group, with its own data join, for each array chunk. If you keep adding to the same data join over time (`.data(oldData.concat(nextChunk))`), the entire data set still gets processed and compared even if you're only using the enter() selection, so it doesn't take long for things to start crawling.
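To see why this matters, here's a back-of-the-envelope comparison (the function names are mine, purely illustrative): re-joining the concatenated data makes d3 walk everything bound so far on every chunk, while one group per chunk only ever touches that chunk's elements.

```javascript
// Elements d3 must compare if every chunk re-joins the
// full concatenated dataset: 1·size + 2·size + ... + n·size.
function elementsProcessedWithConcat(chunkCount, chunkSize) {
  var total = 0;
  for (var i = 1; i <= chunkCount; i++) {
    total += i * chunkSize; // the whole join is re-evaluated each time
  }
  return total;
}

// Elements compared when each chunk gets its own group and
// data join: each element is joined exactly once.
function elementsProcessedPerGroup(chunkCount, chunkSize) {
  return chunkCount * chunkSize;
}
```

For 80 chunks of 1,000 rows, the concat approach evaluates 3,240,000 element comparisons versus 80,000 for separate groups, which is why the page starts crawling with the former.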