I'm using d3.js to plot the contents of an 80,000 row .tsv onto a chart.
The problem I'm having is that since there is so much data, the page becomes unresponsive for approximately 5 seconds while the entire dataset is churned through at once.
Is there an easy way to process the data progressively, even if that means spreading the work over a longer period of time? Ideally the page would remain responsive, and the data would be plotted as it became available, instead of in one big hit at the end.
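Roughly, my current setup looks like this (a simplified sketch; the file name, the svg selection, and the x scale are placeholders, not my actual code):

// everything is parsed and drawn in one pass, so the browser is
// blocked until all 80,000 rows have been rendered
d3.tsv("data.tsv", function(error, data) {
    svg.selectAll("circle")
        .data(data)              // one data join over the entire set
      .enter().append("circle")
        .attr("cx", function(d) { return x(+d.value); });
});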
asked Oct 6, 2013 by swanny

Comments:
- I'm not entirely sure what you mean, but the answer is almost certainly no: there is no easy (or built-in) way to do this. You would have to control the rendering yourself. – Lars Kotthoff
- I asked a similar question to this myself. It depends on your use case. You can retrieve the data in chunks, or pre-process it into a smaller set on the server before loading it to the client. – Phuoc Do
1 Answer
I think you'll have to chunk your data and display it in groups using setInterval or setTimeout. This gives the UI some breathing room between chunks.
The basic approach is:
1) chunk the data set
2) render each chunk separately
3) keep track of each rendered group
Here's an example:
var dataPool = chunkArray(data, 100), // split the data set into 100-row chunks
    poolPosition = 0,
    iterator;

function updateVisualization() {
    // each chunk gets its own group element and its own data join
    var group = canvas.append("g").selectAll("circle")
        .data(dataPool[poolPosition])
        .enter()
        .append("circle");
        /* ... presentation stuff .... */

    // advance to the next chunk; stop once every chunk is drawn
    if (++poolPosition >= dataPool.length) clearInterval(iterator);
}

iterator = setInterval(updateVisualization, 100);
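Note that chunkArray isn't a d3 built-in. A minimal sketch of the helper, assuming it just slices the array into fixed-size pieces:

// splits `array` into sub-arrays of at most `size` items
function chunkArray(array, size) {
    var chunks = [];
    for (var i = 0; i < array.length; i += size) {
        chunks.push(array.slice(i, i + size));
    }
    return chunks;
}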
You can see a demo fiddle of this -- done before I had coffee -- here:
http://jsfiddle.net/thudfactor/R42uQ/
Note that I'm making a new group, with its own data join, for each array chunk. If you keep adding to the same data join over time (data(oldData.concat(nextChunk))), the entire data set still gets processed and compared even if you're only using the enter() selection, so it doesn't take long for things to start crawling.
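For contrast, the slow pattern described above would look something like this (a sketch; oldData and nextChunk are illustrative names, not part of the demo):

// anti-pattern: re-joining the full, growing array on every tick,
// so d3 has to diff everything already bound against the whole set
// and each tick costs more than the last
oldData = oldData.concat(nextChunk);
var join = canvas.selectAll("circle")
    .data(oldData);           // entire data set compared every time
join.enter().append("circle");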