
javascript - AngularJS - Large sets of data - Stack Overflow



I've been pondering moving our current admin system over to a JS framework for a while, and I tested out AngularJS today. I really like how powerful it is. I created a demo application (source: https://github./andyhmltn/portfolio-viewer) that has a list of 'items' and displays them in a paginated list that you can sort and search in real time.

The problem I'm having is figuring out how to replicate this kind of behaviour with a larger data set. Ideally, I want a table of items that's sortable, searchable and paginated, all in real time.

The part that concerns me is that this table will have at least 10,000 records. Currently that's no problem, as a PHP file limits the query to the current page and appends any search options to the end. The demo above only has about 15-20 records. I'm wondering how hard it would be to do the same thing with such a large number of records without pulling all of them into one JSON request at once, as that would be incredibly slow.

Does anyone have any ideas?
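To make the question concrete, here is a minimal sketch of the request an Angular controller could send for server-side paging, mirroring what the PHP endpoint already does. The endpoint path and parameter names (`page`, `per_page`, `q`, `sort`) are illustrative assumptions, not taken from the demo:

```javascript
// Sketch: build the query string a controller might send for server-side
// paging. Parameter names here are hypothetical placeholders.
function buildPageQuery(baseUrl, page, perPage, search, sortBy) {
  var params = {
    page: page,
    per_page: perPage
  };
  if (search) { params.q = search; }   // only send a filter when one is set
  if (sortBy) { params.sort = sortBy; }
  var pairs = Object.keys(params).map(function (key) {
    return encodeURIComponent(key) + '=' + encodeURIComponent(params[key]);
  });
  return baseUrl + '?' + pairs.join('&');
}

// In an AngularJS controller this URL would be passed to $http.get(...)
// whenever the page number, search text or sort column changes.
console.log(buildPageQuery('/items', 2, 25, 'widget', 'name'));
// → /items?page=2&per_page=25&q=widget&sort=name
```

The server then returns only those 25 rows plus a total count, so the client never holds the full 10,000-record set.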

asked Sep 10, 2013 at 11:21 by andy
  • Why not just synthesize 10000 records? (or just append the same 25 ones 400 times?) Then you'll see for yourself. And if that won't work you can even mix-and-match: a client side application with server-side paging and sorting. – Joachim Sauer Commented Sep 10, 2013 at 11:30
  • Sorry, I fail to grasp the point: you want to download all the record to the client or not? – package Commented Sep 10, 2013 at 11:32
  • Preferably not. I'm just trying to think of an alternative at the moment – andy Commented Sep 10, 2013 at 11:34

2 Answers


I'm used to handling large datasets in JavaScript, and I would suggest that you:

  • use pagination (either server-side or client-side, depending on the actual volume of your data; see below)
  • use Crossfilter.js to group your records and adopt a multi-level architecture in your GUI (records per month; double-click a month to drill down to its records per day, etc.)
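For the client-side variant of the first bullet, a minimal sketch: keep the full array in memory and show only one page at a time. This is reasonable for a few thousand small records; beyond that, server-side paging scales better.

```javascript
// Client-side paging sketch: slice one page out of an in-memory array.
function pageOf(records, pageNumber, pageSize) {
  var start = (pageNumber - 1) * pageSize; // pages are 1-indexed here
  return records.slice(start, start + pageSize);
}

// Synthetic data, as suggested in the comments: just generate records.
var items = [];
for (var i = 1; i <= 100; i++) { items.push({ id: i }); }

console.log(pageOf(items, 3, 25).map(function (r) { return r.id; }));
// ids 51..75
```

In an Angular template this would typically be wrapped in a filter so the visible slice updates as the page number changes.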

An indicator I often use is the following:

rowsAmount x columnsAmount x dataManipulationsPerRow
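The heuristic above written out as code: a rough, order-of-magnitude estimate of how many cell-level operations a render or filter pass will cost. The budget threshold below is an illustrative assumption, not a hard rule.

```javascript
// The indicator above as a function: rows × columns × manipulations/row.
function workEstimate(rows, columns, manipulationsPerRow) {
  return rows * columns * manipulationsPerRow;
}

var estimate = workEstimate(10000, 8, 3); // 240000 cell-level operations
var CLIENT_SIDE_BUDGET = 1000000;         // assumed comfort threshold
console.log(estimate,
  estimate <= CLIENT_SIDE_BUDGET ? 'client-side OK' : 'page on the server');
```

If the estimate lands well under your budget, client-side paging is likely fine; far over it, push paging and filtering to the server.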

Also, consider the fact that handling large datasets and displaying them are two very different things.

Indeed, pulling so many rows in one request would be a killer. Fortunately, Angular has the ng-grid component, which can do server-side paging (among many other things). Instructions are provided at the given link.
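A hedged sketch of the contract the server side of such a grid typically fulfils: return one page of (optionally filtered) rows plus the total match count, so the grid can render its pager. The field names below are illustrative assumptions, not part of the ng-grid API:

```javascript
// What a paging endpoint might return as JSON: one filtered page of rows
// plus the total match count. Field names are illustrative only.
function pageResponse(allRows, page, pageSize, searchText) {
  var matches = searchText
    ? allRows.filter(function (r) { return r.name.indexOf(searchText) !== -1; })
    : allRows;
  return {
    totalItems: matches.length, // lets the grid compute its page count
    items: matches.slice((page - 1) * pageSize, page * pageSize)
  };
}

var rows = [{ name: 'apple' }, { name: 'banana' }, { name: 'apricot' }];
console.log(pageResponse(rows, 1, 2, 'ap'));
// totalItems: 2; items: apple, apricot
```

On the real system, `allRows` would of course live in the database and the filtering and slicing would happen in the SQL query, exactly as the asker's PHP file already does.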
