Real-time DOM pagination on client-side possible ?
Hello,
I am currently building a JavaScript web application which uses jQuery DataTables. The data is first bootstrapped from the server into JavaScript objects, and then for every table in the application the HTML table is generated as DOM and dataTable() is applied to it. Very often, tables are destroyed, reconstructed, filtered and re-styled on the fly as the user plays with the various controls, and it all goes very smoothly. At times there are long-running scripts, but these are handled easily with setTimeout calls that break up the long loops.
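For context, this is roughly the chunking approach I mean (a minimal sketch only; processRow is a placeholder for whatever work is done per row, not my actual code):

    // Process a large array in slices so the browser can repaint between batches.
    function processInChunks(items, processRow, chunkSize, onDone) {
        var index = 0;
        function next() {
            var end = Math.min(index + chunkSize, items.length);
            for (; index < end; index++) {
                processRow(items[index]);
            }
            if (index < items.length) {
                setTimeout(next, 0);   // yield to the browser, then continue
            } else if (onDone) {
                onDone();
            }
        }
        next();
    }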
It all goes well until one of the tables has to return 5000 rows. On recent versions of Firefox and IE it takes approximately 7 seconds to get the DataTable initialized, which is quite tolerable. However, the target browser for this app is Internet Explorer 7, and after 5 seconds of processing I get the 'stop running this script' warning, which eventually crashes the browser.
Since the application is pure JavaScript on the client side, I cannot afford to switch to a server data source, which would run completely counter to a pure JS client. I've also tried to optimize the tables with bSortClasses: false, bSort: false, etc., to no avail. What I usually do when I get that error in IE7 is break the long loops with a setTimeout to let the browser 'breathe', but looking at the DataTables API, I can't seem to find where to start.
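For reference, this is roughly the kind of initialization I've been trying (a sketch only; '#myTable' and the display length are illustrative, and the column setup is omitted):

    // Initialization with the sorting optimizations mentioned above (sketch only).
    $('#myTable').dataTable({
        "bSort": false,          // disable sorting entirely
        "bSortClasses": false,   // don't add sorting classes to every cell
        "iDisplayLength": 30     // rows per page
    });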
Any ideas?
I was wondering if there could be a plugin out there which does real pagination on the client side; that is, for instance, load the first 100 rows (which would take a split second) and then, when the user chooses a higher page number, instantiate the DOM only when necessary. It would therefore work a bit like the server-side implementation, only client-side. Any thoughts?
Thanks,
Jimmy
Replies
Allan
So do you imply that I could use an Ajax data source only for initializing these 5000 rows, and then, as the table gets reconstructed, call dataTable() without the server options (DOM only), so that the JS code could be reused? Looks a bit awkward...
Thanks,
Jimmy
Thanks again,
Jimmy
> Actually I am not using an Ajax datasource, the server is basically returning a Json object which is then assigned to the aaData object
It sounds like you are already doing everything needed to use deferred rendering then - a Javascript data source will work just as well as an Ajax one (out of interest, are you loading the JSON directly in the HTML or making your own Ajax call?).
> Then multiple times, the content is built dynamically using JavaScript and the datatable is reconstructed upon that.
Why not just let DataTables do that bit for you? You've assigned the data you want using aaData (and you can add more using the API if you want), so why not just let DataTables build the rows for you?
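Something along these lines (a sketch only - myData and the column titles are placeholders for whatever your bootstrap code produces):

    // myData is the array of row arrays bootstrapped from the server (placeholder name).
    $('#example').dataTable({
        "aaData": myData,            // let DataTables own the data...
        "aoColumns": [
            { "sTitle": "Name" },
            { "sTitle": "Position" },
            { "sTitle": "Office" }
        ],
        "bDeferRender": true,        // ...and only create TR/TD nodes when a page is drawn
        "bSortClasses": false
    });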
Allan
I see what you mean, but I've just given up on IE7. Sometimes after 5 seconds I get the 'stop running this script' message! I've finally moved all my pagination, sorting and filtering logic to the server, as well as my custom filters, using the Ajax data source. This broke my heart but there was no way round it.
The problem I am getting now is that the pagination is a little screwed up.
If I return 1000 rows and the page length is set to 30, the DataTables sInfo shows 'Showing 30 to 30 rows of 1000 entries', but the Next and Last buttons are greyed out and not clickable.
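In case it helps, this is roughly the shape of response the legacy server-side protocol expects as I understand it (illustrative values only, with comments as annotations; it is not my actual server output):

    // One page of a 1000-row, unfiltered result set.
    {
        "sEcho": 3,                     // must echo the draw counter sent by DataTables
        "iTotalRecords": 1000,          // total rows before filtering
        "iTotalDisplayRecords": 1000,   // total rows after filtering
        "aaData": [
            ["row 1 col 1", "row 1 col 2"],
            ["row 2 col 1", "row 2 col 2"]
        ]
    }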
Secondly, it seems to me that bDeferRender does nothing (I've tried true and false, and both give the same performance and the same results).
Thanks again,
Jimmy
Also, deferred rendering will have no impact with server-side processing, since the visible rows are the only rows that 'exist' - the whole point of server-side processing is that only the data needed for the currently visible page is at the client, so there can be no undrawn rows.
Allan
My questions are:
1° Are 4000-5000 (or more) rows a no-go for IE7 with a DOM data source, or have you heard of success stories with bDeferRender: true and bSortClasses: false on that browser?
2° How fast would 4000-5000 rows load on the latest Firefox with bDeferRender set to true and a DOM data source? I was actually getting an 8-second wait in Firefox; is that what I should have gotten with bDeferRender: true and bSortClasses: false, or shouldn't it load faster?
When I say DOM data source, I mean that the HTML was generated by JavaScript code and injected into an element - is that right?
Thanks again for your precious help, and keep up the incredible work.
Jimmy
As I say, deferred rendering is useless on DOM-sourced tables, since the DOM has already been generated - so you'd save nothing by enabling it (might as well use what is there).
This chart is out of date, but it will give you some idea of the JavaScript speed issue in IE7: http://www.globalnerdy.com/wordpress/wp-content/uploads/2010/08/ie9pp4sunspiderjavascriptbenchmark1.jpg . IE8 is shown and IE7 isn't, but IE7 would be roughly 10x slower than IE8 - i.e. totally off the chart.
This is one of the many reasons that so many Javascript libraries have moved away from supporting old IE.
> 2° How fast would 4000-5000 rows load on latest Firefox with bDeferRender set to true, DOM datasource?
Deferred rendering is useless with a DOM data source. From the documentation:
> This option, when set to true, will cause DataTables to defer the creation of the table elements for each row until they are needed for a draw
But they are already drawn if you are reading from a DOM data source. See the "Data Sources" section of the usage page for more information: http://datatables.net/usage/
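To illustrate the difference (a rough sketch - the selectors and data are placeholders): with a DOM source the rows already exist in the page, whereas with a JavaScript source combined with deferred rendering they are only created when drawn.

    // DOM source: the <tr> elements are already in the page, so bDeferRender has nothing to defer.
    $('#domSourcedTable').dataTable();

    // JavaScript source: no rows exist yet; with bDeferRender only the visible page gets nodes created.
    $('#jsSourcedTable').dataTable({
        "aaData": myData,   // placeholder for the bootstrapped array
        "aoColumns": [ { "sTitle": "Col 1" }, { "sTitle": "Col 2" } ],
        "bDeferRender": true
    });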
Allan