1.10 Ajax for small data set in memory
I am using 1.10 successfully to read data from AJAX into a JavaScript array (not DataTables); I do a little processing on the data and then use DataTables' row.add() to create rows in the table. It works great until I hit about 4,000 rows, at which point loading starts to get sluggish. I would like to keep a small set of about 500 rows in memory and swap them out with new data as I scroll through the table (using Scroller), making Ajax calls with page-number/page-size offsets to the server.
I saw only very basic examples on the web page. Are there any others I can be pointed to that show more complex situations? I also use the 'createdRow' parameter to add buttons and links to my tables. One of the columns created using row.add() is a hidden column that holds an object (not simple text) that createdRow uses to help render the row. I would need to be able to keep doing that.
Does Scroller fire some form of callback or event that indicates it is near the end or the beginning of the loaded data? I would like to continuously load data as I scroll in either direction. I am excited about using essentially unlimited data; I just need a nudge in the right direction.
Answers
Server-side processing is how that would be implemented. You would use `ajax` as a function to make the required calls to get the data.
However, I would like to know if we can resolve the performance issue. It shouldn't be a problem until around 50k rows, depending upon your configuration and options of course. Can you link to a page showing the issue so I can check it? I assume you also have `deferRender` enabled and are not showing all rows at once?
Allan
Unfortunately the code is in an environment that is not exposed to the outside world so I can't provide a working link to it. I can show you snippets of the code.
I am not showing all rows at once. I did put in the scroller plugin.
Here is the table initialization:
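A minimal sketch of an initialisation along these lines (the selector, heights, and column layout here are assumptions, not the original code):

```js
// Sketch only: the selector, heights, and column layout are assumptions.
var table = $('#listTable').DataTable({
    deferRender: true,        // only create nodes for rows actually drawn
    scrollY: '400px',         // Scroller requires a fixed-height scrolling body
    scroller: true,           // enable the Scroller extension
    columnDefs: [
        { targets: 0, visible: false }   // hidden column holding the row's data object
    ],
    createdRow: listTableRow             // per-row rendering hook
});
```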
Here is the listTableRow function:
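A sketch of the kind of handler described (the property names, class, and markup are assumptions):

```js
// Sketch: createdRow receives the <tr> node, the row's data, and its index.
// The property names, class, and markup below are assumptions.
function listTableRow(row, data, dataIndex) {
    var obj = data[0];                    // hidden column 0 carries the full object
    $(row).attr('data-id', obj.id);
    if (obj.urgent) {
        $(row).addClass('urgent');
    }
    // e.g. turn a plain-text cell into a link using values from the object
    $('td:eq(0)', row).html(
        '<a href="/detail/' + obj.id + '">' + obj.name + '</a>'
    );
}
```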
The function above renders each row; I need to add data attributes, attach classes, put in icons with links, and so on. Column 0 is hidden and holds the complete data object for the row.
Here is how I stuff the table after loading an array from Ajax:
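Something along these lines, where `GC.data.list` and `loadList` come from the discussion and the field names are assumptions:

```js
// Sketch: copy each item from the Ajax-filled array into the table.
// GC.data.list and loadList are from the discussion; field names are assumed.
function loadList() {
    for (var i = 0; i < GC.data.list.length; i++) {
        var item = GC.data.list[i];
        table.row.add([
            item,             // hidden column 0: the complete object
            item.name,        // assumed fields
            item.status
        ]);
    }
    table.draw();             // draw once, after all rows have been added
}
```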
The Ajax call fills the GC.data.list array, and after the table is initialized I call the loadList function.
I know I am creating an unnecessary step by loading the array and then looping through it to add rows to the table. I want to eliminate that.
If I return 50K rows, it takes two to three minutes for the table to load, and it locks up the browser so badly that the UI is unresponsive and the animated GIFs that indicate loading freeze.
I am reading JSON from the server and converting it to a JavaScript array. 50K rows is about 30 MB of data coming over the wire (based on watching the traffic in Fiddler). I wonder if reducing the field name lengths in the JSON would improve performance. I have 21 field names with a combined length of 359 bytes, so 359 bytes × 50K rows = 17.95 MB of data. If I returned a single character for each field name, that would save 359 − 21 = 338 bytes per row, or 16.9 MB that doesn't need to be transmitted or processed.
I am going to read more on the site about server-side loading, but the maximum number of rows I might ever see is 100K. If I can get loading down to 10 seconds and avoid paging off the server, that would be the best solution. Then I can use sorting and filtering client-side.
I think I am in that sweet spot where it could go either way (pure client-side or server-side). I would prefer a single load, letting the user manipulate the whole table in memory.
Wow - that is a lot of data for each row. Do you have gzip compression enabled on the server? 30M of data just to show a table is a lot. My mobile provider wouldn't be happy :-) (although less of a concern if LAN access only).
You could try using arrays rather than objects to eliminate any parameter name overhead. That is the biggest (only?) disadvantage of using objects.
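For example, the same row in both forms (field names assumed):

```js
// Object form: every row repeats the property names over the wire.
var asObject = { "firstName": "Ada", "lastName": "Lovelace", "status": "active" };

// Array form: positional, so the column order is the contract and the
// per-row field-name overhead disappears entirely.
var asArray = ["Ada", "Lovelace", "active"];
```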
Once the data has been loaded (you can see how long that takes using your browser's network inspector), does the DataTables initialisation happen reasonably quickly? The slowest part of the code above will be the scrolling - that is always a performance drag.
Allan
I reduced the JSON field names to one character each. Based on the average record size when converted to JSON on the server, I went from an average of 609 to 310 bytes per row (in a JSON string complete with quotes and braces); that is nearly a two-fold reduction in data coming over the wire. I believe the max I will ever see is about 85K rows, and I would like to get it all in memory without having to page in from the server.
In the sample code above, I can ditch the object in column zero. I will take the fields I am using to build links and buttons and put that information in hidden columns to flatten out the data structure. Since your 'createdRow' passes the column info in its 'data' parameter, I can use the hidden columns to get the data I was getting from the object in column 0.
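A sketch of that flattened approach (the column indexes and field meanings are assumptions):

```js
// Sketch: with the object gone, createdRow reads plain text from the
// hidden columns instead. Column indexes and field meanings are assumed.
function listTableRow(row, data, dataIndex) {
    var id   = data[0];   // hidden column: record id
    var type = data[1];   // hidden column: record type
    $(row).attr('data-id', id).addClass('type-' + type);
    $('td:eq(0)', row).html(
        '<a href="/detail/' + id + '">' + data[2] + '</a>'
    );
}
```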
So I have flattened out the data structure, but I am still creating an array from an Ajax download and then adding rows to the table from that array - basically looping twice.
Is there a way to combine those steps and have the JSON row objects from Ajax stuffed into the table's data directly?
I would also need to combine data - for example, taking the first and last name values and joining them into a single column before putting them in the table. Would that be a callback function combined with a row add?
I am not going to use the Scroller extension because I have Bootstrap popovers on some elements of the table, and Scroller moves the rows without moving the popover. I will stick with paging and filtering.
It should be possible with the `columns.data` option. You would just tell DataTables what property to read from the data object given for each row. Then you can use `rows.add()` to feed the array of data in.
Editor does that using `columns.render` in many of its examples, although that is post-processing once the data has loaded, not before.
Allan
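Putting those pieces together, a minimal sketch (the URL, field names, and column layout are assumptions):

```js
// Sketch: let DataTables consume the Ajax response directly, map row
// properties with columns.data, and merge names with columns.render.
// The URL and field names are assumptions.
$('#listTable').DataTable({
    ajax: '/api/list',        // expects { "data": [ {...}, {...} ] }
    deferRender: true,
    columns: [
        { data: 'id', visible: false },
        {
            data: null,       // pass the whole row object to render
            render: function (data, type, row) {
                return row.firstName + ' ' + row.lastName;
            }
        },
        { data: 'status' }
    ]
});
```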
I changed my code as described above. For 68K rows (on Chrome), it takes about 2-2.5 minutes to load and become usable; on IE 11 it took about 3.5 minutes on the same computer. The network component (the download) is only 5-6 seconds. This is not acceptable to the end users. If there is a way to get the total time under 30 seconds, that would be best; if not, I think I need to use Ajax to page in the data.
Can I page in 1-2K rows at a time and use the paging mechanism? I can't really do server-side processing and infinite scrolling, but I can send parameters to the server to get back filtered information.
Not really. And almost certainly not if you can't use server-side processing. You can add rows dynamically using the API, but I think that would hurt usability and performance in this case.
Can you give me a link to the page so I can performance profile it and see why on earth it is taking so long?
Thanks,
Allan
I can't link to the page. Tell me how you would run the performance profile and I will run it and share the results.
Update: I can format the data from the server any way I want. I originally needed to get the data from a JSON REST service and use that JSON directly, but not anymore. I can create a page on the server that reads from the database directly and outputs the data in whatever format is needed, to be consumed by DataTables via an Ajax call.
You mentioned using arrays instead of objects. I have eliminated the need for objects in any of my columns. I do have hidden column data that I need to render links (things like ID numbers that aren't displayed in the table), but those are simply text. I can combine data (like people's first and last names) into a single column on the server, so there will be no need for client-side processing of the underlying table data. I still need the createdRow function to change the look of the row (show buttons and links), but I use the 'deferRender' flag, so that should apply only to the visible rows.
I would still like to have everything in memory if possible. I am certain it will never grow beyond 100K rows.
I appreciate the information you have given me so far...
The Chrome developer tools include performance monitoring tools. I would suggest using them. If you haven't used them before, there are loads of articles that you will be able to find on Google for them.
Allan
I have read the Ajax section and I am going to create the following JSON object on the server:
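A sketch of that shape (the values are invented; `column2filter` and `column5filter` carry the pre-sorted filter lists discussed below):

```js
// Sketch of the server response: row data plus pre-sorted filter lists.
// The values are invented for illustration.
var response = {
    "data": [
        ["123", "Ada Lovelace", "active"],
        ["124", "Alan Turing",  "closed"]
    ],
    "column2filter": ["active", "closed", "pending"],
    "column5filter": ["East", "North", "South", "West"]
};
```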
That should be the fastest way to get all the information into the table at once.
I currently have two columns with select drop-down filters, each populated from a sorted list. How can I get at that info if the JSON object is consumed directly by DataTables? I would like to do something like this:
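For instance, something like this (the `buildFilter` name, container element, and exact search call are assumptions):

```js
// Sketch: build a <select> filter for a column from a server-provided,
// pre-sorted list instead of scanning the table data on the client.
// Assumes `table` is the DataTable instance.
function buildFilter(columnIdx, values, container) {
    var select = $('<select><option value=""></option></select>')
        .appendTo(container)
        .on('change', function () {
            table.column(columnIdx).search($(this).val()).draw();
        });
    $.each(values, function (i, value) {
        select.append('<option value="' + value + '">' + value + '</option>');
    });
}
```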
I would prefer to build and sort the filter lists on the server as the rows are being constructed; that way I don't have to scan the table data on the client.
How can I access the column2filter and column5filter arrays to build the select drop down?
Use `ajax.json()` or `initComplete`, which passes the JSON in as a parameter.
Allan
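A sketch of the `initComplete` route, reusing the hypothetical `buildFilter` helper from above:

```js
// Sketch: initComplete receives (settings, json), so the filter arrays
// arrive alongside the row data in the same payload.
var table = $('#listTable').DataTable({
    ajax: '/api/list',
    initComplete: function (settings, json) {
        buildFilter(2, json.column2filter, $('#filter2'));   // hypothetical helper
        buildFilter(5, json.column5filter, $('#filter5'));
    }
});
// Alternatively, table.ajax.json() returns the JSON from the last load.
```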
Thank you very much Allan. I think I have all that I need on this question. I think you have a wonderful plugin. I greatly appreciate the support you have given me on this question.