downloading/rendering large data sets - possible to prevent unresponsiveness and have progress bar?

dessloch · Posts: 5 · Questions: 0 · Answers: 0
edited October 2013 in Feature requests
Probably the only hiccup DataTables has for me is when I fetch a large data set: the browser becomes unresponsive while it loads everything into its array and then renders a portion of it (pagination). After that, it's blazing fast. I usually throw at least 100,000 rows in JSON format at it. That's typically ~30MB sent over the wire, which by itself is actually bearable, because the browser remains responsive during the download. Either way, while I do have a placeholder loading notification before the table renders, its animation stops, since the browser hangs for a while between download completion and rendering completion.

Is it at all possible right now to prevent DataTables from hanging the browser during rendering, and maybe even display progress (X out of Y rows loaded, etc.)? Maybe something akin to threading, or at least leaving enough browser resources free so that users can still type and use other tabs?
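
A common workaround (sketched below with hypothetical helper names - nothing here is a DataTables API) is to hand the parsed rows to the table in small batches and yield to the event loop between batches, which keeps the page responsive and gives a natural hook for a progress bar:

```javascript
// Minimal sketch: feed rows to the table in small batches, yielding to the
// event loop between batches so the page can repaint and stay responsive.
// `addBatch`, `onProgress` and `done` are injected callbacks, so the same
// logic works with any grid API (and is testable without a browser).
function loadInChunks(rows, addBatch, onProgress, done, opts) {
  opts = opts || {};
  var chunkSize = opts.chunkSize || 1000;
  var schedule = opts.schedule || function (fn) { setTimeout(fn, 0); };
  var i = 0;

  function step() {
    var batch = rows.slice(i, i + chunkSize);
    addBatch(batch);              // e.g. add to the table with redraw off
    i += batch.length;
    onProgress(i, rows.length);   // "X out of Y rows loaded"
    if (i < rows.length) {
      schedule(step);             // yield so the UI can repaint
    } else {
      done();                     // e.g. trigger a single final redraw
    }
  }
  step();
}
```

With the legacy API, `addBatch` could call `fnAddData(batch, false)` and `done` a single `fnDraw()`, so the table only redraws once at the end - worth verifying against your DataTables version.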

Replies

  • mmalmeida · Posts: 15 · Questions: 2 · Answers: 0
    I have the exact same issue and question. The browser becomes unresponsive during the DataTables initialisation.
  • allan · Posts: 57,703 · Questions: 1 · Answers: 9,195 · Site admin
    How large is your data set? And how are you getting the data - chunked Ajax? Or a streaming socket perhaps? 30MB in dessloch's post is amazing!

    It should be perfectly possible to use fnAddData in combination with your streamed data source to load massive tables, while keeping the page responsive.

    Allan
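
    For illustration, a minimal sketch of consuming a chunked/streamed response incrementally, assuming the server frames rows as one JSON object per line (NDJSON - an assumption here, not anything DataTables requires). Each call is handed the full responseText accumulated so far, as in an XHR progress event, and returns only the rows that became complete since the last call:

```javascript
// Returns a consumer that tracks how much of the response it has already
// processed. Given the full text received so far, it extracts the newly
// completed lines, parses each one as JSON, and leaves any trailing partial
// line for the next call.
function makeLineConsumer() {
  var consumed = 0;
  return function (textSoFar) {
    var fresh = textSoFar.slice(consumed);
    var lastNewline = fresh.lastIndexOf('\n');
    if (lastNewline === -1) return [];   // no complete line yet
    consumed += lastNewline + 1;
    return fresh.slice(0, lastNewline)
      .split('\n')
      .filter(function (l) { return l.length > 0; })
      .map(function (l) { return JSON.parse(l); });
  };
}
```

    Wired up, `xhr.onprogress` would pass each returned row to `fnAddData(row, false)` and a single `fnDraw()` would run on load end.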
  • mmalmeida · Posts: 15 · Questions: 2 · Answers: 0
    Thanks for the quick reply allan.

    The dataset is about 200-300 rows. The added complexity is that each td may contain one of various input types (checkboxes, text inputs, radio buttons), has a "+" icon and some jQuery-added formatting (background colour, an on('click') handler for the + button, etc.).
    The data is completely loaded on the backend - so the HTML is already in the page.
    The goal was to display the "loading" icon until the table was completely rendered. However, the loading animation stops and the browser becomes unresponsive. Debugging the javascript shows that this happens during the DataTables initialisation block.
  • dessloch · Posts: 5 · Questions: 0 · Answers: 0
    edited October 2013
    Thanks for replying, Allan. Currently, the data set is around 100,000 rows and ~40MB, ~30 columns each, with a lot of the data being floating point numbers or long strings. I've built my web app mostly for LAN use, but it does just fine downloading across the net too. I have an HTML5 progress bar and callbacks for that. All I'm doing right now is parsing the entire JSON once it's received and then setting aaData to it.

    That's a very interesting suggestion though, since I could easily extend my progress xhr callback to accommodate piece-by-piece parsing of the entire chunk. But I am slightly concerned about performance, as a slight pause of 2-5 seconds loading everything at once is probably better than 100,000 calls to fnAddData.

    Is using fnAddData X times after initialization comparable to setting aaData before initialization? I'm also concerned because I've noticed that when I call fnAddData when there are already a ton of rows in the table, there's a noticeable 1-2 second delay PER CALL. Now, that might go away if I set the re-render flag to false (2nd param, IIRC?). But what I mean is, if I click the button 5 times really fast, it will take 5-10 seconds for all 5 to be added, and they will be added all at once. This leads me to believe loading the entire chunk at once into aaData pre-init is probably faster than individual fnAddData calls. Do you have any additional insight/benchmarks for this, before I set aside time to find out the hard way? :)
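
    For intuition on the redraw question, here is a toy cost model (hypothetical unit costs, not a DataTables benchmark): adding a row is constant work, but a redraw touches every row already in the table, so redrawing after every fnAddData is quadratic, while suppressing the redraw (second argument false) and drawing once at the end stays linear:

```javascript
// Toy cost model: count abstract "units of work" for loading rowCount rows,
// either redrawing after every add (the fnAddData default) or adding with
// redraw suppressed and doing one final draw.
function simulateLoadCost(rowCount, redrawEveryAdd) {
  var work = 0;
  var rowsInTable = 0;
  for (var i = 0; i < rowCount; i++) {
    rowsInTable += 1;
    work += 1;                                // the add itself: constant cost
    if (redrawEveryAdd) work += rowsInTable;  // redraw touches every row so far
  }
  if (!redrawEveryAdd) work += rowsInTable;   // one final draw at the end
  return work;
}
```

    With 10,000 rows, the redraw-per-add variant does roughly 2,500× the work of the draw-once variant in this model, which would match the per-call delays described above.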

    Currently, the memory usage is actually not bad at all. It hovers around 200-300MB for Chrome and 400-500MB for FF, which is more than acceptable for my purposes. Your little plugin is the best heavy-duty-oriented JS grid I've seen so far! The app I'm writing is not really meant to be a white label (might turn into one in the future), but the responsive UX is nice to have as it's intended to be used by non-tech people.

    EDIT: Modern browsers are slight mem hogs - FF takes up 150-200MB by itself, Chrome around 70-100MB, both sitting at google.com. So actually it's more like 150-200MB total for just the page that loads/renders that dataset for both browsers. Same for Safari, too. IE obviously takes up more RAM (cause IE fails at everything), but it loads at about the same speed and is also blazing-fast-responsive once it's all in memory.
  • janiscmbp · Posts: 6 · Questions: 0 · Answers: 0
    edited October 2013
    Hi, just a quick note on large data sets. I currently have about 140 columns and at most 118,000 rows in a data set. It was not possible at all for me to let javascript render this monster. The user does not need to see more than, let's say, 10-20 columns at one time, and certainly only a limited number of rows. DataTables only needs to be aware of the visible data.

    I have made a server-side two-phase cache: the first phase for the large data set, the second for the part of the table that is visible. The worst case - when parameters are changed, the cache is destroyed, and about 100 columns times 118K rows of calculation is needed (and cached) - is about 20 seconds. Very acceptable. Sorting takes about 5 seconds the first time and a fraction of a second after that. Normal page load time, when data comes from the cache, is about 0.2-0.3 seconds.
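
    A sketch of that two-phase scheme (names and shape are purely illustrative, not taken from any framework): phase one caches the expensive full result per parameter set, phase two caches each visible slice, and a parameter change invalidates both:

```javascript
// Hypothetical two-phase cache: computeFullSet(paramsKey) stands in for the
// expensive ~20 s recalculation; getPage serves the visible window, hitting
// the slice cache first and the full-set cache second.
function makeTwoPhaseCache(computeFullSet) {
  var fullCache = {};   // phase 1: parameter key -> full result
  var sliceCache = {};  // phase 2: view key -> visible page
  return {
    getPage: function (paramsKey, offset, length) {
      var viewKey = paramsKey + ':' + offset + ':' + length;
      if (!(viewKey in sliceCache)) {
        if (!(paramsKey in fullCache)) {
          fullCache[paramsKey] = computeFullSet(paramsKey); // the slow step
        }
        sliceCache[viewKey] = fullCache[paramsKey].slice(offset, offset + length);
      }
      return sliceCache[viewKey];
    },
    invalidate: function () { fullCache = {}; sliceCache = {}; }
  };
}
```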
  • mmalmeida · Posts: 15 · Questions: 2 · Answers: 0
    @allan,

    I attach an example of the issue here: https://www.dropbox.com/s/s9iu90s5s0e3t94/DataTablesTest.zip

    This table is 221 rows by 24 columns. The problem happens when each row has more content than just a simple text label.

    In this case the HTML is already completely built (by the backend server) - the unresponsiveness happens only when the DataTables element is being initialised.

    Do you have any suggestion on how to prevent this unresponsiveness?
  • allan · Posts: 57,703 · Questions: 1 · Answers: 9,195 · Site admin
    edited October 2013
    Thanks for the test case. I'll try and make some time this week to take a look. 221 rows really shouldn't be an issue - but the amount of data in each probably is, since DataTables uses innerHTML to read the data. 8.5M is a darn large HTML page...

    @dessloch and @janiscmbp - let me consume and reply to your posts separately :-) Rushing a bit atm...

    Allan
  • mmalmeida · Posts: 15 · Questions: 2 · Answers: 0
    [quote]allan said: but the amount of data in each probably is since DataTables uses an innerHTML to read the data. 8.5M is a darn large HTML page...[/quote]

    Yes, indeed! And in this scenario (where each cell can have different elements and styles), outputting something else (JSON) and relying on javascript to render it correctly would probably be worse than it already is.

    Let me know what I can do to help figure this out! I've been scratching my head over this for a while!
  • dessloch · Posts: 5 · Questions: 0 · Answers: 0
    edited October 2013
    [quote]mmalmeida said: Yes, indeed! And in this scenario (where each cell can have different elements and styles), outputting something else (JSON) and relying on javascript to render it correctly would probably be worse than it already is.

    Let me know what I can do to help figure this out! I've been scratching my head over this for a while![/quote]

    I take it what you're doing is probably something like an ad-serving front-end for (potentially really dumb) customers. :)

    Yeah, there isn't really anything you can do if you're trying to keep a lot of stuff in the browser DOM all at once. The reason I can go up to 2000+ rows per page in pagination on my table is that most of the data is plain text/numbers - no fancy elements/images/etc. I do use inputs to edit/save back to the DB, but I use them sparingly, removing each one once the user is done editing a cell.

    The DOM tree in a browser is the biggest bottleneck/limitation to everything, as it's very inefficient at the more intensive operations (animation, moving things around, rendering of images/colors/shapes, etc, etc.). Web sites were originally intended to just be a way to share simple text information, and it'll be a while before they start becoming more like 3D engines such as Unreal/ID-Tech.
  • mmalmeida · Posts: 15 · Questions: 2 · Answers: 0
    Hi Allan!

    Did you have a chance to take a look at this?
  • allan · Posts: 57,703 · Questions: 1 · Answers: 9,195 · Site admin
    Really sorry - it's on my to-do list to look at it in detail. I've not forgotten! :-)
This discussion has been closed.