row.add or rows.add for loop

Spim Posts: 4 · Questions: 0 · Answers: 0

Hi,

I'm trying to generate a table from a JSON file and I have a problem.
On my page I generate a JSON file after multiple user actions and save it on my server. The file is created via Ajax, and once it's complete I try to read the JSON and initialise the DataTable.

The file can contain 1,000 or 10,000 rows, and I want to generate one row and draw it before starting the next row in the loop.
At the moment I can draw the rows, but I can't see the data in the cells.

I don't understand my mistake and I need help because I'm lost :).

You can see my source code below:

    $.ajax({
        method: "POST",
        url: url,
        data: avDatas,
        success:function(oReturn){  

            var avReturn = jQuery.parseJSON(oReturn);   

            if( avReturn.ok )       
                oTable = $("#result_table").DataTable({});                  
        },
        complete:function(){        
            $.getJSON( url ,function(oDatas){
                $.each(oDatas, function( nKey, oData ){
                    oTable.row.add([{
                        0 : oDatas[nKey]["data_1"],
                        1 : oDatas[nKey]["data_2"],
                        2 : oDatas[nKey]["data_3"],
                        3 : oDatas[nKey]["data_4"],
                        4 : oDatas[nKey]["data_5"],
                        5 : oDatas[nKey]["data_6"],
                        6 : oDatas[nKey]["data_7"],
                        7 : oDatas[nKey]["data_8"],
                        8 : oDatas[nKey]["data_9"],
                        9 : oDatas[nKey]["data_10"],
                    }]).draw(false);    
                });
            });
        }
    });     

Replies

  • allan Posts: 63,332 · Questions: 1 · Answers: 10,436 · Site admin

    Using draw() in a loop is exceptionally bad for performance. It causes a full redraw (re-sort, re-filter, etc.) every time around the loop, which as you say might be 10k times. That's going to kill performance.

    If you need to, break it into chunks of 1,000 rows at a time and use a setTimeout to break it up a bit, but normally I would suggest you just enable deferRender and add the entire data set in a single go.
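    The chunked approach above can be sketched roughly as follows (a minimal sketch, assuming a DataTables instance `oTable` and an array of row data like the one in the original post; `chunkRows` and `addInChunks` are hypothetical helper names, not DataTables API):

    ```javascript
    // Pure helper: split the full data set into chunks of `size` rows.
    function chunkRows(rows, size) {
        var chunks = [];
        for (var i = 0; i < rows.length; i += size) {
            chunks.push(rows.slice(i, i + size));
        }
        return chunks;
    }

    // Add one chunk per timer tick and draw only once at the end,
    // instead of calling draw() for every single row.
    function addInChunks(oTable, rows, size) {
        var chunks = chunkRows(rows, size);
        function addNext() {
            var chunk = chunks.shift();
            if (!chunk) {
                oTable.draw();          // single draw after the last chunk
                return;
            }
            oTable.rows.add(chunk);     // rows.add() accepts an array of rows
            setTimeout(addNext, 0);     // yield to the browser between chunks
        }
        addNext();
    }
    ```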

    Allan

  • Spim Posts: 4 · Questions: 0 · Answers: 0
    edited March 2018

    Thanks for your response, Allan!

    I posted because I wasn't sure I was using the best way to do it.

    I'll try this soon and report back whether it solves my problem.

  • Spim Posts: 4 · Questions: 0 · Answers: 0
    edited March 2018

    Hi Allan

    I changed my code like this:

        oTable = $("#result_table").DataTable({
            "processing": true,
            "serverSide": true,
            "deferRender": true,
            "ajax": {
                "url": "file.json",
                "type": "POST",
                "dataSrc":""
            },
            "columns": [
                { "data": "name" },
                { "data": "slot" },
                { "data": "pon" },
                { "data": "ont" },
                { "data": "sro_nom" },
                { "data": "idur" },
                { "data": "ndi_ft" },
                { "data": "ip_iad" },
                { "data": "prise_ftth" },
                { "data": "etat" }      
            ]
        });
    

    The DataTable loads the data pretty fast, but the functions (search, resize, pagination, ...) don't work.

    I don't get any errors during or after the generation of the DataTable.

    Do you have a solution ?

    Thx

  • tangerine Posts: 3,365 · Questions: 39 · Answers: 395

    You have specified "serverSide": true, which means that searching and pagination must be handled by your server-side script. I suspect that you don't have a server-side script.
    See the docs example:
    https://datatables.net/examples/data_sources/server_side.html
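    As a sketch of the fix (based only on the configuration posted above, which loads a static file.json): removing "serverSide" lets DataTables handle searching, sorting and paging itself on the client, so no server-side script is needed, while "deferRender" still keeps the initial load fast.

    ```javascript
    // Client-side processing: no "serverSide", no "type": "POST" needed
    // for fetching a static JSON file.
    oTable = $("#result_table").DataTable({
        "deferRender": true,
        "ajax": {
            "url": "file.json",
            "dataSrc": ""          // the JSON file is a plain array of objects
        },
        "columns": [
            { "data": "name" },
            { "data": "slot" },
            { "data": "pon" },
            { "data": "ont" },
            { "data": "sro_nom" },
            { "data": "idur" },
            { "data": "ndi_ft" },
            { "data": "ip_iad" },
            { "data": "prise_ftth" },
            { "data": "etat" }
        ]
    });
    ```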

  • Spim Posts: 4 · Questions: 0 · Answers: 0

    Hi tangerine,

    You're right!
    I was just looking at that point, and that's where my problem came from.

    Thanks for your response.

This discussion has been closed.