Curious what a good load time is for large data set

zdwyer · Posts: 1 · Questions: 1 · Answers: 0

Hello everyone,
So my database has grown to a relatively substantial size (around 300,000 records), and I have shifted to an AJAX call in my DataTables declaration; it now takes maybe 20 seconds to render in the browser. I really have no frame of reference for performance with a dataset this large, so I'm curious whether that is good or bad, and whether anyone has any recommendations to improve it. JS is below; returnAjaxFunction.php pulls all records and returns a JSON-encoded array.

$('.tablePluginTest').DataTable(
            {
                'ajax':{
                    'url':'bin/returnAjaxFunction.php',
                    'dataSrc':''
                },
                'deferRender':true,
                'columns':[
                    {'data':'insert_child_barID'},
                    {'data':'child_nsn_number'},
                    {'data':'child_lot_number'},
                    {'data':'child_item_name'},
                    {'data':'backer'},
                    {'data':'qualified'},
                    {'data':'child_dom'},
                    {'data':'child_size'},
                    {'data':'child_model'},
                    {'data':'child_status'},
                    {'data':'acronym'},
                    {'data':'conBoxBarID'},
                    {'data':'parent_contract_number'},
                    {'data':'design_code'},
                    {'data':'insert_childID'},
                ],
                columnDefs: [
                    {
                        targets:14,
                        render: function ( data, type, row, meta ) {
                            if(type === 'display'){
                                data = '<a href="/users/insert-info?childID=' + encodeURIComponent(data) + '" class="btn btn-primary">View</a>';
                            }

                            return data;
                        }
                    }
                ],
                dom: 'Blfrtip',
                buttons: [{
                    extend: 'csvHtml5',
                    text: 'Download CSV'
                }
                ],
                "lengthMenu": [[25, 50, 100, -1], [25, 50, 100, "All"]],
                "pageLength": 25,
                "bStateSave": true
            });

Any feedback is hugely appreciated. Cheers.

Answers

  • colin · Posts: 15,142 · Questions: 1 · Answers: 2,586

    This section of the FAQ should help; it discusses various techniques for improving performance.

    Cheers,

    Colin

  • allan · Posts: 61,623 · Questions: 1 · Answers: 10,090 · Site admin

    It also depends on stuff like:

    • How large is your JSON file? It sounds like it is going to be multiple megabytes in size (make sure your server has gzip enabled).
    • How fast are the server uplink and client downlink? A LAN isn't going to have an issue with such a file, but over the internet it will, particularly on mobile networks.
    • How fast is the client processor? With a file of that size the JSON parsing time will become noticeable.

    I'd say the network speed is almost certainly going to be your biggest issue. The browser's network inspector will tell you how large the file is and how long it takes to download it.
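
    As a rough sanity check alongside the network inspector, a sketch like this (assuming the same bin/returnAjaxFunction.php endpoint as in the question) can be pasted into the browser console to see how large the decoded JSON is and how long it takes to arrive:

        // Rough measurement only: body.length counts characters, which for ASCII JSON
        // is close enough to bytes. The network inspector is still the place to confirm
        // whether the response is actually gzip-compressed on the wire.
        var t0 = performance.now();
        fetch('bin/returnAjaxFunction.php')
            .then(function (resp) { return resp.text(); })
            .then(function (body) {
                var seconds = ((performance.now() - t0) / 1000).toFixed(1);
                var megabytes = (body.length / (1024 * 1024)).toFixed(1);
                console.log('Uncompressed JSON: ~' + megabytes + ' MB, fetched in ' + seconds + ' s');
            });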

    Personally I'd use server-side processing for that many rows.
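
    A minimal sketch of what that switch might look like for the initialisation above, assuming a hypothetical bin/serverProcessing.php that implements the DataTables server-side protocol (it receives the draw/start/length/order/search parameters and returns draw, recordsTotal, recordsFiltered and data):

        // Sketch only: with serverSide enabled, DataTables requests one page of rows
        // at a time instead of downloading all 300,000 records up front.
        $('.tablePluginTest').DataTable({
            serverSide: true,
            processing: true,                   // show a "Processing..." indicator between requests
            ajax: 'bin/serverProcessing.php',   // hypothetical script implementing the server-side protocol
            columns: [
                { data: 'insert_child_barID' },
                // ... the remaining columns exactly as in the original initialisation
                { data: 'insert_childID' }
            ],
            pageLength: 25
        });

    One trade-off worth noting: the csvHtml5 button in the original configuration only exports rows held by the client, so with server-side processing the CSV download would contain a single page rather than the full data set.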

    Allan
