Maximum rows without server-side processing

thowi Posts: 67 Questions: 7 Answers: 0
edited January 2014 in General
Hi all,

I just exceeded the limits of what my server / DataTables (?) is capable of.
With roughly 26,000 rows, DataTables works well. Above this limit, the script no longer works.

Of course I'd like to get past this limit somehow. In one of the closed topics here I read that someone used DataTables with 10 million rows. That's a limit I'd like :)

Currently, I fetch the data with a MySQL SELECT statement, store it in a numerical array, and build the table from that array at the end.

I think this may be a problem of server RAM, because sometimes I get an error like this: Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 4495885 bytes). That's the 128 MB PHP memory_limit.
With around 26,000 rows in my database I get this message, and the DataTable with its content still works, but the rest of the site crashes. When I add some more rows, only the jQueryUI tabs work, without any content (the DataTable sits in one of several jQueryUI tabs).

Do you have an idea what I can do here to solve this issue?

Regards,
thowi


[code]<?php
$content = $db->get_results("SELECT * FROM table", ARRAY_N); // Returns a numerical array with the contents of the SELECT statement
?>

var viewdbtabs3 = $('#tabs-3 table.display').dataTable({
    "aaSorting": [ [0,'asc'], [8,'desc'] ],
    "sScrollY": "300px",
    "sScrollX": "100%",
    "bScrollCollapse": true,
    "oLanguage": {
        "sSearch": "Global search:"
    },
    "bJQueryUI": true,
    "sPaginationType": "full_numbers",
    "sDom": '<"H"l<"clear">Tfr>t<"F"ip>',
    //"oSearch": {"bSmart": false},
    "oTableTools": {
        "sSwfPath": "modules/rfid/tools/jQueryUI/datatable/media/swf/copy_csv_xls_pdf.swf",
        "aButtons": [
            "copy", "csv", "xls", "pdf"/*,{
                "sExtends": "collection",
                "sButtonText": "Save",
                "aButtons": [ "csv", "xls", "pdf" ]
            }*/
        ]
    }
});


<table class="display">
    <thead>
        <tr>
            <th>Col1</th>
            <th>Col2</th>
            <th>Col3</th>
            <th>Col4</th>
            <th>Col5</th>
            <th>Col6</th>
            <th>Col7</th>
            <th>Col8</th>
            <th>Col9</th>
        </tr>
    </thead>
    <tbody>
<?php
// Print one <tr> per result row; flag empty cells so missing data is visible
for ($i = 0; $i < count($content); $i++) {
    echo '<tr>';
    for ($j = 0; $j < count($content[0]); $j++) {
        if ($content[$i][$j] != '') {
            echo '<td>' . $content[$i][$j] . '</td>';
        }
        else {
            echo '<td>Update DB!</td>';
        }
    }
    echo '</tr>';
}
?>
    </tbody>
</table>
[/code]

Replies

  • allan Posts: 63,161 Questions: 1 Answers: 10,406 Site admin
    Sounds like a PHP limit error rather than a DataTables one. Perhaps try increasing the amount of memory that your PHP process can run in? Although it's already fairly high.
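
    For example, something like this at the top of the script might buy some headroom (just a sketch; whether ini_set() works at runtime depends on your hosting setup):

    [code]<?php
    // Ask PHP for 256 MB instead of the 128 MB shown in the error message.
    // Hosts that lock memory_limit in php.ini will ignore this call.
    ini_set('memory_limit', '256M');
    ?>[/code]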

    The other thing to do would be to optimise your script: at the moment it is dumping the entire SQL result into an array, which I wouldn't think is really necessary. Just loop over the result object, getting a row at a time, which will be far more memory efficient. How exactly you do that will depend on your PHP scripts.
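
    As a rough sketch with plain mysqli (your ezSQL wrapper will have its own equivalent; the connection details below are hypothetical), the idea is to print each row as soon as it arrives instead of collecting everything first:

    [code]<?php
    // Hypothetical connection details - substitute your own.
    $mysqli = new mysqli('localhost', 'user', 'pass', 'dbname');
    $result = $mysqli->query('SELECT * FROM table');

    // Only one row is held in a PHP variable at a time.
    while ($row = $result->fetch_row()) {
        echo '<tr>';
        foreach ($row as $cell) {
            echo '<td>' . ($cell !== '' ? $cell : 'Update DB!') . '</td>';
        }
        echo '</tr>';
    }
    $result->free();
    ?>[/code]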

    > I read that someone used DataTables with 10 million rows. That's a limit I'd like :)

    I've seen it used with 20 million rows, but that was with server-side processing!
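
    With the legacy 1.9 API the client side of that looks roughly like this (a sketch - "server_processing.php" is a placeholder name for whatever server script you write):

    [code]$('#tabs-3 table.display').dataTable({
        // The server does the paging, sorting and filtering; only the
        // visible page of rows is ever sent to the browser.
        "bProcessing": true,
        "bServerSide": true,
        "sAjaxSource": "server_processing.php"  // hypothetical endpoint
    });[/code]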

    Allan
  • thowi Posts: 67 Questions: 7 Answers: 0
    Hi Allan

    Thanks for your thoughts!
    [quote]Just loop over the result object, getting a row at a time, which will be far more memory efficient.[/quote]

    So you mean that instead of doing one single SQL query fetching all the data, it would be better to split it into several smaller ones? Okay, sounds quite simple - I'll give that a try!

    Also the option for server-side processing is quite interesting. I will also give this example a try: http://datatables.net/release-datatables/examples/server_side/object_data.html

    Regards,
    thowi
  • allan Posts: 63,161 Questions: 1 Answers: 10,406 Site admin
    > would be better to split this query into more ones?

    Not quite. If you are using PDO, it looks like you are using fetchAll(), whereas you could try fetch() instead - http://php.net/manual/en/pdostatement.fetch.php .
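
    Something along these lines (a sketch, assuming a PDO connection in $pdo):

    [code]<?php
    // fetch() hands back one row per call, instead of materialising the
    // whole result set in memory the way fetchAll() does.
    $stmt = $pdo->query('SELECT * FROM table');
    while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
        // Emit the row straight away, then let it go out of scope.
    }
    ?>[/code]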

    Allan
  • thowi Posts: 67 Questions: 7 Answers: 0
    No, I'm not using PDO - I'm using the ezSQL class. There seems to be a similar method, if I understand you right: get_row() fetches only one row at a time, or previously cached results. I'll do some tests with that and report back if it works better ;D
This discussion has been closed.