iDisplayLength not limiting records.

kmorris Posts: 0 Questions: 0 Answers: 0
edited September 2012 in TableTools
I'm using DataTables 1.9.0 with TableTools 2.0.0, making an Ajax call to retrieve records from my database.

With iDisplayLength set to 25, it still displays the entire result set of 5,000+ records.

Is there a known issue with iDisplayLength when using DataTables/TableTools with Ajax calls?

oTable = $("#example").dataTable({
"sDom": \'T<"clear">lfrtip\',
"oTableTools": {
"sRowSelect": "multi",
"aButtons": [ "select_all", "select_none" ]
},
"oLanguage": {
"sEmptyTable": "There are no records that meet the criterial"
},
"bProcessing": true,
"bServerSide": true,
"bJQueryUI": true,
"sAjaxSource": "/mypath/ajax/get_myrecords.php",
"fnServerData": function ( sSource, aoData, fnCallback ) {
/* Add some extra data to the sender */
aoData.push( { "name": "area", "value": "1234" } );
aoData.push( { "name": "emailtype", "value": $("#emailtype").val() } );
$.getJSON( sSource, aoData, function (json) {
/* Do whatever additional processing you want on the callback, then tell DataTables */
fnCallback(json)
} );
},
"iDisplayLength": 25,
"sPaginationType": "full_numbers",
"aaSorting": [[0, "desc"]],
"bStateSave": true
});

Thanks in advance!

Kevin

Replies

  • kmorris Posts: 0 Questions: 0 Answers: 0
    Anybody have a clue what might cause this?
  • agilitygap Posts: 5 Questions: 0 Answers: 0
    Thanks for this fine plug-in, Allan! Kevin, I've noticed similar behavior with IgnitedDatatables-native-php-version, i.e. the native Ignited Datatables library (independent of CodeIgniter): server-side processing seems to ignore the iDisplayLength property. As a temporary workaround I override the $iLength variable in the protected get_paging function of the core Datatables.php library file, setting it to an upper cap on the number of returned records; the client-side display then works as long as the result set is less than or equal to that cap.

    In my setup, this adjustment is the 3rd statement in the protected get_paging function, at approximately line 209 of the Datatables.php library file (the native-php-version of the server-side script). Essentially I am doing pseudo-pipelining: grabbing about 200 records at a time and letting the datatables.js features handle sorting and pagination on the client side. A different query requires another XHR call, which again produces a 200-record subset that is then processed in smaller chunks on the client. It is a bit of a hack, but it works well as a trade-off between server and client processing. Possibly n1crack or Allan can figure out why iDisplayLength is not being recognized by the server-side script, as expected, to limit the maximum number of records retrieved from the MySQL database when used in this fashion. I'm still tweaking and will let you know if I figure it out. Cheers and thanks to the production team!
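
    One quick way to see where the limit is being lost is to log what DataTables actually sends and what the server returns; a minimal sketch, reusing the fnServerData option from Kevin's post above (the logging lines are illustrative additions, not part of the original code):

    "fnServerData": function ( sSource, aoData, fnCallback ) {
        /* aoData is an array of { name, value } pairs; with bServerSide enabled
           it should contain iDisplayStart and iDisplayLength */
        var aLength = $.grep( aoData, function ( o ) { return o.name === "iDisplayLength"; } );
        console.log( "Requested page length:", aLength.length ? aLength[0].value : "(not sent)" );

        $.getJSON( sSource, aoData, function ( json ) {
            /* With server-side processing the server should return at most
               iDisplayLength rows; anything more means the script is ignoring it */
            console.log( "Rows returned by server:", json.aaData.length );
            fnCallback( json );
        } );
    },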
  • allan Posts: 63,498 Questions: 1 Answers: 10,470 Site admin
    It all boils down to how the server-side handles the request. To fully support server-side processing in DataTables the server must respect iDisplayLength (otherwise you might get millions of rows back!). Could either of you share a link to a page showing the issue? Just to confirm that it is something on the server-side!

    Allan
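
    For reference, the exchange Allan is describing looks roughly like this in DataTables 1.9 terms (a sketch of the legacy server-side protocol; the numbers are made up):

    /* Among the parameters DataTables sends with each request:
     *   iDisplayStart  - index of the first record to return (paging offset)
     *   iDisplayLength - number of records to return (the page length)
     *   sEcho          - draw counter, to be echoed back unaltered
     * The server is expected to apply these to its query (e.g. a MySQL
     * LIMIT clause) and respond with JSON shaped like: */
    {
        "sEcho": "3",                   /* same value the client sent */
        "iTotalRecords": 5000,          /* rows in the table before filtering */
        "iTotalDisplayRecords": 5000,   /* rows remaining after filtering */
        "aaData": [ /* at most iDisplayLength rows */ ]
    }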
  • agilitygap Posts: 5 Questions: 0 Answers: 0
    Hi Allan, here is my current and evolving test page:

    http://flightphysical.com/aaplain.php

    We have currently solved this with Pedro Alves' fnLengthChange plug-in, which you host here: http://datatables.net/plug-ins/api , and that function is working great with yours. Thanks again for this sweet code. I have donated in the past and will continue to do so (or hire out some work) as our project evolves. Much gratitude!

    -J.O.
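
    For anyone else landing here, usage of that plug-in is along these lines (a sketch, assuming the fnLengthChange plug-in file from the API plug-ins page above is included after DataTables; the 100 is just an example value):

    var oTable = $("#example").dataTable();

    /* Change the page length of the already-initialised table and redraw */
    oTable.fnLengthChange( 100 );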
  • allan Posts: 63,498 Questions: 1 Answers: 10,470 Site admin
    > 'iDisplayLength': '100'

    That should be an integer, not a string; that is the first thing that stands out to me. The second is that there appear to be only 25 rows in the database, and all of them are loaded. Thirdly, you aren't using server-side processing here (nor, I think, do you need to with so few rows; you should only need server-side processing with 50,000+ rows).

    Allan
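
    In other words, the option should be given as a number (an illustrative snippet, not taken from the page in question):

    $("#example").dataTable( {
        "iDisplayLength": 100   /* an integer, not the string "100" */
    } );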
  • agilitygap Posts: 5 Questions: 0 Answers: 0
    Thanks for looking. There are only 3,600 rows in our db, and I am on the fence about server-side processing for other reasons; I have been toggling bServerSide true/false and comparing performance (the pseudo-pipelining mentioned above).

    We tried iDisplayLength alternately as a string and as an integer (it is back to an integer now), and that, along with fnLengthChange, seems to have resolved the unexpected results; I'm not sure if this helps the OP's issue. We can confirm that we did notice the first extracted subsets being capped initially, prior to the fnLengthChange plug-in (i.e. the iDisplayLength parameter being ignored), but the combination of the fnLengthChange plug-in and switching the type back to int seems to have resolved the issue.

    If I can reproduce the original issue more accurately, I will try to be more helpful and nail it down; meanwhile, thanks for the reply and advice.
  • allan Posts: 63,498 Questions: 1 Answers: 10,470 Site admin
    edited November 2012
    If you can use Ajax loading with deferred rendering ( http://datatables.net/release-datatables/examples/ajax/defer_render.html ) I'd really recommend that. Speed advantages and no need to code the server-side script :-)

    Allan
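
    The configuration for that approach looks roughly like this (a sketch based on the linked defer_render example, reusing Kevin's Ajax source; with bServerSide dropped, the source would need to return the full data set as aaData):

    $("#example").dataTable( {
        "bProcessing": true,
        "bDeferRender": true,   /* create row nodes only when they are actually drawn */
        "sAjaxSource": "/mypath/ajax/get_myrecords.php",
        "iDisplayLength": 25,
        "sPaginationType": "full_numbers"
    } );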
  • agilitygap Posts: 5 Questions: 0 Answers: 0
    Will do, thanks. -J.O.
  • flarpy Posts: 47 Questions: 0 Answers: 0
    FYI, my first attempt used server-side processing because the initial data load took too long (up to 30 seconds), as the server has to collate and calculate a lot of data. However, since the data and calculations don't change very often, I cached server-side using Redis (a key-value store like memcached, but with on-disk persistence), and now I have tested with up to 5,000 records and it is extremely fast once the cache is primed.