the problem with filtering large volumes of data (server-side)

freemnw Posts: 5 Questions: 0 Answers: 0
edited March 2011 in General
Hello!
I use DataTables with server-side processing to generate large reports from a database.
To send the selection criteria (date, etc.) to the server I use the "fnServerData" function.
This is my code:
[code]
"fnServerData": function ( sSource, aoData, fnCallback ) {
aoData.push( { "name": "date", "value": something } );
$.getJSON( sSource, aoData, function (json) {
fnCallback(json)
});
},
[/code]
When I start filtering the table (with "bFilter" enabled), a new request is sent to the server for every character typed into the filter field. Here I ran into a problem: because of the large volume of data, the database query for a single character (the first one typed) runs slower on the server than the query for two, three or more characters. So after I type the full filter phrase I briefly get the correct result, but then the response to the first Ajax request arrives and overwrites it with data filtered only on the first character.
It ends up like this: the filter field contains "123", but the data in the table is actually filtered by "1".
The problem can be solved by sending the request synchronously ($.ajax({ "async": false })), but that hangs the browser and blocks all other actions, which is not acceptable for me.
Setting a delay on the filter field does not satisfy me either.
Please tell me, am I doing something wrong?
How can I abort the previous Ajax request once a new one has been initiated?
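
For illustration, a minimal sketch of the abort idea described above, assuming the same fnServerData / $.getJSON setup as in the snippet earlier; "#example" and "my_source.php" are placeholders, and "something" again stands for the date value being sent:
[code]
// Minimal sketch: keep a handle on the outstanding request so it can be
// cancelled when a newer request supersedes it.
var oXhr = null;

$('#example').dataTable( {
    "bServerSide": true,
    "sAjaxSource": "my_source.php",   // placeholder for the real source
    "fnServerData": function ( sSource, aoData, fnCallback ) {
        aoData.push( { "name": "date", "value": something } );

        // abort the previous request, if any, so a slow earlier
        // response can never overwrite a newer one
        if ( oXhr !== null ) {
            oXhr.abort();
        }

        oXhr = $.getJSON( sSource, aoData, function ( json ) {
            oXhr = null;              // finished - nothing left to abort
            fnCallback( json );
        } );
    }
} );
[/code]
An aborted request never fires its success callback, so only the response to the most recent request ever reaches fnCallback.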

Replies

  • Gerardo Posts: 66 Questions: 0 Answers: 0
    Bind the filter to focusout instead of keyup or whatever.

    That way you can change the value of the filter without making a trip to the server on every keystroke.
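
    A minimal sketch of that suggestion, assuming the default DataTables filter markup (the ".dataTables_filter input" box), the legacy fnFilter API, and a placeholder "#example" table; the exact event DataTables binds internally may vary by version:
    [code]
    // initialise the table with server-side processing, as in the question
    var oTable = $('#example').dataTable( {
        "bServerSide": true,
        "sAjaxSource": "my_source.php"    // placeholder for the real source
    } );

    // replace the default per-keystroke filtering with a single request
    // when the filter box loses focus
    $('.dataTables_filter input')
        .unbind('keyup')                  // stop filtering on every keystroke
        .bind('focusout', function () {
            oTable.fnFilter( this.value );   // one request, full phrase
        } );
    [/code]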
This discussion has been closed.