Export to xls isn't working with some custom work in fnClick

mpaterson
edited September 2011 in TableTools
I'm using infinite scrolling with datatables and have the requirement to export all records, not just the records currently loaded. As such I'm doing the following in fnClick:
[code]
"oTableTools": {
    "aButtons": [
        {
            "sExtends": "xls",
            "fnClick": function (nButton, oConfig, flash) {
                var oSettings = customersTable.fnSettings();
                var iInitialDisplayLength = oSettings._iDisplayLength;

                // Force a full load before exporting
                oSettings._iDisplayLength = 1000;
                customersTable.fnDraw();

                var data = this.fnGetTableData(oConfig);
                this.fnSetText(flash, data);

                // Restore the original page length
                oSettings._iDisplayLength = iInitialDisplayLength;
            }
        }
    ]
}
[/code]

When I do this with async: true then, as expected, the export occurs before all of the data has been returned from the server, but the export itself works OK. However, when I change to async: false to ensure that all of my data is loaded first, the export never takes place.

Any ideas as to what I'm doing wrong?

Replies

  • allan (Site admin)
    Hi mpaterson,

    There are actually a couple of answers to your question, and which one is best will depend a little on how you are using the table. So in that vein - are you using server-side processing and infinite scrolling as a performance boost? And I presume from the above that your table has fewer than 1000 rows? The problem with what you have at the moment is that any benefit you get from server-side processing is lost when you activate the above function (since it loads the full data set anyway).

    My suggestion would be to switch from infinite scrolling and server-side processing to using Scroller for DataTables with client-side processing (Scroller implements virtual scrolling, so you get all the benefits of infinite scrolling plus accurate scrollbars): http://datatables.net/extras/scroller/ . Make sure you enable deferred rendering ( http://datatables.net/ref#bDeferRender ) to get as much speed as possible. That way the default XLS export in TableTools will "simply work" :-).
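That setup might look something like the sketch below (illustrative only - the table id and Ajax URL are borrowed from the poster's code later in this thread, and the remaining options are assumptions):

```javascript
// Sketch: client-side processing with Scroller and deferred rendering.
// The table id and Ajax source are assumptions taken from later posts.
$("#customers-data-table").dataTable({
    "sAjaxSource": "/Admin/Customer/CustomerTable", // must return the full data set
    "bDeferRender": true,  // create row DOM nodes only when they are drawn
    "sScrollY": "400px",   // Scroller needs a fixed-height viewport
    "sDom": "frtiS"        // the trailing "S" activates Scroller
});
```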

    The change to the interface (which may be a disadvantage, or possibly an advantage, depending on how you look at it!) is that you would no longer be using infinite scrolling - you would be using virtual scrolling. If you do want to keep infinite scrolling, then what we would need to do is set up a draw callback inside fnClick to detect when the data has fully loaded, and then do the export. If you would prefer to do it that way, let me know and I'll put the code together. However, I do suggest using Scroller for this, as I think it will make your life much easier overall and be a lot more performant! If you'd like any help with Scroller integration, just let me know.
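For reference, the draw-callback approach could be sketched roughly as follows (untested, and it leans on DataTables 1.8-era internals such as aoDrawCallback, so treat it as an outline rather than working code):

```javascript
// Sketch: register a draw callback before forcing a full load, so the
// export runs only once the complete data set has actually been drawn.
"fnClick": function (nButton, oConfig, flash) {
    var that = this;
    var oSettings = customersTable.fnSettings();
    var iInitialDisplayLength = oSettings._iDisplayLength;

    oSettings.aoDrawCallback.push({
        "fn": function () {
            // Runs after the draw completes - export, then restore the
            // original page length. (A real implementation should also
            // remove this callback so it fires only once.)
            that.fnSetText(flash, that.fnGetTableData(oConfig));
            oSettings._iDisplayLength = iInitialDisplayLength;
        },
        "sName": "exportFullData"
    });

    oSettings._iDisplayLength = 1000;
    customersTable.fnDraw();
}
```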

    Regards,
    Allan
  • mpaterson
    edited October 2011
    We are using server-side processing with infinite scrolling and it has worked out very well for us. Setting the display length to 1000 was just a quick and dirty way to load all of the records before exporting.

    For server-side performance reasons I don't expect that we would want to move to client-side processing.

    Thanks,
    Mike
  • allan (Site admin)
    Hi Mike,

    How many rows are you looking at using? If it's more than about 30,000 then server-side processing is certainly the way to go, but the problem with your current approach is that all the "good work" of server-side processing - offloading the heavy lifting to the server - is effectively undone when you do the XLS export.

    At this point the current method is a major performance hit with no benefit, since you lose all the rows that are created and processed on the client on the next draw. So in terms of performance, Scroller would be much smoother overall. If you are using millions of rows then yes, server-side processing is very much the way to go, but TableTools export performance for millions of rows is very poor - at that point you want to create the file on the server and have it downloaded to the client.

    This is why I was suggesting switching to Scroller (which has every benefit of infinite scrolling, plus a few more) and Ajax-sourced data with deferred rendering. You can reuse your existing server-side processing script - you don't need to reimplement it - you get the benefit of the Scroller interface, and TableTools will "just work" :-).

    So the question is, is this approach feasible for you? How many rows does your table hold?

    Regards,
    Allan
  • mpaterson
    Hi Allan,

    I updated the code to use Scroller and deferred rendering. The scrolling functionality seems to be working OK. However, the export is only exporting 63 of the 900+ records.

    [code]
    $(function () {
        $(".localareaTabs").css("width", "100%");

        var customersTable = $("#customers-data-table").dataTable({
            "aaSorting": [[1, 'asc']],
            "bProcessing": true,
            "bServerSide": true,
            "sAjaxSource": "/Admin/Customer/CustomerTable",
            "bScrollInfinite": false,
            "bScrollCollapse": true,
            "sScrollY": "400px",
            "iDisplayLength": 100,
            "aoColumnDefs": [
                { "aTargets": [0], "bVisible": false },    /* CustomerId */
                { "aTargets": [3], "bSortable": false },   /* Dashboard Link */
                { "aTargets": [4], "bSortable": false },   /* Intacct ID */
                { "aTargets": [8], "bSortable": false },   /* Client Tier */
                { "aTargets": [11], "sClass": "numeric" }, /* OsD Revenue */
                { "aTargets": [13], "bSortable": false },  /* Services */
                { "aTargets": [16], "bSortable": false }   /* View Link */
            ],
            "fnServerData": function (sSource, aoData, fnCallback) {
                applyFilters(aoData);

                $.ajax({
                    "dataType": 'json',
                    "type": "GET",
                    "url": sSource,
                    "data": aoData,
                    "success": fnCallback,
                    "global": false
                });
            },
            "bDeferRender": true,
            "sDom": 'lfrTtipS'
        }).fnSetFilteringDelay(250);
    });
    [/code]
  • allan (Site admin)
    Hi Mike,

    With the code above you are still using server-side processing, so the same issue applies as before. If you remove this line:

    [code]
    "bServerSide": true,
    [/code]

    then DataTables will switch to client-side processing, getting the data set from your sAjaxSource script. Many server-side processing scripts will work just fine like this and return the full data set (if your script is based on my own PHP example, for instance, it should work immediately). Otherwise it is simply a case of removing the "LIMIT" clause from the SQL (which does the paging), since you want the script to return the full result set.
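In other words, the initialisation posted above would need only that one line dropped - for example (a sketch, trimmed to the relevant options):

```javascript
// Sketch: the same table switched to client-side processing.
$("#customers-data-table").dataTable({
    // "bServerSide": true,  // removed - DataTables now loads everything once
    "sAjaxSource": "/Admin/Customer/CustomerTable", // must return the full set
    "bDeferRender": true,
    "sScrollY": "400px",
    "sDom": 'lfrTtipS'
});
```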

    Regards,
    Allan
  • mpaterson
    I see. I don't think this route will work for us, as it would mean returning an unbounded result set (which goes very strongly against our rules). I think we'll have to go with what's behind door number 2! Would you mind shedding some light on how to go about doing that?
  • allan (Site admin)
    > that would mean returning an unbounded result set (which goes very strongly against our rules)

    That sounds like a good policy for sensitive data sets to me! But does it not also mean that your method of modifying the length parameter and then getting the data with that is also discounted, since that's basically doing the same thing?

    So, an alternative suggestion, which keeps your full data set on the server side, would be to create the XLS document on the server. To do this, have TableTools make a request to a PHP (or whatever) script that generates the XLS file for you, then have the client download it. There is a plug-in for TableTools to make the request here: http://datatables.net/extras/tabletools/plug-ins#download

    What it does is create an iframe to load the generated XLS file, which allows the browser to download it. So the only thing needed is a script that will create the XLS file. This also opens up some more advanced options for you - TableTools itself just creates a UTF-16LE CSV file, which Excel will open without a problem, and that can readily be done in your server-side script. However, it is also an opportunity to introduce an XLS generation library, which will create a real XLS file.
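The iframe technique itself is straightforward; a minimal standalone sketch (hypothetical helper names, not part of the plug-in) might look like:

```javascript
// Build an export URL from a base URL and a map of request parameters.
function buildExportUrl(baseUrl, params) {
    var parts = [];
    for (var key in params) {
        if (params.hasOwnProperty(key)) {
            parts.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
        }
    }
    return baseUrl + (baseUrl.indexOf('?') === -1 ? '?' : '&') + parts.join('&');
}

// Point a hidden iframe at the URL; because the server responds with a
// Content-Disposition: attachment header, the browser downloads the file
// instead of navigating away from the page.
function downloadViaIframe(url) {
    var iframe = document.createElement('iframe');
    iframe.style.display = 'none';
    iframe.src = url;
    document.body.appendChild(iframe);
}
```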

    Does this approach sound somewhat more palatable?

    Regards,
    Allan
  • mpaterson
    Regarding modifying the length parameter - that was simply a hack so we could ship the functionality. However, IE8 freezes when we return that many records. This approach does seem a little more palatable; I'll give it a try tomorrow.

    Thanks
  • mpaterson
    Hi Allan,

    Where do I put the definition for the download button? I put it into an external file (external to TableTools.js), but it dies on

    [code]var oParams = this.s.dt.oApi._fnAjaxParameters(this.s.dt);[/code]

    saying that _fnAjaxParameters is not a function.
  • mpaterson
    Never mind - I had to update DataTables.js.
  • allan (Site admin)
    Hi!

    Yes - _fnAjaxParameters was a new addition to DataTables (in fact, directly influenced by the plug-in you are using!) that makes getting the set of request parameters easy. In the case you are working on you generally won't want those parameters, since you are just dumping the full data set, but they might be useful if you want to match the sorting/filtering that the client has set up on the table.
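A button definition using that could be sketched like so (the URL and button text are hypothetical; the "download" sExtends value and sUrl option follow the plug-in page linked earlier in the thread):

```javascript
// Sketch: collect the table's current request parameters so the
// server-side export can honour the client's sorting/filtering.
{
    "sExtends": "download",
    "sButtonText": "Export to XLS",
    "sUrl": "/Admin/Customer/ExportXls",  // hypothetical export endpoint
    "fnClick": function (nButton, oConfig) {
        var oSettings = customersTable.fnSettings();
        // Array of { name, value } objects describing the current
        // sort/filter/paging state of the table.
        var aoParams = oSettings.oApi._fnAjaxParameters(oSettings);
        // ...append aoParams to the request made to oConfig.sUrl
    }
}
```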

    Regards,
    Allan
  • mpaterson
    Thanks for your help Allan. It's working now.

    Mike
  • allan (Site admin)
    Hi Mike,

    Fantastic - that's great to hear!

    Regards,
    Allan
This discussion has been closed.