Server Side Processing ASP.NET C# how to optimise with around 1 million records
Hello,
I have been following this tutorial to enable server-side processing for DataTables.
On the client side my code looks like this:
    $('#exampleTable').DataTable({
        "processing": true,
        "serverSide": true,
        "ajax": {
            "url": "@Url.Action("DataTablesProcessing", "INVs")",
            "type": "GET",
            "error": function (xhr, status, error) {
                console.log(xhr.responseText);
            }
        },
        "columns": [
            { "data": "YEAR" },
            { "data": "ORDNO" },
            { "data": "CURR" },
            { "data": "SUMMEDAMT" }
        ]
    });
On the server side, I have one LINQ query like this:
    public List<SpecialClass> GetYTDTRNSGroupBySum()
    {
        var result = db.YTDTRNS
            .GroupBy(y => new
            {
                y.ORDNO,
                y.CURR,
                y.Year
            })
            .Select(z => new SpecialClass
            {
                YEAR = z.Key.Year,
                ORDNO = z.Key.ORDNO,
                CURR = z.Key.CURR,
                SUMMEDAMT = z.Sum(a => a.AMOUNT)
            });

        return result.ToList();
    }
The result of this linq query is around 300k records.
The function which returns JSON data is currently written this way:
    List<SpecialClass> listForTrnsRecords = GetYTDTRNSGroupBySum();

    // construct the JSON representation
    TO_DATATABLESJS dataTableJsData = new TO_DATATABLESJS();
    dataTableJsData.draw = draw;
    dataTableJsData.recordsTotal = listForTrnsRecords.Count;
    dataTableJsData.recordsFiltered = listForTrnsRecords.Count;
    dataTableJsData.data = listForTrnsRecords;

    var jsonResult = Json(dataTableJsData, JsonRequestBehavior.AllowGet);
    jsonResult.MaxJsonLength = int.MaxValue;

    // return the JSON representation
    return jsonResult;
Currently, I have no problem if I deliberately run it against smaller datasets.
However, when I run this in Chrome with the 300k-record result set, the page appears to be working (an empty table shows with the word 'Processing') until the tab finally crashes and displays the "Aw, Snap!" page.
I am not sure whether it is the way I have written my code or my LINQ query that makes it not scalable enough for the page to load correctly, or whether I have to work on the database side of things, such as indexing the columns or some other technique, to make the program run smoothly. In the future, I even hope to display a result set of 1 million records.
Appreciate any guidance
Thanks!
Answers
I'm not familiar with your LINQ code, but you need to build your DB queries to follow the parameters sent when using server-side processing. The documentation is here:
https://datatables.net/manual/data/
With server-side processing, the number of records returned is expected to equal the page length, which defaults to 10. Your server-side script is responsible for limiting the number of records returned (10 by default), starting at the correct offset (the table's current page), and ordering the data correctly.
This SO thread may help you understand limit, offset, etc.
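As a rough sketch of applying those parameters to the query from the question (assumptions: `start` and `length` are the paging values DataTables sends with each request, and `SpecialClass`/`TO_DATATABLESJS` are the types from the original code — this is an illustration, not the tutorial's implementation):

    // Sketch: page the grouped query on the server instead of returning everything.
    // 'draw', 'start' and 'length' come from the DataTables request parameters.
    public JsonResult DataTablesProcessing(int draw, int start, int length)
    {
        var grouped = db.YTDTRNS
            .GroupBy(y => new { y.ORDNO, y.CURR, y.Year })
            .Select(z => new SpecialClass
            {
                YEAR = z.Key.Year,
                ORDNO = z.Key.ORDNO,
                CURR = z.Key.CURR,
                SUMMEDAMT = z.Sum(a => a.AMOUNT)
            });

        int total = grouped.Count();      // translated to a COUNT query in SQL

        var page = grouped
            .OrderBy(g => g.YEAR)         // a stable order is required before Skip/Take
            .Skip(start)                  // offset = first record of the requested page
            .Take(length)                 // limit  = page length (10 by default)
            .ToList();                    // only 'length' rows are materialised

        var dataTableJsData = new TO_DATATABLESJS
        {
            draw = draw,
            recordsTotal = total,
            recordsFiltered = total,      // adjust this once you implement searching
            data = page
        };
        return Json(dataTableJsData, JsonRequestBehavior.AllowGet);
    }

With this shape, each request serialises at most one page of rows rather than the full 300k.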
Kevin
My understanding is that the code in that tutorial (I'm not certain as I didn't write it - you'd need to contact the original author) will pull all of the information out of the database before it can perform the LINQ commands, resulting in the out of memory error. That basically nullifies any benefit you might have got from server-side processing. A direct SQL implementation would perform much better.
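As a sketch of what a direct SQL version could look like (assumptions: the table and column names are taken from the LINQ query in the question, Entity Framework 6's `Database.SqlQuery<T>` is available, and `OFFSET`/`FETCH` requires SQL Server 2012+):

    // Sketch: run the grouping, ordering and paging inside SQL Server,
    // so only one page of rows ever leaves the database.
    string sql = @"
        SELECT [Year] AS YEAR, ORDNO, CURR, SUM(AMOUNT) AS SUMMEDAMT
        FROM   YTDTRNS
        GROUP  BY [Year], ORDNO, CURR
        ORDER  BY [Year], ORDNO
        OFFSET @p0 ROWS FETCH NEXT @p1 ROWS ONLY";

    // @p0 / @p1 are EF's positional parameters: the DataTables
    // 'start' (offset) and 'length' (page size) request values.
    List<SpecialClass> page = db.Database
        .SqlQuery<SpecialClass>(sql, start, length)
        .ToList();

An index covering `[Year], ORDNO, CURR` would also help the `GROUP BY` at this scale.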
Allan