Am I Optimizing My Initial Data Load?
Let me start by saying that DataTables is working perfectly; this is a question of inexperience, and I'm hoping somebody can spare some time to educate me.
I work for a public school system, and I'm using DataTables to display student enrollment for each school, each school level (high school students, middle school students), and the overall district. I retrieve the data as JSON from SQL Server via an ASP.NET web service. The initial load for an individual school (approximately 1,500 students/records) takes about 5 seconds, which I consider acceptable. For the entire district (approximately 100,000 students), it takes about 60 seconds. Considering the number of records, I still don't think that's too bad, and I have a nice "Loading" message explaining how long it might take to load the data.
Still, I want to verify that the way I am loading the records into the DataTable is indeed the most efficient approach.
Here's my table setup in html:
<div id="divStudents" style="width: 95%; margin-left: auto; margin-right: auto">
<table id="tblStudents" class="table table-striped table-bordered table-condensed display" style="width: 100%">
<thead>
<tr>
<th>Select</th>
<th>Student Name</th>
<th>Enrollment Date</th>
</tr>
</thead>
<tbody></tbody>
</table>
</div>
And here's my JavaScript that retrieves the data and loads into dataTables:
function loadActiveStudents() {
    //Now load the bouncing ball...Please Wait :)
    initialLoadGif('#divAlert');
    $.ajax({
        type: "POST",
        contentType: "application/json; charset=UTF-8",
        url: "WebServices/myStudentProfile.asmx/getActiveStudents_Unformatted",
        data: JSON.stringify({ pageID: pageID }),
        dataType: "json",
        cache: false,
        success: function (response) {
            var data = response.d;
            if ($.fn.DataTable.isDataTable('#tblStudents')) {
                $('#tblStudents').DataTable().clear().destroy();
            }
            $('.lblPop').html('Selected Enrollment: ' + formatNumberWithCommas(data.length) + ' Students');
            if (data.length > 0) {
                var secBlur = "secBlur";
                var dt = $('#tblStudents').DataTable({
                    paging: false,
                    info: false,
                    processing: true,
                    language: {
                        "loadingRecords": "Please wait - loading..."
                    },
                    scrollY: 475,
                    data: data,
                    columns: [
                        { data: "select" },
                        { className: secBlur, data: "studentName" },
                        { data: "enrollmentDate" }
                    ],
                    dom: '<"col-xs-12 col-sm-6 col-md-4"f><"col-xs-12 col-md-4">Brt',
                    buttons: [
                        { extend: 'copyHtml5', exportOptions: { columns: [1, 2] } },
                        { extend: 'excelHtml5', exportOptions: { columns: [1, 2] } },
                        { extend: 'pdfHtml5', exportOptions: { columns: [1, 2] } },
                        { extend: 'print', exportOptions: { columns: [1, 2] } }
                    ]
                });
            }
            $('#divLoader').remove();
        }
    });
}
Anything I can tweak to make the load more efficient?
Answers
Remove paging: false from your initialisation and add deferRender: true.
The reason is that having paging disabled means that all 100,000 rows need to be rendered straight away - even those way down at the bottom of the page. And it's not just the rows: it's the cells in the rows, and anything inside the cells.
Enabling paging and deferred rendering is the single action that will get you the biggest performance win.
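As a rough sketch (adapted from the code in the question, with the other options assumed unchanged, and dropping into the same success handler):

var dt = $('#tblStudents').DataTable({
    // paging: false removed - paging defaults to true
    deferRender: true,   // only create DOM nodes for rows when they are actually displayed
    info: false,
    processing: true,
    language: {
        "loadingRecords": "Please wait - loading..."
    },
    scrollY: 475,
    data: data,
    columns: [
        { data: "select" },
        { className: "secBlur", data: "studentName" },
        { data: "enrollmentDate" }
    ]
});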
Beyond that, rather than using objects for the row data, use arrays. That will save you a significant amount of bandwidth on larger tables - at the moment, the JSON repeats the property names "select", "studentName", and "enrollmentDate" for every single row. Use arrays instead and you will save all that bandwidth and transfer time! The downside is that the data isn't quite as easy to understand by inspection, since you need to know what the array indexes refer to.
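Purely as an illustration (the row values below are made-up placeholders, and the server would need to be changed to return this shape), array-based row data and the matching column definitions might look like this:

// Each row is an array; index 0 = select column content, 1 = name, 2 = enrollment date
var data = [
    ["", "Jane Doe", "2023-08-14"],
    ["", "John Smith", "2023-08-14"]
];

$('#tblStudents').DataTable({
    data: data,
    // With array rows, columns map to array indexes in order,
    // so no "data" property is needed - null means "no extra options"
    columns: [
        null,
        { className: "secBlur" },
        null
    ]
});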
Finally - make sure you have gzip enabled on your web server to keep transfer times as low as possible.
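For example, assuming the web service is hosted on IIS (an assumption based on the ASP.NET setup in the question), compression can be switched on in web.config; note that compressing JSON responses also requires the Dynamic Content Compression feature to be installed on the server:

<system.webServer>
  <!-- Enable gzip for static files and for dynamic responses such as the JSON from the web service -->
  <urlCompression doStaticCompression="true" doDynamicCompression="true" />
</system.webServer>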
Allan
Thank You Sir!!
How much difference did it make, out of curiosity?
Allan
Today just happens to be our first day of school. For 98,135 records, the load time dropped from 58 seconds down to 6!!