load data from CSV server side
jerome_jay
Posts: 6 · Questions: 0 · Answers: 0
Hello all,
I couldn't find an answer to this question, so I tried building it myself.
Although I'm slowly progressing toward a working result, I feel like I'm re-inventing the wheel with every line of code I write ...
So, in short:
would someone have working code for a DataTable that reads its data from a (rather large) CSV file on the server side? ("Rather large" means 14,000 lines, so my browser can't handle that much in the DOM; the server can quite easily handle parsing that file every so often.)
What I want, using the basic zero-config example:
[code]
$(document).ready(function() {
    $('#example').dataTable( {
        "bProcessing": true,
        "bServerSide": true,
        "sAjaxSource": "read_csv.php"
    } );
} );
[/code]
From there, I would need a working "read_csv.php" that outputs the JSON format DataTables expects.
WARNING: I'm still new to DataTables, so maybe I've got it all wrong; let me know if that's the case ;)
Note:
just to show I wasn't lazy and did the start of my homework: I've already got the basics working with the content of read_csv.php below. But as I said, I feel like I'm redoing stuff that has probably been done countless times, and doing it in a not-so-good way:
[code]
<?php
/*
 * Paging
 */
$iDisplayStart  = 0;
$iDisplayLength = 9;
if ( isset( $_GET['iDisplayStart'] ) && $_GET['iDisplayLength'] != '-1' ) {
    $iDisplayStart  = (int) $_GET['iDisplayStart'];
    $iDisplayLength = (int) $_GET['iDisplayLength'];
}
$iDisplayEnd = $iDisplayStart + $iDisplayLength;

$myFile      = "/prod/backup/OSS/NE_DATA/NE_DATA.csv";
$myFile_data = file($myFile);
$iTotal      = count($myFile_data);  // total row count, instead of hard-coding it

$count      = 0;
$echo_block = "";
foreach ($myFile_data as $line) {
    $line = chop($line);
    $line = addslashes($line);       // escape quotes so they don't break the JSON
    list($VENDOR,$EM,$NETWORK,$NE_NAME,$TYPE,$VERSION,$MAA,$MAC,$NSEL,$ON_IDN,$COMMENTS,$EM_IP,$NE_IP,$GNE,$HD1,$HD2,$Region,$Country) = explode(",",$line);
    $echo_line = "[ \"$VENDOR\", \"$EM\", \"$NETWORK\", \"$NE_NAME\", \"$TYPE\", \"$VERSION\", \"$MAA\", \"$MAC\", \"$NSEL\", \"$ON_IDN\", \"$COMMENTS\", \"$EM_IP\", \"$NE_IP\", \"$GNE\", \"$HD1\", \"$HD2\", \"$Region\", \"$Country\" ],";
    $count++;
    // <= (not <) so a page of length 9 really returns 9 rows
    if ( $count > $iDisplayStart && $count <= $iDisplayEnd ) {
        $echo_block .= $echo_line;
    }
}
$echo_block = substr_replace($echo_block, "", -1);  // drop the trailing comma

# add the header:
$echo_block = "{
\"sEcho\": ".(int) $_GET['sEcho'].",
\"iTotalRecords\": \"$iTotal\",
\"iTotalDisplayRecords\": \"$iTotal\",
\"aaData\": [
".$echo_block;
echo "$echo_block ] }";
?>
[/code]
So I've got the paging done, and the real work will only start with filtering and ordering...
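For what it's worth, a leaner read_csv.php could lean on PHP's built-in fgetcsv() and json_encode() instead of hand-built JSON strings. This is only a sketch: the file path and 18-column layout come from the script above, the helper names (build_response, read_csv_rows) are made up for illustration, and it still re-reads the whole file on every request with no filtering or sorting yet:

```php
<?php
// Sketch of a leaner read_csv.php. fgetcsv() copes with quoted commas
// that a plain explode(",", ...) would mangle, and json_encode() takes
// care of all the escaping that addslashes() only approximates.

// Build the server-side response array for one page of rows.
function build_response(array $rows, $start, $length, $echo)
{
    $total = count($rows);
    $page  = ($length == -1) ? $rows : array_slice($rows, $start, $length);
    return array(
        'sEcho'                => (int) $echo,
        'iTotalRecords'        => $total,
        'iTotalDisplayRecords' => $total, // adjust once filtering exists
        'aaData'               => $page,
    );
}

// Read every record of the CSV into an array of row arrays.
function read_csv_rows($path)
{
    $rows = array();
    if (($fh = @fopen($path, 'r')) !== false) {
        while (($fields = fgetcsv($fh)) !== false) {
            $rows[] = $fields;
        }
        fclose($fh);
    }
    return $rows;
}

$rows  = read_csv_rows('/prod/backup/OSS/NE_DATA/NE_DATA.csv');
$start = isset($_GET['iDisplayStart'])  ? (int) $_GET['iDisplayStart']  : 0;
$len   = isset($_GET['iDisplayLength']) ? (int) $_GET['iDisplayLength'] : 10;
$echo  = isset($_GET['sEcho'])          ? $_GET['sEcho']                : 0;

header('Content-Type: application/json');
echo json_encode(build_response($rows, $start, $len, $echo));
```

Parsing 14k+ rows per request is still wasteful, but at least the JSON can no longer be broken by a stray quote or comma inside a field.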
This discussion has been closed.
Replies
Quick addition:
I've started doing the same from a Perl script instead of PHP ...
Does anyone have a ready-made script that does NOT use a database? (Any language; I'll take anything!)
Having said that, 14k rows shouldn't be a problem on the client side. Did you enable deferred rendering?
Allan
Actually, I don't have a database set up yet, and getting one just for this seemed a bit of an overkill... hence my question.
Now that it seems I'm the only one trying this silly way (meaning I have to do all the coding for sorting/filtering myself), I guess it might not be such an overkill after all.
As for deferred rendering: from what I've read, it only helps with the first display. If the user doesn't apply a filter, the server side will still send the whole data set (in fact, it's ~16k rows, 18 columns) to the browser ... Firefox handles it at the price of slowing down the whole system. IE just crashes :)
Maybe I missed something about deferred rendering? Let me know if I'm wrong.
Thanks for your advice! It's really appreciated.
PS: what I meant by overkill was "creating the DB, creating scripts to upload the data every day, using resources to run it all the time, using much-needed disk space for the DB" ... maybe I'm just being lazy on this side and giving myself more work on the other, though!
Jay
IE should be able to handle 15k records no problem. IE6 might not, in fairness, but that's no great surprise I'm sure.
I'd say you are giving yourself a huge amount more work doing CSV with server-side processing rather than taking advantage of a database. Yes, it absolutely is possible, but let's take a simple example: sorting by columns 1 and 2, both ascending. In SQL you'd simply do `ORDER BY {col1} ASC, {col2} ASC`; in PHP you'd need loops taking into account the columns, the data types in the columns, etc. Add filtering and you'd be burning a lot of clock cycles for every page request DataTables makes. I'd be interested to know what the performance of your script is like if you do fully implement it, but personally I'd just let the SQL engine do it :-)
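To make that concrete, here is a minimal sketch of what the PHP side of `ORDER BY {col1} ASC, {col2} ASC` looks like over CSV rows. It assumes each row is a numerically indexed array of strings (as in the script earlier in the thread) and that the (column, direction) pairs would come from DataTables' iSortCol_N / sSortDir_N request parameters; the multi_sort name and string-only comparison are illustrative assumptions:

```php
<?php
// Multi-column sort over CSV rows, mimicking SQL's multi-key ORDER BY.
// $sorts is a list of array(columnIndex, 'asc'|'desc') pairs, applied in
// order: later keys only break ties left by earlier ones.
function multi_sort(array $rows, array $sorts)
{
    usort($rows, function ($a, $b) use ($sorts) {
        foreach ($sorts as $s) {
            list($col, $dir) = $s;
            // strcmp() is a plain string compare; numeric or date columns
            // would each need their own comparison rule, which is exactly
            // the per-type bookkeeping a SQL engine does for free.
            $cmp = strcmp($a[$col], $b[$col]);
            if ($cmp !== 0) {
                return $dir === 'desc' ? -$cmp : $cmp;
            }
        }
        return 0; // all sort keys equal
    });
    return $rows;
}

// ORDER BY col0 ASC, col1 DESC becomes:
// $sorted = multi_sort($rows, array(array(0, 'asc'), array(1, 'desc')));
```

And unlike a database, this re-sorts the full array on every page request, with no indexes to lean on.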
Allan
Yes, I went a bit further and it was becoming a bit nightmarish ... my plan was to limit myself to single-column ordering to avoid that.
In the meantime, I gave the Oracle server-side plug-in a try, and it works great (albeit with a slight bug: http://www.datatables.net/forums/discussion/14230/slight-error-in-plug-in-server-side-processing-php-with-oracle ), since I could borrow an Oracle DB...
I'll give deferred rendering a try if I have some time, to see how it goes ... but yes, I guess you (and all the people who did not answer) were right: using a DB is much easier :)