DataTable loads 9,045 rows just fine but not 9,046

barbedCoil Posts: 8, Questions: 4, Answers: 0

I have a MySQL table with 25,000 rows.

I am using Sinatra (with the Sequel Ruby gem) to expose the MySQL data through a RESTful API that returns it in JSON format.
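
For context, a minimal Sinatra + Sequel route of roughly this shape would serve such data; the connection string, the `rows` table name, and the `limit` parameter below are illustrative assumptions, not details from the original post:

```ruby
require 'sinatra'
require 'sequel'
require 'json'

# Connection string and table name are assumptions for illustration.
DB = Sequel.connect('mysql2://user:password@localhost/mydb')

get '/rows' do
  limit = (params['limit'] || 25_000).to_i
  content_type :json
  # Sequel returns an array of hashes; to_json handles all escaping.
  DB[:rows].limit(limit).all.to_json
end
```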

When I tell Sinatra to return up to 9,045 rows it works great and loads extremely fast, but when I tell it to return 9,046 rows I get this error from DataTables:

DataTables warning: table id=grid - Invalid JSON response. For more information about this error, please see http://datatables.net/tn/1

The total size of the data is about 3.1 MB for the 9k rows and about 8.x MB for all rows. Chrome loads all of the data just fine if I navigate straight to the REST URL, but DataTables dies past the row count described above.

The interesting thing is that I followed the instructions in the URL from the error message, and when I request the URL for all rows directly in Chrome it returns them, and I can see them in Chrome's dev tools.

I have validated the returned JSON using the suggested tools and it is perfectly valid.

Is there some local storage limit that I need to increase in order to load more data?

Any thoughts or help would be welcome.

Tab


Answers

  • allan Posts: 61,734, Questions: 1, Answers: 10,110 (Site admin)
    edited June 2014 Answer ✓

    What is being returned from the server when you get the above error, given that it is not JSON (that error message comes from jQuery's parser)? You can follow the instructions in the tech note linked from the error message to determine what is being returned. (Edit: sorry, I see that you did follow the instructions. Regardless, if that error is being shown then the returned data is not valid, so there must be something in the data making it invalid, and we need to know what that is.)

    My guess is that it is an out-of-memory error, and that you need to modify your server-side script to be more memory efficient or use server-side processing (a sketch of that approach follows this post).

    Allan
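
For reference, DataTables' server-side processing protocol has the client send `draw`, `start`, and `length` with each request and expects `draw`, `recordsTotal`, `recordsFiltered`, and `data` back, so the server only ever builds one page of rows. A hedged sketch in Sinatra, reusing the illustrative `DB` connection and `rows` table from the earlier sketch, with searching and ordering omitted:

```ruby
# Hedged sketch of DataTables server-side processing in Sinatra.
# Searching and ordering are omitted; the 'rows' table is illustrative.
get '/rows/ssp' do
  ds    = DB[:rows]
  total = ds.count

  # DataTables sends 'start' (offset) and 'length' (page size).
  page = ds.limit(params['length'].to_i, params['start'].to_i).all

  content_type :json
  {
    draw:            params['draw'].to_i,  # echoed back for request sequencing
    recordsTotal:    total,
    recordsFiltered: total,                # no filtering in this sketch
    data:            page
  }.to_json
end
```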

  • barbedCoil Posts: 8, Questions: 4, Answers: 0
    edited June 2014

    Thanks. I ended up having to dump the entire 25,000 rows into a JSON doc and validate it with the SublimeText editor, which found the issue: unescaped double quotes nested inside double-quoted values (the failure mode is sketched after this post).

    The online JSON linters/validators just could not handle a data set of this size.

    DataTables now loads all rows in a couple of seconds, which is totally acceptable given the data size.

    Thanks again, I really appreciate you taking the time to give me a mental nudge!
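
For anyone hitting the same wall: the classic cause of this is JSON assembled by string interpolation, where an unescaped double quote inside a value breaks the document. A minimal sketch of the failure mode (the values here are made up), which also shows how Ruby's own JSON.parse doubles as a local validator when online tools choke on file size:

```ruby
require 'json'

row = { name: 'Widget "Deluxe"', qty: 3 }

# Hand-assembled JSON breaks on the embedded double quotes:
broken = %({"name": "#{row[:name]}", "qty": #{row[:qty]}})
begin
  JSON.parse(broken)
rescue JSON::ParserError => e
  puts "invalid JSON: #{e.message}"  # the message points at the bad fragment
end

# A real serializer escapes the quotes, so the output parses cleanly:
good = row.to_json  # => {"name":"Widget \"Deluxe\"","qty":3}
JSON.parse(good)    # no exception

# The same check works on a full dump when online validators
# can't handle the file size (the filename is illustrative):
# JSON.parse(File.read('all_rows.json'))
```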

  • allan Posts: 61,734, Questions: 1, Answers: 10,110 (Site admin)
    Answer ✓

    Excellent - good to hear you found the solution :-)

    Allan

This discussion has been closed.