Memory Usage in Browser

novice ser1 · Posts: 2 · Questions: 1 · Answers: 0

Hi,

I have just started out with DataTables (great product btw).

I have followed several tutorials and have used an MVC setup with Entity Framework. All working fairly well despite a quite large dataset (100k records).

There is a slight delay in opening the page, which I can live with; however, what surprises me is that after loading, Chrome is consuming 1.1GB of memory. This seems very excessive. Have I made a silly error, or is this about the expected amount of memory usage?

I've tried enabling server-side processing, but the performance got much worse.

Here is the code that I use to get the data:

        [HttpPost]
        public ActionResult GetList()
        {
            using (DBModels db = new DBModels())
            {
                var itemList = db.ItemsViews.ToList<ItemsView>();
                var jsonResult = Json(new {data = itemList}, JsonRequestBehavior.AllowGet);
                jsonResult.MaxJsonLength = int.MaxValue;
                return jsonResult;
            }
        }

Answers

  • allan · Posts: 62,945 · Questions: 1 · Answers: 10,356 · Site admin

    When you enabled server-side processing, did you implement server-side processing on the server-side?

    1.1GB is certainly a lot. Can you link to a page showing the issue so I can take a look at the trace in Chrome?

    Allan

  • rduncecb · Posts: 125 · Questions: 2 · Answers: 28

    What are you looking at for the memory usage value? Chrome itself taking up 1.1GB of memory is not unheard of (in fact, that's not too bad at all ;) ). The important thing to look at is the JS heap size; if that's 1.1GB I would be worried.

  • novice ser1 · Posts: 2 · Questions: 1 · Answers: 0

    Thanks for the reply. The memory usage figure comes from looking at Task Manager. I have just used Chrome's F12 developer tools, and a heap snapshot taken after the page has run comes back at 263 MB.
    I have tried tweaking bits of the JS to see if that would make a difference, and removing the conditional formatting and hyperlink portion did reduce it by 200 MB.

    For reference, here is the markup code:

        <script>
            $(document).ready(function () {
                $('#itemTable').DataTable(
                {
                    "serverSide": false,
                    "ajax": {
                        "url": "/Item/GetList",
                        "type": "POST",
                        "cache": false,
                        "datatype": "json",
                        "deferRender": true
                    },
                    "columns": [
                        { "data": "type" },
                        { "data": "object_id" },
                        {
                            "data": "object_number",
                            "render": function (data, type, row, meta) {
                                if (type === 'display') {
                                    data = '<a target="_blank" href="http://Framework/Object.aspx?o=' + row.object_id + '&t=3">' + data + '</a>';
                                }
    
                                return data;
                            }
                        },
                        { "data": "object_ver" },
                        { "data": "object_desc", "name": "test" }
                    ],
                    "columnDefs": [
                        {
                            "targets": [0],
                            "visible": false
                        },
                        {
                            "targets": [1],
                            "visible": false
                        }
                    ],
                    rowCallback: function (row, data, index) {
                        if (data.type === "sdocument") {
                            $("td:eq(2)", row).css('color', '#2ebf32');
                        }
                        else if (data.type === "spart") {
                            $("td:eq(2)", row).css('color', '#af0c0c');
                        }
                        else if (data.type === "schange") {
                            $("td:eq(2)", row).css('color', '#146df2');
                        }
                        else if (data.type === "sserial") {
                            $("td:eq(2)", row).css('color', '#828891');
                        }
                    }
    
                });
            });
        </script>

    As for the server side, I did try. I got it to work, but on the same dataset it was taking 28 seconds to change pages, and filtering took even longer. Not sure if that's just down to rubbish C# code; again, here is what I’m using:

                    [HttpPost]
                    public ActionResult GetList()
                    {
    
                        //server side parameters
                        int start = Convert.ToInt32(Request["start"]);
                        int length = Convert.ToInt32(Request["length"]);
                        string searchValue = Request["search[value]"];
                        string sortColumnName = Request["columns[" + Request["order[0][column]"] + "][name]"];
                        string sortDirection = Request["order[0][dir]"];
    
                        List<ItemsView> itemList = new List<ItemsView>();
    
                        using (DBModel db = new DBModel())
                        {
                            itemList = db.ItemsViews.ToList<ItemsView>();
                            int totalRows = itemList.Count;
    
                            if (!string.IsNullOrEmpty(searchValue))  // conduct the filter operation
                            {
                                itemList = itemList.
                                    Where(x => x.type.ToString().Contains(searchValue.ToLower())  ||
                                    x.object_number.ToString().Contains(searchValue.ToLower()) ||
                                    x.object_desc.ToString().Contains(searchValue.ToLower())).ToList<ItemsView>();
                            }
    
                            int totalRowsAfterFiltering = itemList.Count;
    
                            // sorting
                            itemList = itemList.OrderBy(sortColumnName + " " + sortDirection).ToList<ItemsView>();
    
                            // paging
                            itemList = itemList.Skip(start).Take(length).ToList<ItemsView>();
    
                            return Json(new { data = itemList, draw = Request["draw"], recordsTotal =  totalRows, recordsFiltered = totalRowsAfterFiltering }, JsonRequestBehavior.AllowGet);
                        }
    
                    }
    

    Regrettably I’m developing for an intranet, so I cannot share the ‘live’ site.

  • allan · Posts: 62,945 · Questions: 1 · Answers: 10,356 · Site admin

    I think the issue with the server-side processing performance is that you are using LINQ there which, by my understanding, reads all of the data from the data source (database perhaps?) before it can do its ordering / search / etc. The ToList() call pulls the entire table into memory before any filtering happens, so you are basically getting zero benefit from server-side processing. You'd only really get the benefit if you applied the conditions to the data source directly (see the sketch after this post for one way that could look).

    Regarding the memory usage - that sounds like more or less what is expected then.

    Allan
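
For illustration, here is a minimal sketch of what "applying the conditions to the data source directly" could look like with Entity Framework: keep the query as an IQueryable so that the filter, sort and paging are translated to SQL, and only one page of rows is ever materialised. It reuses the DBModel / ItemsView names and the request parameters from the code above, but the column mapping in the switch and the assumption that type, object_number and object_desc are string columns are illustrative guesses, not something confirmed in the thread.

        [HttpPost]
        public ActionResult GetList()
        {
            // DataTables server-side parameters
            int start = Convert.ToInt32(Request["start"]);
            int length = Convert.ToInt32(Request["length"]);
            string searchValue = Request["search[value]"];
            string sortColumnName = Request["columns[" + Request["order[0][column]"] + "][name]"];
            bool ascending = Request["order[0][dir]"] == "asc";

            using (DBModel db = new DBModel())
            {
                // Keep everything as an IQueryable so EF builds SQL instead of
                // pulling the whole view into memory.
                IQueryable<ItemsView> query = db.ItemsViews;

                int totalRows = query.Count();   // COUNT runs in the database

                if (!string.IsNullOrEmpty(searchValue))
                {
                    string s = searchValue;
                    // Contains() on string columns is translated into the SQL query (e.g. a LIKE).
                    query = query.Where(x => x.type.Contains(s) ||
                                             x.object_number.Contains(s) ||
                                             x.object_desc.Contains(s));
                }

                int totalRowsAfterFiltering = query.Count();

                // Sorting: map the column "name" sent by DataTables to a property.
                // (A string-based OrderBy would need a library such as System.Linq.Dynamic.)
                switch (sortColumnName)
                {
                    case "test":   // the name given to the object_desc column in the JS
                        query = ascending ? query.OrderBy(x => x.object_desc)
                                          : query.OrderByDescending(x => x.object_desc);
                        break;
                    default:
                        query = ascending ? query.OrderBy(x => x.object_number)
                                          : query.OrderByDescending(x => x.object_number);
                        break;
                }

                // Paging also happens in SQL; only "length" rows come back.
                var page = query.Skip(start).Take(length).ToList();

                return Json(new
                {
                    draw = Request["draw"],
                    recordsTotal = totalRows,
                    recordsFiltered = totalRowsAfterFiltering,
                    data = page
                }, JsonRequestBehavior.AllowGet);
            }
        }

The key difference from the version posted earlier is that ToList() is only called after Where / OrderBy / Skip / Take, so the database does the heavy lifting and the controller never holds all 100k rows in memory.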

This discussion has been closed.