Introducing SearchBuilder

allan Posts: 52,856 Questions: 1 Answers: 8,117 Site admin

I'm delighted to announce the first release of our latest extension for DataTables: SearchBuilder.

SearchBuilder allows end users to quickly build up well-structured search conditions, so they can find data in a table based on the query they set up. You can also set up an initial query for them, define a query via the API, and use state saving.
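As a rough sketch of what setting up an initial query might look like (this assumes the `searchBuilder.preDefined` initialisation option and the `starts` string condition described in the manual, and a hypothetical table with id `example`):

```javascript
// Requires jQuery, DataTables and the SearchBuilder extension to be
// loaded on the page. Condition names may differ between versions -
// check the manual for the full list.
$(document).ready(function () {
    $('#example').DataTable({
        dom: 'Qfrtip', // 'Q' places the SearchBuilder pane above the table
        searchBuilder: {
            // preDefined sets up an initial query for the end user,
            // which they can then modify or extend.
            preDefined: {
                criteria: [
                    {
                        data: 'Name',        // column to search
                        condition: 'starts', // "starts with" condition
                        value: ['b']
                    }
                ],
                logic: 'AND'
            }
        }
    });
});
```

The same structure is what state saving restores, so a query built up by hand and one defined via `preDefined` are interchangeable.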

One thing SearchBuilder doesn't yet do is operate with server-side processing. That is something we will be looking to add into a future version.

SearchBuilder is available on our CDN via the download builder or direct file access, and also on NPM / Yarn, NuGet, Composer and Bower (see the download builder for details). The manual is available here, including details on how to create your own plug-ins for custom search types, and a full set of examples is also available.

As always, feedback is most welcome! If you have any suggestions, please post them here. If you run into any issues, please post in a new thread with a test case, so we can keep track of individual issues.

Enjoy!
Allan

Replies

  • setwebmaster Posts: 69 Questions: 5 Answers: 0

    Hey there, just dropped in to say this really seems interesting!

    In my initial testing with the provided samples, it works really well!
    I do have one comment I'd like to discuss, to see whether there are performance/efficiency optimizations that could be made.

    Using the bootstrap 4 example:
    1. Create a first search criterion, for example Name starting with the letter "b"
    2. Then add a second criterion on the "Position" column and set the "Condition" to "Equals"

    This results in the available values for the Position field being loaded dynamically, but it loads all the values present in the whole dataset. Wouldn't it be wiser to build a kind of "data subset", with the criteria applied sequentially, and use that to populate the available values for subsequent filters?

    This "problem"/performance hit is easy to notice when running the same scenario against the 50K-row example, where the delay is really noticeable (over 2 seconds for some filters in my case, on a fairly decent CPU, a Ryzen 5 3600). To my mind, the more filters we add, the faster it should get, since the resulting data set only gets smaller and smaller.

    (Again, this is similar to the comment I made when SearchPanes initially launched regarding the "cascading" feature :smile: )

    NB: I'm totally not complaining here hehe, just wondering whether what I proposed above would make it even more efficient/faster (trading some storage for speed; obviously nothing in life has zero cost :wink: )

  • sandy Posts: 406 Questions: 0 Answers: 114

    Hi @setwebmaster ,

    Thanks for the feedback! We did discuss adding some functionality along these lines, but decided not to implement it in this version and wait to see if there was any feedback on the subject. A couple of the problems that we could see are as follows.

    Subset filtering can only be applied to groups that use "and" logic; "or" logic simply can't make use of it. Further to this, if you have a top-level group with "or" logic, any of its sub-groups with "and" logic would each need their own unique subset, and that is potentially very process-heavy!

    Continuing along those lines, it gets increasingly complex the more levels you add, as these could all contain different logic, as well as any number of criteria. You would have to filter the entire dataset for every group to work out what is available for its sub-groups, and so on, which I can see getting messy!

    If there is more demand for some functionality along these lines then we can look into it further.

    Thanks,
    Sandy

  • setwebmaster Posts: 69 Questions: 5 Answers: 0

    @sandy thanks for the quick reply. You are totally right: it gets far more complex than the initial implementation suggests, which might make it even worse performance-wise in the end. Maybe the solution is simply to be deliberate when defining the columns that SearchBuilder can work with (e.g. avoiding columns that contain tons of different values, using the searchBuilder.columns option).
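    A minimal sketch of that idea, assuming the `searchBuilder.columns` option accepts an array of column indexes and a hypothetical table with id `example` where the first three columns are the low-cardinality ones:

    ```javascript
    // Restrict SearchBuilder to columns 0-2 only, so a high-cardinality
    // column (e.g. a free-text notes column) is never offered as a
    // filter and its value list is never built.
    $('#example').DataTable({
        dom: 'Qfrtip',
        searchBuilder: {
            columns: [0, 1, 2] // column indexes SearchBuilder may use
        }
    });
    ```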

  • sandy Posts: 406 Questions: 0 Answers: 114

    Hi @setwebmaster ,

    Yes, that would be the correct path to take if you have a column with a lot of unique values that you don't want to apply filtering to.

    It is also possible to modify the conditions. You could then run an equals condition using an input element rather than a select element. Equally, this could be done for any of the conditions, and they can be modified in any way. I'd recommend that anyone wishing to implement this read through this page.
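    As a hedged sketch of what that might look like, following the condition plug-in structure described in the SearchBuilder manual (the exact function signatures may vary between versions, and the `#example` table is hypothetical):

    ```javascript
    // Replace the string "Equals" condition with one that uses a plain
    // text <input> instead of a populated <select>, avoiding the cost of
    // building a value list for a high-cardinality column.
    $('#example').DataTable({
        dom: 'Qfrtip',
        searchBuilder: {
            conditions: {
                string: {
                    '=': {
                        conditionName: 'Equals',
                        init: function (that, fn) {
                            // Build a free-typed input; notify SearchBuilder
                            // on change so the search re-runs.
                            return $('<input/>')
                                .addClass('dtsb-value')
                                .on('input', function () { fn(that, this); });
                        },
                        inputValue: function (el) {
                            return [$(el[0]).val()];
                        },
                        isInputValid: function (el) {
                            return $(el[0]).val().length > 0;
                        },
                        search: function (value, comparison) {
                            // Exact match against the typed value
                            return value === comparison[0];
                        }
                    }
                }
            }
        }
    });
    ```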

    Thanks,
    Sandy

  • d00mboy Posts: 4 Questions: 1 Answers: 0

    Hello

    I'm trying to use SearchBuilder, the Select All and Select None buttons, the export buttons, and Editor together. If I specify search criteria so that a subset of table rows is displayed, then click Select All and open the Edit button in Editor for multi-edit, I end up editing the entire table (without the applied search criteria). If I instead use the export button with the Copy or XLSX options, only the rows from the search are included (the expected behaviour). I'd like Editor to also work only with the results of the search.

    Ted

  • sandy Posts: 406 Questions: 0 Answers: 114

    Hi @d00mboy ,

    Could you please create a new forum post so that we can provide support for the above? This makes it easier for us to track issues and stops the announcement post from becoming saturated with support queries.

    In your forum post you should include a link to a running test case. Information on how to create a test case (if you aren't able to link to the page you are working on) is available here.

    Thanks,
    Sandy
