Let's say you have an Excel file with 10,000 rows or a database query that returns 30,000 rows. Displaying this data as a table is not a trivial task: if you simply load all the data along with the web page, not only will it take too much time, but the client's browser may crash due to memory or processing limitations. This is especially true for older mobile devices.
The solution to this challenge is to load only the part of the table that is actually needed and leave the rest of the table data on the server. The table data are fetched from the server via AJAX calls, so no page reload is necessary. The end result is that the page loads very fast, and the DOM is much smaller, so the memory requirements are minimal.
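To make the idea concrete, here is a minimal sketch of what a server-side table endpoint does: the client requests only the window of rows it needs (an offset, a page size, and an optional search term), and the server returns that slice plus the row counts. The names used here (`handle_table_request`, `ROWS`) are illustrative, not Tabulizer's actual API.

```python
# Stand-in for the extracted Excel/database data (30,000 rows).
ROWS = [{"id": i, "name": f"Item {i}"} for i in range(30000)]

def handle_table_request(offset: int, limit: int, search: str = "") -> dict:
    """Return one page of (optionally filtered) rows, never the full set."""
    if search:
        needle = search.lower()
        filtered = [r for r in ROWS if needle in r["name"].lower()]
    else:
        filtered = ROWS
    return {
        "total": len(ROWS),                       # rows before filtering
        "filtered": len(filtered),                # rows after the search filter
        "rows": filtered[offset:offset + limit],  # only the requested page
    }

page = handle_table_request(offset=0, limit=25)
print(len(page["rows"]))  # 25 rows cross the wire, not 30,000
```

Because each AJAX call transfers only one page, the browser's DOM holds at most `limit` rows at a time, which is what keeps memory usage flat no matter how large the source file is.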
Normally, building such a solution would take a lot of effort, but with Tabulizer it is as simple as checking a single option. Below are the steps for loading an Excel file with 30,000 rows into a dynamic table via a data source with the server side option enabled.
Step 1: Create the table ruleset.
All rulesets that are going to be used with server side processing must use the pagination feature. Other rules/features can be used as well, but it's good to keep things simple.
Step 2: Create the data source.
The data source will link the table with the Excel file and must have the following options enabled:
- Server Side Processing.
- Cache. It's recommended that you use a very large value for the cache time. If you want to update the source earlier (e.g. the Excel file in this example), you can simply clear the data source cache manually instead of waiting for it to expire.
Step 3: Insert the data source into your article or post.
Using the editor's data source button, add the data source you created. Note that the first time the page loads it will take a significant amount of time, because the Excel file (or whatever other source you have) will be extracted, the ruleset will be processed, and the data source cache will be populated.
Below is a sample table created with this process. It takes its input from an Excel file with 10,000 rows. Notice how quickly it loads and how fast the searches you execute respond.
[Live demo table: "Loading data from server ..."]