Caching a large volume of JSON results on the client side

Posted 2019-03-16 01:10

I have an ASP.NET MVC application that returns a JSON result containing up to n years' worth of data, which is then rendered on a JavaScript chart.

In order to have a good user experience (in terms of performance), I'm looking for the best solution: is it possible to cache the JSON data on the client side, so that when the user clicks on the chart with different parameters (day view, week view, etc.), the same JSON data is queried without hitting the server?

Could someone please help us make the best decision on caching best practices: should the data be cached on the client side or the server side, or should each graph toggle hit the database directly?

Thanks in advance.

3 Answers
放我归山
#2 · 2019-03-16 01:52
  1. Retrieve the data from the database and save it as a static file on the server. Give it a .css or .png extension (the browser will automatically cache stylesheet and image files).
  2. Save the data file's name with a timestamp in a hidden field (to make sure the latest file is loaded from the server if the file changes).
  3. Load the file from the server using AJAX. The first time it will load from the server, but subsequent requests will be served from the browser cache.
  4. Use JSON.parse() to parse the AJAX response.
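The steps above can be sketched as follows. The hidden-field id (dataFileTimestamp), the file-naming scheme, and the renderChart function are illustrative assumptions, not part of the original answer:

```javascript
// Step 2/3 sketch: build a timestamped URL so a changed file gets a new URL,
// while an unchanged file keeps its URL and is served from the browser cache.
function buildDataUrl(baseName, timestamp) {
  // The .css extension is the trick from step 1: browsers cache stylesheets.
  return '/static/' + baseName + '_' + timestamp + '.css';
}

// Usage in the page (requires jQuery and the hidden field from step 2):
// var ts = $('#dataFileTimestamp').val();
// $.ajax({ url: buildDataUrl('chartData', ts), dataType: 'text' })
//   .done(function (raw) {
//     var data = JSON.parse(raw); // step 4: parse the cached payload
//     renderChart(data);          // hypothetical chart-rendering function
//   });
```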
贪生不怕死
#3 · 2019-03-16 01:54

I have done what you are trying to do and here is my experience. I use Oracle's Site Studio middleware at work. I looked for a framework that would work with it but couldn't find one. So I have tried both options below.

1) The database query returns a certain number of rows; I tried 2,000 as a test. A simple foreach loop converts the returned data into JSON, so it literally builds up a long array of JSON objects as it loops through the rows. In this way you are mimicking a snapshot of a local database. JS can access the array elements very quickly, and it might surprise you how fast you can sort, alter, and delete information.

<script>
var appData = [{'id':'0','first':'Sam','last':'Smith'},{'id':'1','first':'Dan','last':'Smith'}];
</script>

This JSON data is contained within a script tag. jQuery, on doc.ready, then reads the data and adds it to the HTML as needed. When a user changes a JSON data value, AJAX fires off and saves the change to the database. It is not too difficult to add a system like this to your application. I later used Google's Angular.js to bind the data to the UI for a clean MV pattern, and it also makes fast client-side sorting and filtering easy. As already mentioned, Backbone.js and other JS frameworks can synchronize client data to the server.
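A minimal sketch of the doc.ready binding described above; the table markup and the helper name are assumptions:

```javascript
// The embedded "snapshot" from the script tag above.
var appData = [{ id: '0', first: 'Sam', last: 'Smith' },
               { id: '1', first: 'Dan', last: 'Smith' }];

// Pure helper: turn the client-side snapshot into table rows, so sorting
// or filtering appData in JS only requires re-rendering, not a server call.
function renderRows(rows) {
  return rows.map(function (r) {
    return '<tr><td>' + r.first + '</td><td>' + r.last + '</td></tr>';
  }).join('');
}

// On doc.ready, jQuery injects the rows (hypothetical table id):
// $(function () {
//   $('#peopleTable tbody').html(renderRows(appData));
// });
```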

2) The second way I saved data to an HTML page is, once again, to loop through the returned rows with a foreach. Then I saved the data in the HTML using the old-fashioned

<input type="hidden" name="someName" value="someValue" />

I then used jQuery to process the data and add it to the UI. If you really want to get wild with JSON, you can actually embed it in HTML attributes like so:

<input type="hidden" name="row1" value="{'color':'red','style':'fancy','age':'44'}" />

You can then use jQuery or Angular.js to process the data and bind it to your UI.
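One caveat: JSON.parse requires double-quoted keys and strings, so a value written with single quotes like the one above would need the quotes converted (or double quotes escaped as &quot; in the attribute). A sketch, assuming the attribute holds valid JSON:

```javascript
// Sketch: reading JSON stuffed into a hidden input's value attribute.
// Note the payload must use double quotes to be valid JSON, e.g. the HTML
// would carry value="{&quot;color&quot;:&quot;red&quot;, ...}".
function parseHiddenValue(raw) {
  return JSON.parse(raw);
}

// Usage with jQuery (input name "row1" as in the example above):
// var row = parseHiddenValue($('input[name="row1"]').val());
// console.log(row.color);
```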

It is interesting that a lot of application frameworks don't have a built-in client-side caching system. It really is inefficient to sort a select menu on the server side and then rebuild the HTML; better to sort it in JS and rebuild the select menu dynamically. There is a security concern here: you wouldn't want to print private information into JSON or HTML variables, since it is visible under view source. You can also embed data in pages using more rogue techniques. Consider the following:

<span class="data" value="1234"></span>
$(function () { 
    $('.data').each( function() {
        var val = $(this).attr('value');
        console.log(val); //process data
    });
});

You can then use jQuery on doc.ready to process elements with the class named data. You can also stuff JSON data into the value and parse it out later. Keep in mind the jQuery folks are against developers using classes this way; in my experience, if you don't go overboard with it, it works great.

爷、活的狠高调
#4 · 2019-03-16 02:02

First of all, where is the database? If you are on a local network with gigabit LAN, then hitting it won't be a problem. However, that is not true over the internet. People have limited bandwidth, especially on mobile, so you should limit your HTTP calls. Fewer HTTP calls also mean less strain on the server.

Here are some tips:

  • Consider pagination

    When loading "2 years' worth" of data, I imagine a lot, like a 100+ page thesis. Consider paginating the data instead of loading it all at once. This saves you bandwidth as well as cache space (if it's limited).

    How to: Have the server script slice up the data according to what the client wants. It's pretty easy to create pagination in SQL using LIMIT and OFFSET in the query. The logic is starting_item = (page_needed - 1) * items_per_page.

  • JSONify data

    Use JSON for transporting data to and from the network. Aside from being lightweight, it's also structured. It will be easier to parse and store later on.

    How to: PHP has a json_encode function to convert arrays into JSON strings; I assume your framework has a similar feature. Have the string echoed on a page, then use JSON.parse to convert the JSON string into a JS object. JSON methods are native in modern browsers, but if you need to cater to old browsers, Crockford has a library (json2.js) to parse it.

  • Use a well known storage framework

    If persistent storage is needed for a cache across pages, I recently came across PersistJS, which abstracts over whatever storage is available in the browser, such as localStorage. Also, here's a JS implementation of LZW. Keep it handy, since localStorage uses strings to store data and has a 5-10MB limit.

    How to: convert the data into a string using JSON.stringify and store it with PersistJS. Then for retrieval, get the string and parse it back using JSON.parse()

  • Call only when needed

    Have the cache system only call the server if something is modified, added or if something isn't there. If the data is there, why should you call the server for it?

  • Sync the cache

    If you're worried about stale data, have some AJAX sync your cache by using a live data-fetching method as described in this wiki about Comet.

The last two points depend on your cache framework, but BackboneJS allows its models and collections to sync with the server, which provides the same functionality I mentioned.
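The "call only when needed" and "use a storage framework" points can be sketched as a small cache wrapper. The injectable storage/fetcher design below is an illustrative assumption, so the same logic works with window.localStorage, PersistJS, or anything exposing getItem/setItem:

```javascript
// Sketch of a "call only when needed" cache. `storage` is anything with
// getItem/setItem (e.g. window.localStorage); `fetcher` performs the real
// server call and returns the fresh data for a key.
function cachedFetch(storage, fetcher, key) {
  var cached = storage.getItem(key);
  if (cached !== null && cached !== undefined) {
    // Cache hit: no HTTP call at all.
    return JSON.parse(cached);
  }
  // Cache miss: hit the server once, then remember the result as a string,
  // since localStorage can only store strings.
  var data = fetcher(key);
  storage.setItem(key, JSON.stringify(data));
  return data;
}
```

Calling cachedFetch twice with the same key performs only one server call; invalidating a key (e.g. via removeItem when the server reports a change) forces a fresh fetch on the next call.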
