Caching Data on a Heavy Load Web Server

Posted 2019-09-06 20:29

I currently have a web application that, on each page request, gets the currently logged-in user's data out of a database.

This web application could have approximately 30,000 concurrent users.

My question is: would it be best to cache this data, for example in C# using System.Web.HttpRuntime.Cache.Add,

or would storing up to 30,000 user objects in memory cripple the server's memory?

Would it be better not to cache, and just get the required data from the database on each request?
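
For illustration, this is roughly what I have in mind (UserProfile and LoadUserFromDatabase are placeholders for my real user type and data-access code, and the 20-minute sliding expiration is just a guess):

```csharp
using System;
using System.Web;
using System.Web.Caching;

public static class UserCache
{
    public static UserProfile GetUser(string userId)
    {
        string cacheKey = "user:" + userId;

        // Return the cached copy if we already have one.
        var cached = HttpRuntime.Cache[cacheKey] as UserProfile;
        if (cached != null)
            return cached;

        UserProfile user = LoadUserFromDatabase(userId);

        // A sliding expiration evicts entries for users who go idle, so memory
        // holds only recently active users rather than all 30,000 at once.
        HttpRuntime.Cache.Add(
            cacheKey,
            user,
            null,                             // no cache dependency
            Cache.NoAbsoluteExpiration,
            TimeSpan.FromMinutes(20),         // sliding expiration (placeholder value)
            CacheItemPriority.Normal,
            null);                            // no removal callback

        return user;
    }

    private static UserProfile LoadUserFromDatabase(string userId)
    {
        // Placeholder for the real database query.
        return new UserProfile { Id = userId };
    }
}

public class UserProfile
{
    public string Id { get; set; }
}
```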

2 Answers
爱情/是我丢掉的垃圾 · 2019-09-06 20:42

At that scale you need an explicit caching and scaling strategy. Hacking a cache together is not the same as planning that strategy, and a hacked-together cache will fail.

Caching is highly dependent upon the data. Does the data change frequently? What ratio of reads-to-writes are you going to have? How are you going to scale your database? What happens if the servers in your web farm have different values for the data? Is cache consistency critical?

You'll probably end up with several different types of caching:

  1. IIS Static Caching
  2. ASP.Net Caching
  3. An LRU cache in your app.
  4. An in-memory distributed cache such as MemCacheD.
  5. HTTP caching in the browser (a header-setting sketch follows this list).
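
For item 5, here's a minimal sketch of what setting browser cache headers looks like in classic ASP.NET; the handler, the JSON payload, and the one-hour lifetime are placeholder assumptions for whatever semi-static content you actually serve:

```csharp
using System;
using System.Web;

// Hypothetical handler serving semi-static content that browsers may cache.
public class VersionInfoHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Allow browsers and intermediate proxies to cache this response for an hour.
        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetExpires(DateTime.UtcNow.AddHours(1));
        context.Response.Cache.SetMaxAge(TimeSpan.FromHours(1));

        context.Response.ContentType = "application/json";
        context.Response.Write("{\"version\":\"1.0\"}");
    }
}
```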

Also, if you're serving static data (images, CSS, JavaScript, etc.) you'll want to integrate with a CDN for delivery. This is easy to do with AWS S3 or Azure Storage.

You'll also want to make sure you plan how to scale out from the get-go. You'll probably want to deploy to a cloud provider such as AWS with Elastic Beanstalk or Azure's Websites infrastructure.

叼着烟拽天下 · 2019-09-06 20:56

How about keeping a cache limited according to the available memory, and writing an LRU (Least Recently Used) eviction algorithm on top of it? It may improve performance, especially for frequently visiting users.
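
A rough, non-thread-safe sketch of what that could look like; the key/value types and capacity are assumptions, and a production version would also need locking or a concurrent data structure:

```csharp
using System.Collections.Generic;

// Fixed-capacity LRU cache: the least recently used entry is evicted when full.
public class LruCache<TKey, TValue>
{
    private readonly int capacity;
    private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> map;
    private readonly LinkedList<KeyValuePair<TKey, TValue>> order; // most recent at the front

    public LruCache(int capacity)
    {
        this.capacity = capacity;
        map = new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>(capacity);
        order = new LinkedList<KeyValuePair<TKey, TValue>>();
    }

    public bool TryGet(TKey key, out TValue value)
    {
        LinkedListNode<KeyValuePair<TKey, TValue>> node;
        if (map.TryGetValue(key, out node))
        {
            // Promote the entry to most recently used.
            order.Remove(node);
            order.AddFirst(node);
            value = node.Value.Value;
            return true;
        }
        value = default(TValue);
        return false;
    }

    public void Put(TKey key, TValue value)
    {
        LinkedListNode<KeyValuePair<TKey, TValue>> node;
        if (map.TryGetValue(key, out node))
        {
            // Replace an existing entry.
            order.Remove(node);
            map.Remove(key);
        }
        else if (map.Count >= capacity)
        {
            // Evict the least recently used entry (tail of the list).
            var lru = order.Last;
            order.RemoveLast();
            map.Remove(lru.Value.Key);
        }

        var newNode = new LinkedListNode<KeyValuePair<TKey, TValue>>(
            new KeyValuePair<TKey, TValue>(key, value));
        order.AddFirst(newNode);
        map[key] = newNode;
    }
}
```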
