I currently have a web application which, on each page request, gets the currently logged-in user's data out of a database.
This web application could have approximately 30 thousand concurrent users.
My question is: would it be best to cache this data, for example in C# using System.Web.HttpRuntime.Cache.Add, or would storing up to 30 thousand user objects in memory cripple the server?
Would it be better to not cache and just get the required data from the database on each request?
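For illustration, this is roughly what I have in mind; UserData and LoadUserFromDb are placeholders for my real type and data access code:

    using System;
    using System.Web;
    using System.Web.Caching;

    public static class UserCache
    {
        // Placeholder for the user object stored per logged-in user.
        public class UserData
        {
            public int Id { get; set; }
            public string Name { get; set; }
        }

        public static UserData GetUser(int userId)
        {
            string key = "user:" + userId;
            var cached = HttpRuntime.Cache[key] as UserData;
            if (cached != null)
                return cached;

            UserData user = LoadUserFromDb(userId);

            // Keep the object for up to 20 minutes of inactivity so that
            // idle users eventually fall out of the cache on their own.
            HttpRuntime.Cache.Add(
                key,
                user,
                null,                          // no cache dependency
                Cache.NoAbsoluteExpiration,
                TimeSpan.FromMinutes(20),      // sliding expiration
                CacheItemPriority.Normal,
                null);                         // no removal callback

            return user;
        }

        private static UserData LoadUserFromDb(int userId)
        {
            // Database access would go here.
            throw new NotImplementedException();
        }
    }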
At that scale you need an explicit caching and scaling strategy. A cache hacked together as an afterthought is not a strategy, and it will fail.
Caching is highly dependent upon the data. Does the data change frequently? What ratio of reads-to-writes are you going to have? How are you going to scale your database? What happens if the servers in your web farm have different values for the data? Is cache consistency critical?
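If the answer to the web-farm question is that all servers must see the same value, a shared distributed cache is a common choice instead of per-server memory. A minimal sketch using StackExchange.Redis and Newtonsoft.Json; the connection setup, key format, five-minute expiry, and UserData shape are all assumptions, not a recommendation for your data:

    using System;
    using Newtonsoft.Json;
    using StackExchange.Redis;

    // Same shape as the user object described in the question.
    public class UserData
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class RedisUserCache
    {
        private readonly IDatabase _db;

        public RedisUserCache(ConnectionMultiplexer redis)
        {
            _db = redis.GetDatabase();
        }

        public UserData GetUser(int userId, Func<int, UserData> loadFromDb)
        {
            string key = "user:" + userId;

            // Every server in the farm reads the same shared entry,
            // so they cannot disagree about the cached value.
            RedisValue cached = _db.StringGet(key);
            if (cached.HasValue)
                return JsonConvert.DeserializeObject<UserData>(cached);

            UserData user = loadFromDb(userId);

            // A short expiry bounds how stale the data can get after a write.
            _db.StringSet(key, JsonConvert.SerializeObject(user), TimeSpan.FromMinutes(5));
            return user;
        }
    }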
You'll probably end up with several different types of caching: output caching for rendered pages or fragments, in-memory data caching on each web server, and a distributed cache shared across the farm.
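For example, output caching of rendered pages is cheap to add in ASP.NET MVC; a minimal sketch, where the 60-second duration and the repository interface are just illustrations:

    using System.Web.Mvc;

    public interface IUserRepository
    {
        object GetUser(int id);
    }

    public class ProfileController : Controller
    {
        private readonly IUserRepository _repository;

        public ProfileController(IUserRepository repository)
        {
            _repository = repository;
        }

        // The rendered output is cached for 60 seconds per user id, so
        // repeated requests for the same profile skip the database
        // entirely during that window.
        [OutputCache(Duration = 60, VaryByParam = "id")]
        public ActionResult Details(int id)
        {
            return View(_repository.GetUser(id));
        }
    }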
Also, if you're serving static assets (images, CSS, JavaScript, etc.) you'll want to integrate with a CDN for delivery. This is easy to do with AWS S3 or Azure Storage.
You'll also want to make sure you plan how to scale out from the get-go. You'll probably want to deploy to a cloud provider such as AWS with Elastic Beanstalk or Azure's Websites infrastructure.
How about having a cache limited to the available memory, and writing an LRU (Least Recently Used) eviction algorithm on top of it? It may improve performance, especially for the most frequent visitors.
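A minimal sketch of what that could look like, with the capacity sized to available memory; this version is not thread-safe, so a real web app would need locking around it:

    using System;
    using System.Collections.Generic;

    // LRU cache: a dictionary for O(1) lookups plus a linked list that
    // keeps the most recently used keys at the front.
    public class LruCache<TKey, TValue>
    {
        private readonly int _capacity;
        private readonly Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>> _map;
        private readonly LinkedList<KeyValuePair<TKey, TValue>> _order;

        public LruCache(int capacity)
        {
            _capacity = capacity;
            _map = new Dictionary<TKey, LinkedListNode<KeyValuePair<TKey, TValue>>>(capacity);
            _order = new LinkedList<KeyValuePair<TKey, TValue>>();
        }

        public bool TryGet(TKey key, out TValue value)
        {
            LinkedListNode<KeyValuePair<TKey, TValue>> node;
            if (_map.TryGetValue(key, out node))
            {
                // Move the hit to the front so it counts as recently used.
                _order.Remove(node);
                _order.AddFirst(node);
                value = node.Value.Value;
                return true;
            }
            value = default(TValue);
            return false;
        }

        public void Put(TKey key, TValue value)
        {
            LinkedListNode<KeyValuePair<TKey, TValue>> existing;
            if (_map.TryGetValue(key, out existing))
            {
                _order.Remove(existing);
                _map.Remove(key);
            }
            else if (_map.Count >= _capacity)
            {
                // Evict the least recently used entry from the tail.
                var lru = _order.Last;
                _order.RemoveLast();
                _map.Remove(lru.Value.Key);
            }

            var node = new LinkedListNode<KeyValuePair<TKey, TValue>>(
                new KeyValuePair<TKey, TValue>(key, value));
            _order.AddFirst(node);
            _map[key] = node;
        }
    }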