Implementing a good caching strategy is fundamental to achieving good performance. However, it is not a trivial task.
There are only two hard things in Computer Science: cache invalidation and naming things. – Phil Karlton
Although I agree with Mr. Karlton, I firmly believe that life would be more comfortable if the only thing we had to take care of when implementing server-side caching were invalidating cached data.
There is no silver bullet
Unfortunately, there is no “one size fits all” strategy for server-side caching. Elton Stoneman recommends guideline criteria that can make the decision easier. Following his recommendations, we should consider:
- Cost – how computationally expensive or time-consuming it is to fetch the data from its owner
- Breadth – how reusable the cached data is throughout the solution
- Longevity – how frequently the data needs to be invalidated
- Size – how large the cached object is
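To make these criteria concrete, here is a minimal sketch of how they could be captured in code. The class name, fields, and thresholds are my own illustrative assumptions, not part of Stoneman's guidance:

```python
from dataclasses import dataclass

@dataclass
class CacheCandidate:
    # All field names and thresholds below are illustrative assumptions.
    name: str
    cost_ms: float      # Cost: how expensive it is to fetch from the owner
    breadth: int        # Breadth: how many parts of the solution reuse it
    longevity_s: float  # Longevity: how long before it must be invalidated
    size_kb: float      # Size: how large the cached object is

    def worth_caching(self) -> bool:
        # A crude heuristic: favor expensive, reusable, long-lived, small data.
        return (self.cost_ms > 50
                and self.breadth >= 2
                and self.longevity_s > 60
                and self.size_kb < 512)

product_list = CacheCandidate("product-list", cost_ms=120, breadth=5,
                              longevity_s=300, size_kb=40)
print(product_list.worth_caching())  # -> True
```

The exact cut-off values matter far less than the exercise of scoring each candidate against all four criteria before deciding where (or whether) to cache it.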
There are numerous options for implementing caching. For example:
- In-Process and In-Memory caching
- Out-of-Process caching
- Caching using a Remote Store
Each option has advantages and drawbacks. For instance, I have a client who was experiencing high CPU utilization because of the Garbage Collector. After a short investigation, we discovered that the cause was a naïve In-Process and In-Memory caching strategy.
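One common source of that kind of problem is an unbounded in-process cache that grows until the GC struggles to keep up. As a sketch of the mitigation, here is a minimal in-process cache with a hard item limit and least-recently-used eviction (the class and its parameters are illustrative, not from any specific library):

```python
from collections import OrderedDict

class BoundedLRUCache:
    """In-process, in-memory cache with a hard item limit.

    Bounding the number of entries keeps the live-object count stable,
    avoiding the unbounded heap growth that puts pressure on the GC.
    """
    def __init__(self, max_items: int = 1024):
        self.max_items = max_items
        self._items: OrderedDict = OrderedDict()

    def get(self, key):
        if key not in self._items:
            return None
        self._items.move_to_end(key)  # mark as most recently used
        return self._items[key]

    def set(self, key, value):
        self._items[key] = value
        self._items.move_to_end(key)
        if len(self._items) > self.max_items:
            self._items.popitem(last=False)  # evict least recently used

cache = BoundedLRUCache(max_items=2)
cache.set("a", 1)
cache.set("b", 2)
cache.get("a")         # touch "a" so "b" becomes the eviction candidate
cache.set("c", 3)      # exceeds the limit, evicting "b"
print(cache.get("b"))  # -> None
```

In a garbage-collected runtime, the same idea is usually available off the shelf (size limits, TTLs, eviction policies); the point is that an in-process cache should always have some bound.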
Practical recommendations
Here are some reasonable suggestions for you when designing server-side caching:
- Define what you should and what you should not cache.
- Assume that you will probably need more than one cache implementation for your application.
- Consider the garbage collector impact before adopting an In-Process and In-Memory cache (it is easy to use, but potentially dangerous).
- Consider the network impact before caching bigger objects in remote stores.
- Measure, measure, measure.
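On the last point, measuring can start very small: instrument the cache itself so hit rate and fetch cost are visible. Here is a minimal sketch (the wrapper class and its fields are my own illustrative names):

```python
import time

class MeasuredCache:
    """Wraps a dict-backed cache and records hit/miss counts and fetch time."""
    def __init__(self, fetch):
        self._fetch = fetch   # fallback loader, e.g. a database call
        self._store = {}
        self.hits = 0
        self.misses = 0
        self.fetch_seconds = 0.0

    def get(self, key):
        if key in self._store:
            self.hits += 1
            return self._store[key]
        self.misses += 1
        start = time.perf_counter()
        value = self._fetch(key)  # the expensive path we are caching
        self.fetch_seconds += time.perf_counter() - start
        self._store[key] = value
        return value

    @property
    def hit_rate(self):
        total = self.hits + self.misses
        return self.hits / total if total else 0.0

# Stand-in for a slow fetch; in practice this would hit a database or API.
cache = MeasuredCache(fetch=lambda k: k.upper())
for key in ["a", "b", "a", "a"]:
    cache.get(key)
print(cache.hit_rate)  # -> 0.5
```

A cache with a low hit rate, or one holding data that is cheap to fetch anyway, is pure overhead; numbers like these tell you whether each cache is earning its keep.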