Redis and Other Caching Strategies for Optimized Performance
Source code of this tutorial
Caching is widely used to improve performance in web applications, reduce database load, and manage state in distributed systems. It is not only about raw speed, though: caching also supports concerns like rate limiting and session state. This article covers four strategies:
- In-memory caching
- Distributed Cache
- Response caching
- Output caching
In-memory caching: The simplest cache is based on IMemoryCache, which represents a cache stored in the memory of the web server. Apps running on a server farm (multiple servers) should use sticky sessions with an in-memory cache, so that all requests from a given client go to the same server.
When should we use it?
- For data that doesn’t need to persist after a system restart or failure (non-persistent data), an in-memory cache is ideal, for example rate-limit counters.
- If performance is critical and you need to reduce latency, an in-memory cache provides sub-millisecond access times compared to database queries or API calls.
- If your application is hosted on a single server, IMemoryCache is usually sufficient.
A simple example of how to use it. First, register the memory cache service in Program.cs:
builder.Services.AddMemoryCache();
Then inject IMemoryCache and use it in a controller:
[ApiController]
[Route("[controller]")]
public class WeatherForecastController : ControllerBase
{
    private static readonly string[] Summaries = new[]
    {
        "Freezing", "Bracing", "Chilly", "Cool", "Mild", "Warm", "Balmy", "Hot", "Sweltering", "Scorching"
    };

    private readonly ILogger<WeatherForecastController> _logger;
    private readonly IMemoryCache _memoryCache;

    public WeatherForecastController(ILogger<WeatherForecastController> logger, IMemoryCache memoryCache)
    {
        _logger = logger;
        _memoryCache = memoryCache;
    }

    [HttpGet(Name = "GetWeatherForecast")]
    public IEnumerable<WeatherForecast> Get()
    {
        string cacheKey = "GetWeatherForecast";
        // Try the cache first; on a miss, generate the data and cache it
        if (!_memoryCache.TryGetValue(cacheKey, out List<WeatherForecast> cacheValue))
        {
            cacheValue = Enumerable.Range(1, 5).Select(index => new WeatherForecast
            {
                Date = DateOnly.FromDateTime(DateTime.Now.AddDays(index)),
                TemperatureC = Random.Shared.Next(-20, 55),
                Summary = Summaries[Random.Shared.Next(Summaries.Length)]
            }).ToList();

            // Evict the entry if it is not accessed for 1 minute
            var cacheEntryOptions = new MemoryCacheEntryOptions()
                .SetSlidingExpiration(TimeSpan.FromMinutes(1));

            _memoryCache.Set(cacheKey, cacheValue, cacheEntryOptions);
        }
        return cacheValue;
    }
}
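The TryGetValue/Set pair above can also be expressed with the GetOrCreateAsync extension method, which runs the factory delegate only on a cache miss. A minimal sketch, assuming the same controller fields (the route name "cached" is just for illustration):

```csharp
[HttpGet("cached")]
public async Task<IEnumerable<WeatherForecast>> GetCached()
{
    // The factory runs only when the key is missing; its result is cached
    return await _memoryCache.GetOrCreateAsync("GetWeatherForecast", entry =>
    {
        entry.SlidingExpiration = TimeSpan.FromMinutes(1);
        return Task.FromResult(Enumerable.Range(1, 5).Select(index => new WeatherForecast
        {
            Date = DateOnly.FromDateTime(DateTime.Now.AddDays(index)),
            TemperatureC = Random.Shared.Next(-20, 55),
            Summary = Summaries[Random.Shared.Next(Summaries.Length)]
        }).ToList());
    });
}
```

This avoids the race where two requests both see a miss and both write the entry, and keeps the read-or-create logic in one place.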
When not?
- In-memory caching consumes app server memory, which can grow quickly as cached data accumulates.
- If your application deals with data that must persist after a restart or crash, in-memory caching is risky because its contents are volatile and lost if the server crashes.
- If the data set is too large to fit in memory, it might not be cost-effective or practical to cache it. Caching parts of the data can help, but relying solely on in-memory solutions may lead to high memory costs.
- If you run multiple servers, each server holds its own copy of the cache, and entries are not synchronized between them.
- For applications with low traffic or low read requests, caching might not provide much benefit, and the additional complexity may not be worth it.
Distributed Cache: A distributed cache is a cache shared by multiple app servers, typically maintained as an external service to the app servers that access it. A distributed cache can improve the performance and scalability of an ASP.NET Core app, especially when the app is hosted by a cloud service or a server farm.
When cached data should be distributed:
- You can increase your cache size by adding more nodes, allowing your application to scale to handle more users or requests without overwhelming a single machine.
- With replication and multiple nodes, distributed caches can handle node failures without data loss or significant performance degradation.
- The load is spread across multiple nodes, preventing any single server from becoming a bottleneck.
- For geographically distributed applications, a distributed cache allows users to access cached data from the nearest node, reducing latency.
When not?
- If your application doesn’t require high scalability or operates on a small scale, a distributed cache may introduce unnecessary complexity.
- For applications with low traffic, a distributed cache may not justify the overhead of managing multiple nodes and network communication.
- Maintaining multiple nodes for a distributed cache can increase infrastructure costs, especially when operating in cloud environments.
- Implementing and managing a distributed cache is more complex than a single-node in-memory cache. It requires proper configuration, monitoring, and failover strategies.
- Although distributed caches reduce latency compared to database queries, fetching data from a distributed cache may introduce slight network latency compared to a purely in-memory cache.
Redis as a Distributed Cache
Let’s implement a simple example
Install the Microsoft.Extensions.Caching.StackExchangeRedis package:
dotnet add package Microsoft.Extensions.Caching.StackExchangeRedis
Then configure it in Program.cs:
builder.Services.AddStackExchangeRedisCache(options =>
{
    // Specify the Redis connection string
    options.Configuration = "localhost:6379"; // Or read it from configuration
});
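Hard-coding the connection string is fine for a demo; in practice you would read it from configuration. A sketch, assuming a connection string named "Redis" exists in appsettings.json (both the name and the prefix below are illustrative):

```csharp
// appsettings.json:
// { "ConnectionStrings": { "Redis": "localhost:6379" } }
builder.Services.AddStackExchangeRedisCache(options =>
{
    options.Configuration = builder.Configuration.GetConnectionString("Redis");
    options.InstanceName = "myapp:"; // optional key prefix to avoid collisions with other apps
});
```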
[Route("api/[controller]")]
[ApiController]
public class DistributedCacheController : ControllerBase
{
    private readonly ILogger<DistributedCacheController> _logger;
    private readonly IDistributedCache _distributedCache;

    public DistributedCacheController(ILogger<DistributedCacheController> logger, IDistributedCache distributedCache)
    {
        _logger = logger;
        _distributedCache = distributedCache;
    }

    [HttpGet(Name = "GetDistributedCacheData")]
    public string GetDistributedCacheData()
    {
        string cacheKey = "sampleKey";
        var cachedValue = _distributedCache.GetString(cacheKey);
        if (cachedValue == null)
        {
            // Cache miss: store the value with a 5-minute sliding expiration
            var options = new DistributedCacheEntryOptions()
                .SetSlidingExpiration(TimeSpan.FromMinutes(5));
            _distributedCache.SetString(cacheKey, "This is cached data.", options);
            return "data without cache";
        }
        return "data from cache: " + cachedValue;
    }
}
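IDistributedCache only stores string or byte[] values, so caching a complex object requires serialization. A minimal sketch using System.Text.Json (the key name "forecast:today" and the object values are just for illustration):

```csharp
using System.Text.Json;

// Store: serialize the object to JSON before caching
var forecast = new WeatherForecast { TemperatureC = 21, Summary = "Mild" };
await _distributedCache.SetStringAsync(
    "forecast:today",
    JsonSerializer.Serialize(forecast),
    new DistributedCacheEntryOptions { AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5) });

// Read: deserialize on the way out; null means a cache miss
var json = await _distributedCache.GetStringAsync("forecast:today");
var cached = json is null ? null : JsonSerializer.Deserialize<WeatherForecast>(json);
```

Note that serialization cost is part of every distributed-cache hit, which is one reason a purely in-memory cache is still faster.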
Benefits of Using Redis as a Distributed Cache:
1. Scalability: Redis allows you to scale your cache horizontally across multiple instances.
2. Persistence: You can configure Redis to persist data to disk, unlike the default AddDistributedMemoryCache.
3. Data Structures: Redis supports rich data structures (strings, hashes, sets, etc.), giving you more flexibility in how you cache your data.
4. Failover: Redis supports replication and automatic failover (for example via Redis Sentinel or Redis Cluster), so the cache can survive a node going down.
Response Cache
Typically used when you want to optimize the delivery of static or semi-static content (e.g., pages that don’t change often like a product catalog, or API results). It can cache the full response and prevent the server from regenerating the same response for multiple requests.
- Caches HTTP responses to avoid regenerating them for each request. It stores the entire response (HTML, JSON, etc.) and serves it directly to the client without re-executing the action method on subsequent identical requests. Use case: When you want to cache the full output of an action for multiple users, for instance, caching a product page or API response that doesn’t change frequently.
- Can be cached at client-side (browser), server-side (ASP.NET Core), or on intermediate proxies.
- It uses HTTP headers such as Cache-Control, Expires, and ETag to inform clients or proxies how long the response is valid.
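For example, the attribute settings [ResponseCache(Duration = 60, Location = ResponseCacheLocation.Client)] translate into a response header like:

```
HTTP/1.1 200 OK
Cache-Control: private,max-age=60
```

Here "private" tells intermediate proxies not to cache the response, and "max-age=60" tells the browser it may reuse the response for 60 seconds.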
A simple example of response caching. Note that with Location = ResponseCacheLocation.Client, the attribute only sets the Cache-Control header and the browser does the caching; for server-side caching you also need the app.UseResponseCaching() middleware.
[Route("api/[controller]")]
[ApiController]
public class ResponseCacheController : ControllerBase
{
    private readonly ILogger<ResponseCacheController> _logger;

    public ResponseCacheController(ILogger<ResponseCacheController> logger)
    {
        _logger = logger;
    }

    [HttpGet(Name = "ResponseCache")]
    [ResponseCache(Duration = 60, Location = ResponseCacheLocation.Client)]
    public string ResponseCache()
    {
        // The timestamp makes it easy to see when a cached response is served
        return "Data from response cache: " + DateTime.Now.ToString("T");
    }
}
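Output caching, listed at the top of this article, deserves a quick sketch too. Unlike response caching, the cached response is stored on the server and does not depend on clients honoring HTTP headers. A minimal sketch, assuming ASP.NET Core 7.0 or later (the route name and duration are illustrative):

```csharp
// Program.cs
builder.Services.AddOutputCache();
// ...after building the app:
app.UseOutputCache();

// In a controller
[HttpGet("output-cached")]
[OutputCache(Duration = 30)] // the server reuses this response for 30 seconds
public string GetOutputCached()
{
    return "Generated at " + DateTime.Now.ToString("T");
}
```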
Tools we can use:
RedisInsight: a GUI tool for inspecting your Redis server’s status, keys, and memory usage.
More you can learn
Monitoring Cache Performance