MemoryCache in .NET 8 is a powerful, in-memory key-value store used to improve application performance by storing frequently accessed data in the server's RAM. It avoids repeated, expensive calls to databases or external APIs.
Here is a guide to using MemoryCache in .NET 8, moving from setup to advanced patterns.
First, you need to register the IMemoryCache service in your Dependency Injection (DI) container. This is typically done in your Program.cs file.
Program.cs:
var builder = WebApplication.CreateBuilder(args);
// Register MemoryCache
builder.Services.AddMemoryCache();
var app = builder.Build();

Once registered, you can inject IMemoryCache into any service or controller.
Example Service:
using Microsoft.Extensions.Caching.Memory;
public class WeatherService(IMemoryCache memoryCache) // Primary Constructor (.NET 8)
{
    private const string CacheKey = "weather_data";

    public async Task<string> GetWeatherForecastAsync()
    {
        // 1. Check if data exists in cache
        if (memoryCache.TryGetValue(CacheKey, out string? cachedForecast))
        {
            Console.WriteLine("Fetching from Cache...");
            return cachedForecast!;
        }

        // 2. Data not in cache, simulate fetching from DB/API
        Console.WriteLine("Fetching from Database...");
        await Task.Delay(1000);
        string forecast = "Sunny, 25°C";

        // 3. Save to cache with options
        var cacheOptions = new MemoryCacheEntryOptions()
            .SetAbsoluteExpiration(TimeSpan.FromMinutes(5));
        memoryCache.Set(CacheKey, forecast, cacheOptions);

        return forecast;
    }
}

The GetOrCreateAsync extension method is the cleanest way to use MemoryCache. It handles the "check, fetch, and save" logic in a single block.
public async Task<string> GetWeatherOptimizedAsync()
{
    // This looks for the key "weather_data".
    // If found, it returns it.
    // If NOT found, it executes the function, caches the result, and returns it.
    return await memoryCache.GetOrCreateAsync("weather_data", async entry =>
    {
        // Configure expiration policies for this entry inside the block
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
        entry.SlidingExpiration = TimeSpan.FromMinutes(2);

        // Simulate expensive data fetch
        await Task.Delay(1000);
        return "Sunny, 25°C";
    }) ?? "No Data";
}

You must define when items are removed to prevent your server from running out of RAM.
| Policy | Description | Use Case |
|---|---|---|
| Absolute Expiration | The item dies after a fixed time (e.g., 10 mins), regardless of usage. | Data that must eventually refresh (e.g., Stock prices). |
| Sliding Expiration | The item dies only if it hasn't been accessed for a specific time (e.g., 5 mins). | Keeping active user sessions alive. |
| Priority | Determines which items to delete first if memory is full. | CacheItemPriority.High for critical reference data. |
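The Priority row maps to the CacheItemPriority enum, set per entry via MemoryCacheEntryOptions. A minimal sketch, assuming memoryCache is the injected IMemoryCache and countryCodes is whatever reference data you want to keep around (both are placeholders):

// Reference data: evict this last when the cache is under memory pressure.
var referenceOptions = new MemoryCacheEntryOptions()
    .SetPriority(CacheItemPriority.High)           // Evicted after Normal/Low entries
    .SetAbsoluteExpiration(TimeSpan.FromHours(6)); // Still force a refresh eventually

memoryCache.Set("country_codes", countryCodes, referenceOptions);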
Best Practice: Combine both. Use Sliding to keep active data warm, but set an Absolute limit to force a refresh eventually.
var options = new MemoryCacheEntryOptions()
    .SetSlidingExpiration(TimeSpan.FromMinutes(2))  // Keep alive if used
    .SetAbsoluteExpiration(TimeSpan.FromHours(1));  // But kill after 1 hour max

By default, MemoryCache will use as much memory as it can. In a production environment (especially containers like Docker/Kubernetes), you should limit this to prevent crashing the app.
Program.cs (Configuration):
builder.Services.AddMemoryCache(options =>
{
    options.SizeLimit = 1024; // Arbitrary unit count (e.g., 1024 items)
});

Usage (Must specify size):
When SizeLimit is set globally, every single cache entry must specify a size; adding an entry without one throws an InvalidOperationException.
var options = new MemoryCacheEntryOptions
{
    Size = 1 // This entry counts as "1" unit towards the 1024 limit
};
memoryCache.Set("key", "value", options);

- Race Conditions (Cache Stampede): IMemoryCache itself is thread-safe, but the logic inside GetOrCreate is not locked globally. If 100 users request the same missing key simultaneously, the "factory" function (the database call) might run 100 times.
  - Fix: For extremely high-load scenarios, use a SemaphoreSlim to lock the data fetching logic (see the sketch after this list).
- Stale Data: Always assume cached data might be old. Don't cache rapidly changing real-time data (like a "Seconds Remaining" timer) unless you have a very short expiration.
- Memory Leaks: Never blindly accept user input as a cache key (e.g., cache.Set(userInput, value)). A malicious user could send millions of random keys and fill up your RAM.
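As referenced in the first pitfall, here is a minimal sketch of stampede protection with SemaphoreSlim. It is not built into IMemoryCache; it assumes a single shared lock guarding one hot key (caching many distinct keys would call for a lock per key), and FetchForecastFromDatabaseAsync is a hypothetical stand-in for the real database/API call.

using Microsoft.Extensions.Caching.Memory;

public class WeatherService(IMemoryCache memoryCache)
{
    private const string CacheKey = "weather_data";
    private static readonly SemaphoreSlim CacheLock = new(1, 1);

    public async Task<string> GetWeatherProtectedAsync()
    {
        // Fast path: no locking if the value is already cached.
        if (memoryCache.TryGetValue(CacheKey, out string? cached))
        {
            return cached!;
        }

        await CacheLock.WaitAsync();
        try
        {
            // Double-check: another request may have filled the cache while we waited.
            if (memoryCache.TryGetValue(CacheKey, out cached))
            {
                return cached!;
            }

            // Only one caller reaches this point at a time, so the expensive fetch runs once.
            string forecast = await FetchForecastFromDatabaseAsync();
            memoryCache.Set(CacheKey, forecast, new MemoryCacheEntryOptions()
                .SetAbsoluteExpiration(TimeSpan.FromMinutes(5)));
            return forecast;
        }
        finally
        {
            CacheLock.Release();
        }
    }

    // Hypothetical placeholder for the real database/API call.
    private static async Task<string> FetchForecastFromDatabaseAsync()
    {
        await Task.Delay(1000);
        return "Sunny, 25°C";
    }
}

The trade-off: a single semaphore serializes every cache miss in this service, which is fine for one hot key but too coarse if you cache many unrelated keys; per-key locks or a dedicated caching library handle that case.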
Yes, there absolutely is a GetOrCreateAsync method in .NET 8.
However, it is an extension method, not a method directly on the IMemoryCache interface. This means you must include the correct using statement for it to appear in your IntelliSense.
To see the method, ensure you have this namespace at the top of your file:
using Microsoft.Extensions.Caching.Memory;

Without this line, the compiler will complain that IMemoryCache does not contain a definition for GetOrCreateAsync.
Here is the standard pattern for .NET 8.
public async Task<User> GetUserAsync(int userId)
{
    string cacheKey = $"user_{userId}";

    // usage: await _memoryCache.GetOrCreateAsync(key, factoryMethod)
    var user = await _memoryCache.GetOrCreateAsync(cacheKey, async entry =>
    {
        // 1. Configure cache options inside this block
        entry.SetAbsoluteExpiration(TimeSpan.FromMinutes(10));
        entry.SetSlidingExpiration(TimeSpan.FromMinutes(2));

        // 2. Perform the logic to get data if it wasn't in cache
        return await _database.Users.FindAsync(userId);
    });

    return user;
}

In .NET 8 (with nullable reference types enabled), GetOrCreateAsync returns TItem? (nullable).
If you are certain your database will return a value (or you don't care about the null warning), you may need the "null-forgiving operator" (!) at the end, or handle the null explicitly:
// The '?' indicates result might be null. The '!' suppresses the warning if you know it's safe.
User user = (await _memoryCache.GetOrCreateAsync("key", factory))!;
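Or, handling the null explicitly instead of suppressing the warning. A sketch reusing cacheKey, userId, and the factory delegate from the example above; throwing KeyNotFoundException is just one way to react:

var user = await _memoryCache.GetOrCreateAsync(cacheKey, factory);
if (user is null)
{
    // Decide what a null result means for your app: throw, return a fallback, etc.
    throw new KeyNotFoundException($"User {userId} was not found.");
}
return user;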
While GetOrCreateAsync is convenient, it does not guarantee that your factory method (the database call) runs only once.

- Scenario: If 50 users hit your endpoint at the exact same millisecond and the cache is empty, GetOrCreateAsync might fire the database query 50 times before the first one finishes and caches the result.
- If you have high-concurrency needs, you usually need a lock (such as the SemaphoreSlim pattern shown earlier) or a library like FusionCache (ZiggyCreatures.FusionCache).
- Does it exist? Yes.
- Where? The Microsoft.Extensions.Caching.Memory namespace.
- Type? Extension method.