
@stormwild
Last active January 5, 2026 19:18
Mastering .NET 8 MemoryCache

MemoryCache in .NET 8 is a powerful, in-memory key-value store used to improve application performance by storing frequently accessed data in the server's RAM. It avoids repeated, expensive calls to databases or external APIs.

Here is a guide to using MemoryCache in .NET 8, moving from setup to advanced patterns.


1. Setup and Registration

First, you need to register the IMemoryCache service in your Dependency Injection (DI) container. This is typically done in your Program.cs file.

Program.cs:

var builder = WebApplication.CreateBuilder(args);

// Register MemoryCache
builder.Services.AddMemoryCache(); 

var app = builder.Build();

2. Basic Usage (Dependency Injection)

Once registered, you can inject IMemoryCache into any service or controller.

Example Service:

using Microsoft.Extensions.Caching.Memory;

public class WeatherService(IMemoryCache memoryCache) // Primary Constructor (.NET 8)
{
    private const string CacheKey = "weather_data";

    public async Task<string> GetWeatherForecastAsync()
    {
        // 1. Check if data exists in cache
        if (memoryCache.TryGetValue(CacheKey, out string? cachedForecast))
        {
            Console.WriteLine("Fetching from Cache...");
            return cachedForecast!;
        }

        // 2. Data not in cache, simulate fetching from DB/API
        Console.WriteLine("Fetching from Database...");
        await Task.Delay(1000); 
        string forecast = "Sunny, 25°C";

        // 3. Save to cache with options
        var cacheOptions = new MemoryCacheEntryOptions()
            .SetAbsoluteExpiration(TimeSpan.FromMinutes(5));

        memoryCache.Set(CacheKey, forecast, cacheOptions);

        return forecast;
    }
}

3. The "One-Liner" Approach (GetOrCreate)

The GetOrCreateAsync extension method is the cleanest way to use MemoryCache. It handles the "check, fetch, and save" logic in a single block.

public async Task<string> GetWeatherOptimizedAsync()
{
    // This looks for the key "weather_data". 
    // If found, it returns it. 
    // If NOT found, it executes the function, caches the result, and returns it.
    return await memoryCache.GetOrCreateAsync("weather_data", async entry =>
    {
        // Configure expiration policies for this entry inside the block
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5);
        entry.SlidingExpiration = TimeSpan.FromMinutes(2);

        // Simulate expensive data fetch
        await Task.Delay(1000); 
        return "Sunny, 25°C";
    }) ?? "No Data";
}

4. Expiration Policies (Critical)

You must define when items are removed to prevent your server from running out of RAM.

  • Absolute Expiration: the entry expires after a fixed time (e.g., 10 minutes), regardless of usage. Use case: data that must eventually refresh (e.g., stock prices).
  • Sliding Expiration: the entry expires only if it hasn't been accessed within a given window (e.g., 5 minutes). Use case: keeping active user sessions alive.
  • Priority: determines which entries are evicted first when memory is full. Use case: CacheItemPriority.High for critical reference data.

Best Practice: Combine both. Use Sliding to keep active data warm, but set an Absolute limit to force a refresh eventually.

var options = new MemoryCacheEntryOptions()
    .SetSlidingExpiration(TimeSpan.FromMinutes(2)) // Keep alive if used
    .SetAbsoluteExpiration(TimeSpan.FromHours(1)); // But kill after 1 hour max
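
Priority can be attached the same way. A minimal sketch (the key and value here are illustrative; in a real app the cache would come from DI rather than being constructed directly):

```csharp
using Microsoft.Extensions.Caching.Memory;

IMemoryCache memoryCache = new MemoryCache(new MemoryCacheOptions());

var options = new MemoryCacheEntryOptions()
    .SetPriority(CacheItemPriority.High)           // evicted last under memory pressure
    .SetSlidingExpiration(TimeSpan.FromMinutes(5));

// CacheItemPriority.NeverRemove exempts an entry from priority-based
// eviction entirely (expiration policies still apply)
memoryCache.Set("reference_data", "country-codes", options);
```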

5. Advanced Configuration: Size Limits

By default, MemoryCache will use as much memory as it can. In a production environment (especially containers like Docker/Kubernetes), you should limit this to prevent crashing the app.

Program.cs (Configuration):

builder.Services.AddMemoryCache(options =>
{
    options.SizeLimit = 1024; // Units are arbitrary: you define what "1" means (e.g., one entry = 1 unit)
});

Usage (Must specify size): When SizeLimit is set globally, every single cache entry must specify a size.

var options = new MemoryCacheEntryOptions
{
    Size = 1 // This entry counts as "1" unit towards the 1024 limit
};
memoryCache.Set("key", "value", options);
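
To see both rules in action, here is a small sketch (keys and values are illustrative):

```csharp
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions { SizeLimit = 2 });

cache.Set("a", 1, new MemoryCacheEntryOptions { Size = 1 });
cache.Set("b", 2, new MemoryCacheEntryOptions { Size = 1 });

// A third unit-sized entry would exceed the limit, so MemoryCache
// declines to add it (and triggers compaction) instead of growing
cache.Set("c", 3, new MemoryCacheEntryOptions { Size = 1 });
bool added = cache.TryGetValue("c", out _);

// With SizeLimit set, an entry that omits Size throws
try
{
    cache.Set("d", 4);
}
catch (InvalidOperationException ex)
{
    Console.WriteLine(ex.Message); // explains that Size must be specified
}
```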

6. Common Pitfalls & Thread Safety

  1. Race Conditions (Cache Stampede): IMemoryCache itself is thread-safe, but the logic inside GetOrCreate is not locked globally. If 100 users request the same missing key simultaneously, the "factory" function (the database call) might run 100 times.
  • Fix: For extremely high-load scenarios, use a SemaphoreSlim to lock the data-fetching logic.
  2. Stale Data: Always assume cached data might be old. Don't cache rapidly changing real-time data (like a "Seconds Remaining" timer) unless you have a very short expiration.
  3. Memory Leaks: Never blindly accept user input as a cache key (e.g., cache.Set(userInput, value)). A malicious user could send millions of random keys and fill up your RAM.
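
The SemaphoreSlim fix from pitfall 1 can be sketched roughly as follows (a minimal illustration, not a production-hardened implementation; FetchFromDatabaseAsync is a placeholder for your real data call):

```csharp
using Microsoft.Extensions.Caching.Memory;

public class WeatherService(IMemoryCache memoryCache)
{
    private const string CacheKey = "weather_data";
    private static readonly SemaphoreSlim _lock = new(1, 1);

    public async Task<string> GetWeatherSafeAsync()
    {
        // Fast path: no locking if the value is already cached
        if (memoryCache.TryGetValue(CacheKey, out string? cached))
            return cached!;

        await _lock.WaitAsync();
        try
        {
            // Double-check: another caller may have populated the cache
            // while we were waiting on the semaphore
            if (memoryCache.TryGetValue(CacheKey, out cached))
                return cached!;

            string forecast = await FetchFromDatabaseAsync();
            memoryCache.Set(CacheKey, forecast,
                new MemoryCacheEntryOptions().SetAbsoluteExpiration(TimeSpan.FromMinutes(5)));
            return forecast;
        }
        finally
        {
            _lock.Release();
        }
    }

    private static async Task<string> FetchFromDatabaseAsync()
    {
        await Task.Delay(1000); // simulate an expensive query
        return "Sunny, 25°C";
    }
}
```

With this pattern, 100 concurrent cache misses result in one database call: the first caller fetches and caches, and the rest find the value on their double-check.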

Next Step

Would you like to see how to implement the "Cache Stampede" protection using a SemaphoreSlim to ensure your database is protected during high load?

Related video: "ASP.NET Core In-Memory Caching". It is relevant because it demonstrates how to practically implement IMemoryCache within a repository pattern, specifically highlighting the dependency injection and logic flow discussed above.

Yes, there absolutely is a GetOrCreateAsync method in .NET 8.

However, it is an extension method, not a method defined directly on the IMemoryCache interface. This means you must have the correct using directive in scope for it to appear in your IntelliSense.

1. The Missing Piece

To see the method, ensure you have this namespace at the top of your file:

using Microsoft.Extensions.Caching.Memory;

Without this line, the compiler will complain that IMemoryCache does not contain a definition for GetOrCreateAsync.


2. How to use it

Here is the standard pattern for .NET 8.

public async Task<User?> GetUserAsync(int userId)
{
    string cacheKey = $"user_{userId}";

    // usage: await _memoryCache.GetOrCreateAsync(key, factoryMethod)
    var user = await _memoryCache.GetOrCreateAsync(cacheKey, async entry =>
    {
        // 1. Configure cache options inside this block
        entry.SetAbsoluteExpiration(TimeSpan.FromMinutes(10));
        entry.SetSlidingExpiration(TimeSpan.FromMinutes(2));

        // 2. Perform the logic to get data if it wasn't in cache
        return await _database.Users.FindAsync(userId);
    });

    return user;
}

3. Important Nuances in .NET 8

A. Nullable Warning (?)

In .NET 8 (with nullable reference types enabled), GetOrCreateAsync returns TItem? (nullable). If you are certain your database will return a value (or you don't care about the null warning), you may need the "null-forgiving operator" (!) at the end, or handle the null explicitly:

// The '?' indicates result might be null. The '!' suppresses the warning if you know it's safe.
User user = (await _memoryCache.GetOrCreateAsync("key", factory))!;

B. It is NOT Thread-Locked (The "Stampede")

While GetOrCreateAsync is convenient, it does not guarantee that your factory method (the database call) runs only once.

  • Scenario: If 50 users hit your endpoint at the exact same millisecond and the cache is empty, GetOrCreateAsync might fire the database query 50 times before the first one finishes and caches the result.
  • If you have high-concurrency needs, you usually need a lock (SemaphoreSlim) or a library like FusionCache (the ZiggyCreatures.FusionCache package) or LazyCache.

Summary

  • Does it exist? Yes.
  • Where? Microsoft.Extensions.Caching.Memory namespace.
  • Type? Extension Method.

Next Step

Would you like the code snippet for the SemaphoreSlim pattern to fix the "Thread-Locked" issue mentioned above? (This ensures the database is called exactly once, even if 50 users hit it simultaneously).
