How to implement Cache-Aside pattern in ASP.NET Core

Introduction

Caching is an important technique for optimizing application performance. Caching frequently accessed data eliminates repeated backend queries and can significantly reduce overall response times.

It is important to understand how to implement caching in application services. Since different applications have different business requirements and data access patterns, it is best to design the caching layer around these factors.

There are several popular caching patterns that applications use to store and query cached data.

The following are some of the most important caching patterns –

  1. Cache Aside pattern
  2. Read Through / Write Through pattern
  3. Write Behind pattern

What is Cache-Aside pattern

In the cache-aside pattern, data is loaded into the cache in a lazy loading approach.

A service loads the cache with data fetched from the backend only when it doesn’t find an entry in the cache. In other words, data is loaded lazily, and the service decides whether the data needs to be cached.

Credits: Cache Aside Pattern – Azure Architecture Center

When the requested data is present in the cache, the application service fetches it directly from the cache and returns it; no backend query is executed.

On the other hand, if the data is not present in the cache, the service queries the backend and then writes the result to the cache before returning a response. In this case there is both a database query and a cache write.

Conceptually, there are two scenarios –

  • Cache miss – when the object is not present in the cache, the service queries the backend and fetches the data. It then adds the object to the cache, depending on whether the data will be accessed again later, and returns the data as the response.
  • Cache hit – when the object is already present in the cache, the service reads it directly from the cache and returns it as the response. No backend query is required, which significantly reduces the execution time.

When to use Cache-Aside pattern

Applications can use this approach for frequently accessed data when the service makes multiple real-time calls to fetch data from the backend.

When there is any update to the data that is currently cached, the service needs to notify the cache to invalidate the object and remove it from the cache.
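One way to sketch this invalidation, assuming a hypothetical `UpdateEntity` method on the service and a `RemoveEntityFromCache` method on the caching service (neither appears in the interfaces shown later in this article), is to evict the cached entry whenever the backend is updated:

```csharp
// A hypothetical update path that invalidates the cached copy.
// UpdateEntity and RemoveEntityFromCache are illustrative names,
// not part of the interfaces defined later in this article.
public Entity UpdateEntity(Entity entity)
{
    // Persist the change to the backend first
    var updated = _entityRepository.UpdateEntity(entity);

    // Evict the stale entry; the next read is a cache miss
    // and reloads fresh data via the normal cache-aside flow
    _cachingService.RemoveEntityFromCache(updated.Id);

    return updated;
}
```

The next request for this entity then misses the cache and reloads the updated data from the backend.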

Benefits of using Cache-Aside pattern

  1. Easy to implement – cache-aside is the simplest caching pattern to implement
  2. Application driven – the application service is responsible for determining which data to store and for adding objects to the cache
  3. Caching only when required – since the application drives the caching, only the data that is actually needed is stored, which makes good use of memory

Key Considerations while using Cache-Aside pattern

  • Twice as fast or slow – the application response times vary with the availability of data in the cache. When an object is available in the cache, the response is roughly twice as fast; when it is not, the response is roughly twice as slow, since the application must both query the backend and write the result to the cache.
  • Cold and Warm – Initially when the application boots up, the cache is cold – it is empty as the application has not yet loaded any data in the cache. As the application receives more and more requests, the cache is loaded with data as required – cache is now warm.
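If the cold-start penalty matters, a service can optionally pre-warm the cache at startup. A minimal sketch, assuming an ASP.NET Core hosted service, the `IEntityRepository`/`ICachingService` abstractions used later in this article, and an illustrative set of known "hot" entity IDs:

```csharp
// A hypothetical warm-up step: pre-load a known set of hot entities
// at startup so the first requests hit a warm cache.
// The hot-entity ID range (1-10) and the TTL (600 seconds) are
// illustrative assumptions, not part of the original article.
public class CacheWarmupService : IHostedService
{
    private readonly IEntityRepository _entityRepository;
    private readonly ICachingService<Entity> _cachingService;

    public CacheWarmupService(IEntityRepository repository, ICachingService<Entity> cache)
    {
        _entityRepository = repository;
        _cachingService = cache;
    }

    public Task StartAsync(CancellationToken cancellationToken)
    {
        // Assume IDs 1-10 are known hot entities; cache each for 10 minutes
        foreach (var id in Enumerable.Range(1, 10))
        {
            var entity = _entityRepository.GetEntityById(id);
            if (entity != null)
            {
                _cachingService.AddEntityToCache(entity.Id, entity, 600);
            }
        }

        return Task.CompletedTask;
    }

    public Task StopAsync(CancellationToken cancellationToken) => Task.CompletedTask;
}
```

Whether warm-up is worthwhile depends on the workload; for many applications, letting the cache warm naturally is perfectly fine.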

Implementing C# Cache-Aside pattern with an Example

As mentioned above, the application loads an object into the cache only when required. Otherwise it returns the data that is available in the cache.

The pseudo code of a simple caching function looks like below –

get_entity(entity_id)
    entity = cache.get(entity_id)
    if (entity == null)
        entity = db.query("SELECT * FROM Items WHERE id = {0}", entity_id)
        cache.set(entity_id, entity)
    return entity

In .NET, we can implement this as follows.

To demonstrate, let us assume we have an EntityService class that encapsulates two functionalities.

  • GetEntityById method returns a single Entity that matches the entityId passed.
  • GetAllEntities method returns all the entities available in the backend.

We implement the Cache-Aside pattern when returning a single Entity from the backend. This follows the key consideration above: we cache a single entity as required rather than the entire data collection.

There are still certain scenarios where we may want to cache the entire data collection, but generally that is on the client side.

The EntityService class looks like below –

public interface IEntityService
{
    Entity GetEntityById(int entityId);
    IEnumerable<Entity> GetAllEntities();
}

public class EntityService : IEntityService
{
    private const int TIME_TO_LIVE_IN_SECONDS = 600;

    private readonly IEntityRepository _entityRepository;
    private readonly ICachingService<Entity> _cachingService;

    public EntityService(IEntityRepository entityRepository, ICachingService<Entity> cache)
    {
        _entityRepository = entityRepository;
        _cachingService = cache;
    }

    public Entity GetEntityById(int entityId)
    {
        // Cache hit - return the cached entity without querying the backend
        if (_cachingService.TryGetCachedEntity(entityId, out Entity entity))
        {
            return entity;
        }

        // Cache miss - query the backend and add the result to the cache
        var record = _entityRepository.GetEntityById(entityId);

        if (record != null)
        {
            return _cachingService.AddEntityToCache(record.Id, record, TIME_TO_LIVE_IN_SECONDS);
        }

        return null;
    }

    public IEnumerable<Entity> GetAllEntities()
    {
        return _entityRepository.Entities.ToList();
    }
}
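To wire these classes up in an ASP.NET Core application, the in-memory cache and the services can be registered with the built-in DI container. A sketch, assuming an `EntityRepository` class (not shown in this article) implements `IEntityRepository`:

```csharp
// In Program.cs - register the memory cache and the services.
// EntityRepository is an assumed implementation of IEntityRepository.
var builder = WebApplication.CreateBuilder(args);

// Registers IMemoryCache, which CachingService<T> depends on
builder.Services.AddMemoryCache();

// Open generic registration so any ICachingService<T> resolves
builder.Services.AddSingleton(typeof(ICachingService<>), typeof(CachingService<>));

builder.Services.AddScoped<IEntityRepository, EntityRepository>();
builder.Services.AddScoped<IEntityService, EntityService>();

var app = builder.Build();
```

With this in place, `EntityService` receives its repository and caching service through constructor injection.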

The caching functionality is encapsulated in a CachingService class, where we can decide on where to cache.

For example, a CachingService class implementation that uses InMemory caching is as below –

public interface ICachingService<TClass> where TClass : class
{
    TClass AddEntityToCache(object cacheKey, TClass entity, int timeToLive);
    bool TryGetCachedEntity(object cacheKey, out TClass cachedEntity);
}

public class CachingService<TClass> : ICachingService<TClass> where TClass : class
{
    private readonly IMemoryCache _inMemoryCache;

    public CachingService(IMemoryCache cache)
    {
        _inMemoryCache = cache;
    }

    public bool TryGetCachedEntity(object cacheKey, out TClass cachedEntity)
    {
        return _inMemoryCache.TryGetValue(cacheKey, out cachedEntity);
    }

    public TClass AddEntityToCache(object cacheKey, TClass entity, int timeToLive)
    {
        var cacheOptions = new MemoryCacheEntryOptions();
        cacheOptions.AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(timeToLive);

        _inMemoryCache.Set(cacheKey, entity, cacheOptions);
        return entity;
    }
}
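Because the caching logic sits behind `ICachingService<TClass>`, we can swap in a distributed cache without touching `EntityService`. A sketch using `IDistributedCache` with JSON serialization, assuming a provider such as Redis is registered (e.g. via `AddStackExchangeRedisCache`); error handling is omitted for brevity:

```csharp
// A distributed variant of the same interface, backed by IDistributedCache.
// Entities are serialized to JSON strings; this is a sketch under the
// assumption that a distributed cache provider is configured.
public class DistributedCachingService<TClass> : ICachingService<TClass> where TClass : class
{
    private readonly IDistributedCache _distributedCache;

    public DistributedCachingService(IDistributedCache cache)
    {
        _distributedCache = cache;
    }

    public bool TryGetCachedEntity(object cacheKey, out TClass cachedEntity)
    {
        var json = _distributedCache.GetString(cacheKey.ToString());
        cachedEntity = json == null ? null : JsonSerializer.Deserialize<TClass>(json);
        return cachedEntity != null;
    }

    public TClass AddEntityToCache(object cacheKey, TClass entity, int timeToLive)
    {
        var options = new DistributedCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromSeconds(timeToLive)
        };

        _distributedCache.SetString(cacheKey.ToString(), JsonSerializer.Serialize(entity), options);
        return entity;
    }
}
```

Registering `DistributedCachingService<>` in place of `CachingService<>` in the DI container is the only change the rest of the application would need.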


Conclusion

The Cache-Aside pattern is a simple caching technique where data is lazily loaded into the cache when required. The application service is responsible for deciding which objects to store in the cache and drives the caching mechanism.

This pattern is best suited for applications where real-time data is requested by the client such as RESTful API calls from a transactional database.

When there is a data change, the service can invalidate the cache entry and a subsequent cache miss loads the cache with new data.

A key consideration while implementing the Cache-Aside pattern is that the cache is cold at first and warms up as the application adds more data to it.

We also pay a double penalty on a cache miss, which is manageable compared to querying the database on every request.

Do you think a cache-aside pattern helps improve your application performance? Let me know in the comments below 👇

Sriram Mannava

A software engineer from Hyderabad, India with close to 9 years of experience as a full-stack developer in the Angular and .NET space.
