![Icon](images/icon.png)
# Cache Tower
An efficient multi-layered caching system for .NET

![Build](https://img.shields.io/github/actions/workflow/status/TurnerSoftware/CacheTower/build.yml?branch=main)
[![Codecov](https://img.shields.io/codecov/c/github/turnersoftware/cachetower/main.svg)](https://codecov.io/gh/TurnerSoftware/CacheTower)
[![NuGet](https://img.shields.io/nuget/v/CacheTower.svg)](https://www.nuget.org/packages/CacheTower/)

## Overview

Computers have multiple layers of caching from L1/L2/L3 CPU caches to RAM or even disk caches, each with a different purpose and performance profile.

_Why don't we do this with our code?_

Cache Tower isn't a single type of cache; it's a multi-layer solution to caching, with each layer sitting on top of another.
A multi-layer cache provides the performance benefits of a fast cache like in-memory with the resilience of a file, database or Redis-backed cache.

This library was inspired by a blog post by Nick Craver about [how Stack Overflow do caching](https://nickcraver.com/blog/2019/08/06/stack-overflow-how-we-do-app-caching/).
Stack Overflow use a custom 2-layer caching solution with in-memory and Redis.

## 📋 Features

- High performance with low allocations ([see comparison to other caching solutions](/docs/Comparison.md)).
- Local system caching with [in-memory](#MemoryCacheLayer) and [file-based caches](#JsonFileCacheLayer).
- Distributed system caching with [MongoDB](#MongoDbCacheLayer) and [Redis](#RedisCacheLayer).
- Supports one or more cache layers, [allowing a cache that has the best of all worlds](#Understanding-a-Multi-Layer-Caching-System).
- [Background refreshes of non-expired but "stale" data](#Background-Refreshing-of-Stale-Data), helping avoid expired data cache misses.
- Local refresh locking, guaranteeing only 1 factory call per key locally.
- [Distributed refresh locking](#Distributed-Locking-via-Redis), guaranteeing only 1 factory call per key across multiple application instances.
- [Distributed evictions](#Distributed-Eviction-via-Redis), helping to keep caches across multiple application instances the same.
- All-async API, ready for high performance workloads.
- [Targets minimum .NET Standard 2.0 for wide compatibility (.NET Framework 4.6.1+, .NET Core 2.0+, .NET 5.0+)](https://docs.microsoft.com/en-us/dotnet/standard/net-standard#net-implementation-support).

## 🤝 Licensing and Support

Cache Tower is licensed under the MIT license. It is free to use in personal and commercial projects.

There are [support plans](https://turnersoftware.com.au/support-plans) available that cover all active [Turner Software OSS projects](https://github.com/TurnerSoftware).
Support plans provide private email support, expert usage advice for our projects, priority bug fixes and more.
These support plans help fund our OSS commitments to provide better software for everyone.

## 📖 Table of Contents

- [Installation](#installation)
- [Understanding a Multi-Layer Caching System](#understanding-multi-layer-caching)
- [The Cache Layers of Cache Tower](#official-cache-layers)
- [Making Your Own Cache Layer](#custom-cache-layers)
- [Cache Serializers](#cache-serializers)
- [Getting Started](#getting-started)
- [Background Refreshing of Stale Data](#background-refreshing)
- [Avoiding Disposed Contexts](#avoiding-disposed-contexts)
- [Named Cache Stacks](#named-cache-stacks)
- [Cache Tower Extensions](#extensions)
- [Automatic Cleanup](#automatic-cleanup)
- [Distributed Locking via Redis](#distributed-locking-via-redis)
- [Distributed Eviction via Redis](#distributed-eviction-via-redis)
- [Performance and Comparisons](#performance)
- [Advanced Usage](#advanced-usage)
- [Flushing the Cache](#flushing-the-cache)

## 💿 Installation

You will need the `CacheTower` package on NuGet - it provides the core infrastructure for Cache Tower as well as an in-memory cache layer.
To add additional cache layers, you will need to install the appropriate packages as listed below.
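
For example, the core package can be installed via the NuGet Package Manager console (the `PM>` convention used throughout this README) or with `dotnet add package CacheTower` from the .NET CLI:

```powershell
PM> Install-Package CacheTower
```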

| Package | NuGet | Downloads |
| ------- | ----- | --------- |
| [CacheTower](https://www.nuget.org/packages/CacheTower/)<br>The core library with in-memory and file caching support. | ![NuGet](https://img.shields.io/nuget/v/CacheTower.svg) | ![NuGet](https://img.shields.io/nuget/dt/CacheTower.svg) |
| [CacheTower.Extensions.Redis](https://www.nuget.org/packages/CacheTower.Extensions.Redis/)<br>Provides distributed locking & eviction via Redis. | ![NuGet](https://img.shields.io/nuget/v/CacheTower.Extensions.Redis.svg) | ![NuGet](https://img.shields.io/nuget/dt/CacheTower.Extensions.Redis.svg) |
| [CacheTower.Providers.Database.MongoDB](https://www.nuget.org/packages/CacheTower.Providers.Database.MongoDB/)<br>Provides a cache layer for MongoDB. | ![NuGet](https://img.shields.io/nuget/v/CacheTower.Providers.Database.MongoDB.svg) | ![NuGet](https://img.shields.io/nuget/dt/CacheTower.Providers.Database.MongoDB.svg) |
| [CacheTower.Providers.Redis](https://www.nuget.org/packages/CacheTower.Providers.Redis/)<br>Provides a cache layer for Redis. | ![NuGet](https://img.shields.io/nuget/v/CacheTower.Providers.Redis.svg) | ![NuGet](https://img.shields.io/nuget/dt/CacheTower.Providers.Redis.svg) |
| [CacheTower.Serializers.NewtonsoftJson](https://www.nuget.org/packages/CacheTower.Serializers.NewtonsoftJson/)<br>Provides a JSON serializer using Newtonsoft.Json. | ![NuGet](https://img.shields.io/nuget/v/CacheTower.Serializers.NewtonsoftJson.svg) | ![NuGet](https://img.shields.io/nuget/dt/CacheTower.Serializers.NewtonsoftJson.svg) |
| [CacheTower.Serializers.SystemTextJson](https://www.nuget.org/packages/CacheTower.Serializers.SystemTextJson/)<br>Provides a JSON serializer using System.Text.Json. | ![NuGet](https://img.shields.io/nuget/v/CacheTower.Serializers.SystemTextJson.svg) | ![NuGet](https://img.shields.io/nuget/dt/CacheTower.Serializers.SystemTextJson.svg) |
| [CacheTower.Serializers.Protobuf](https://www.nuget.org/packages/CacheTower.Serializers.Protobuf/)<br>Provides a Protobuf serializer using protobuf-net. | ![NuGet](https://img.shields.io/nuget/v/CacheTower.Serializers.Protobuf.svg) | ![NuGet](https://img.shields.io/nuget/dt/CacheTower.Serializers.Protobuf.svg) |

## 🎓 Understanding a Multi-Layer Caching System

At its most basic level, caching is designed to prevent reprocessing of data by storing the result _somewhere_.
In turn, preventing the reprocessing of data makes our code faster and more scalable.
Depending on the method of storage or transportation, the performance profile can vary drastically.
Not only that, the limitations of different types of caches can affect what you can do with your application.

----

### In-memory Caching

✔ **Pro**: The fastest cache you can possibly have!

โŒ **Con**: Only lasts the lifetime of the application.

โŒ **Con**: Memory capacity is more limited than other types of storage.

### File-based Caching

โœ” **Pro**: Caching huge amounts of data is not just possible, it is usually cheap!

โœ” **Pro**: Resilient to application restarts!

โŒ **Con**: Even with fast SSDs, it can be _1500x slower_ than in-memory!

### Database Caching

โœ” **Pro**: Database can run on the local machine _OR_ a remote machine!

โœ” **Pro**: Resilient to application restarts!

โœ” **Pro**: Can support multiple systems at the same time!

โŒ **Con**: Performance is only as good as the database provider itself. Don't forget network latency either!

### Redis Caching

โœ” **Pro**: Redis can run on the local machine _OR_ a remote machine!

โœ” **Pro**: Resilient to application restarts!

โœ” **Pro**: Can support multiple systems at the same time!

โœ” **Pro**: High performance (faster than file-based, slower than in-memory).

โŒ **Con**: Linux only. *

_* On Windows, [Memurai is your best Redis-compatible alternative](https://www.memurai.com/)._

----

An ideal caching solution should be fast, flexible, resilient and scale with your usage.
It is through combining these different cache types that this can be achieved.

Cache Tower supports n-layers of caching with flexibility to even make your own.
You "stack" the cache layers from the fastest to slowest for your particular usage.

For example, you might have:
1. In-memory cache
1. File-based cache

With this setup, you have:
- A fast first-layer cache
- A resilient second-layer cache

If your application restarts and your in-memory cache is empty, your second-layer cache will be checked.
If a valid cache entry is found, that will be returned.

Which combination of cache layers you use to build your cache stack is up to you and what is best for your application.
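
As a concrete sketch of the two-layer example above, using the builder methods covered later in this README (the file path and cleanup frequency are illustrative values, and the file layer needs one of the serializer packages, Newtonsoft.Json here):

```csharp
// In-memory first (fastest), file-based second (survives application restarts).
services.AddCacheStack((provider, builder) => builder
    .AddMemoryCacheLayer()
    .AddFileCacheLayer(new FileCacheLayerOptions("./cache", NewtonsoftJsonCacheSerializer.Instance))
    .WithCleanupFrequency(TimeSpan.FromMinutes(5))
);
```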

|ℹ Don't need a multi-layer cache right now? |
|:-|
|Multi-layer caching is only one part of Cache Tower. If you only need one layer of caching, you can still leverage the different types of caches available and take advantage of background refreshing. If later on you need to add more layers, you only need to change the configuration of your cache stack!|

## 🏢 The Cache Layers of Cache Tower

Cache Tower has a number of officially supported cache layers that you can use.

### MemoryCacheLayer

> Bundled with Cache Tower

```csharp
builder.AddMemoryCacheLayer();
```

Allows for fast, local memory caching.
The data is kept as a reference in memory and _not serialized_.
It is strongly recommended to treat the cached instance as immutable.
Modifications to an in-memory cached value won't be propagated to other cache layers.

### FileCacheLayer

> Bundled with Cache Tower

```csharp
builder.AddFileCacheLayer(new FileCacheLayerOptions("~/", NewtonsoftJsonCacheSerializer.Instance));
```

Provides a basic file-based caching solution using [your choice of serializer](#cache-serializers).
It stores each serialized cache item in its own file and uses a single manifest file to track the status of the cache.

### MongoDbCacheLayer

```powershell
PM> Install-Package CacheTower.Providers.Database.MongoDB
```

```csharp
builder.AddMongoDbCacheLayer(/* MongoDB Connection */);
```

Allows caching through a MongoDB server.
Cache entries are serialized to BSON using `MongoDB.Bson.Serialization.BsonSerializer`.

### RedisCacheLayer

```powershell
PM> Install-Package CacheTower.Providers.Redis
```

```csharp
builder.AddRedisCacheLayer(/* Redis Connection */, new RedisCacheLayerOptions(ProtobufCacheSerializer.Instance));
```

Allows caching of data in Redis using [your choice of serializer](#cache-serializers).

## ✍ Cache Serializers

The `FileCacheLayer` and `RedisCacheLayer` support custom serializers for caching data.
Different serializers have different performance profiles as well as different tradeoffs for configuration.

### NewtonsoftJsonCacheSerializer

```powershell
PM> Install-Package CacheTower.Serializers.NewtonsoftJson
```

Uses [Newtonsoft.Json](https://github.com/JamesNK/Newtonsoft.Json/) to perform serialization.

### SystemTextJsonCacheSerializer

```powershell
PM> Install-Package CacheTower.Serializers.SystemTextJson
```

Uses [System.Text.Json](https://www.nuget.org/packages/System.Text.Json) to perform serialization.
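
As a usage sketch, the serializer can be passed wherever a serializer is expected, for example the file cache layer (this assumes `SystemTextJsonCacheSerializer` exposes a static `Instance` like the other serializers shown in this README):

```csharp
builder.AddFileCacheLayer(new FileCacheLayerOptions("~/", SystemTextJsonCacheSerializer.Instance));
```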

### ProtobufCacheSerializer

```powershell
PM> Install-Package CacheTower.Serializers.Protobuf
```

Uses [protobuf-net](https://github.com/protobuf-net/protobuf-net) to perform serialization. Note that [protobuf-net requires decorating the class](https://github.com/protobuf-net/protobuf-net#1-first-decorate-your-classes) you want to cache with `[ProtoContract]` and `[ProtoMember]` attributes.

**Example with Protobuf Attributes**
```csharp
[ProtoContract]
public class UserProfile
{
    [ProtoMember(1)]
    public int UserId { get; set; }
    [ProtoMember(2)]
    public string UserName { get; set; }

    // ...
}
```

Additionally, as the Protobuf format doesn't have a way to represent an empty collection, these will be deserialized as `null`.
While this can be inconvenient, using Protobuf ensures high performance and low allocations for serializing.
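
In practice this means guarding collection properties when reading Protobuf-serialized values back out of the cache. A minimal sketch (the `Tags` property is a hypothetical addition to the `UserProfile` example above):

```csharp
// An empty list round-trips through Protobuf as null, so coalesce before use.
var tags = cachedProfile.Tags ?? new List<string>();
```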

## 🔨 Making Your Own Cache Layer

You can create your own cache layer by implementing [`ICacheLayer`](src/CacheTower/ICacheLayer.cs).
With it, you could implement caching layers that talk to SQL databases or cloud-based storage systems.

When making your own cache layer, keep in mind that your implementation should be thread safe.
The cache stack only prevents multiple threads from calling the value factory at once; it does not prevent multiple threads from accessing the cache layer itself.
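
As a rough illustration of this thread-safety requirement, here is a hypothetical, dictionary-backed layer (the method names are illustrative only; the actual members to implement are defined in [`ICacheLayer`](src/CacheTower/ICacheLayer.cs)):

```csharp
using System.Collections.Concurrent;
using System.Threading.Tasks;

// Hypothetical sketch, not the real ICacheLayer surface: the point is that the
// backing store must tolerate concurrent access, because Cache Tower only
// serializes calls to the value factory, not calls into the cache layer.
public class MyCustomCacheLayer
{
    private readonly ConcurrentDictionary<string, object> store = new();

    public ValueTask SetAsync(string cacheKey, object cacheEntry)
    {
        store[cacheKey] = cacheEntry;
        return default;
    }

    public ValueTask<object?> GetAsync(string cacheKey)
    {
        store.TryGetValue(cacheKey, out var entry);
        return new ValueTask<object?>(entry);
    }
}
```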

## ⭐ Getting Started

> In this example, `UserContext` is a type added to the service collection.
It will be retrieved from the service provider every time a cache refresh is required.

Create and configure your `CacheStack`; this is the backbone of Cache Tower.

```csharp
services.AddCacheStack<UserContext>((provider, builder) => builder
    .AddMemoryCacheLayer()
    .AddRedisCacheLayer(/* Your Redis Connection */, new RedisCacheLayerOptions(ProtobufCacheSerializer.Instance))
    .WithCleanupFrequency(TimeSpan.FromMinutes(5))
);
```

The cache stack will be injected into constructors that accept `ICacheStack`.
Once you have your cache stack, you can call `GetOrSetAsync` - this is the primary way to access the data in the cache.

```csharp
var userId = 17;

await cacheStack.GetOrSetAsync<UserProfile>($"user-{userId}", async (old, context) => {
    return await context.GetUserForIdAsync(userId);
}, new CacheSettings(TimeSpan.FromDays(1), TimeSpan.FromMinutes(60)));
```

This call to `GetOrSetAsync` is configured with a cache expiry of `1 day` and a stale time of `60 minutes`.
A well-chosen stale time is extremely useful for high-performance scenarios where background refreshing is leveraged.

## 🔁 Background Refreshing of Stale Data

A high-performance cache needs to keep throughput high.
Having a cache miss because of expired data stalls the potential throughput.

Rather than only having a cache expiry, Cache Tower supports specifying a stale time for the cache entry.
If there is a cache hit on an item and the item is considered stale, it will perform a background refresh.
By doing this, it avoids blocking the request on a potential cache miss later.

```csharp
await cacheStack.GetOrSetAsync("my-cache-key", async (oldValue) => {
    return await DoWorkThatNeedsToBeCachedAsync();
}, new CacheSettings(timeToLive: TimeSpan.FromMinutes(60), staleAfter: TimeSpan.FromMinutes(30)));
```

In the example above, the cache would expire in 60 minutes time (`timeToLive`).
However, in 30 minutes, the cache will be considered stale (`staleAfter`).

### Example Flow of Background Refreshing

- You request an item from the cache
  - No entry is found (cache miss)
  - Your value factory is called
  - The value is cached and returned
- You request the item again later (after the `staleAfter` time but before `timeToLive`)
  - The non-expired entry is found
  - It is checked if it is stale (it is)
  - A background refresh is started
  - The non-expired (stale) entry is returned
- You request the item again later (after the background refresh has finished)
  - The non-expired entry is found
  - It is checked if it is stale (it isn't)
  - The non-expired non-stale entry is returned

### Picking A Good Stale Time

There is no one-size-fits-all `staleAfter` value - it will depend on what you're caching and why.
That said, a reasonable rule of thumb would be to have a stale time no less than half of the `timeToLive`.
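
For example (illustrative values only), an entry cached for a day could be marked stale at the halfway point:

```csharp
// timeToLive of 1 day with staleAfter of 12 hours: the stale time is half the expiry,
// in line with the rule of thumb above.
var settings = new CacheSettings(timeToLive: TimeSpan.FromDays(1), staleAfter: TimeSpan.FromHours(12));
```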

The shorter you make the `staleAfter` value, the more frequent background refreshing will happen.

⚠ **Warning: Avoid setting a stale time that is too short!**

This is called _"over refreshing"_, whereby background refreshing happens far more frequently than is useful.
Over refreshing is at its worst with stale times shorter than a few minutes for cache entries that are frequently hit.

This has two effects:
1. Frequent refreshes increase load on the factory that provides the data to cache, potentially degrading its performance.
2. Background refreshing, while efficient, has a non-zero cost each time it is invoked, putting additional pressure on the application instances that trigger it.

With this in mind, it is not advised to set your `staleAfter` time to 0.
This effectively means the cache is always stale, triggering a background refresh on every cache hit.

### Avoiding Disposed Contexts

With stale refreshes happening in the background, it is important to not reference potentially disposed objects and contexts.
Cache Tower can help with this by providing a context into the `GetOrSetAsync` method.

```csharp
await cacheStack.GetOrSetAsync("my-cache-key", async (oldValue, context) => {
    return await DoWorkThatNeedsToBeCachedAsync(context);
}, new CacheSettings(timeToLive: TimeSpan.FromMinutes(60), staleAfter: TimeSpan.FromMinutes(30)));
```

The type of `context` is established at the time of configuring the cache stack.

```csharp
services.AddCacheStack<UserContext>((provider, builder) => builder
    .AddMemoryCacheLayer()
    .WithCleanupFrequency(TimeSpan.FromMinutes(5))
);
```

Cache Tower will resolve the context from the same service collection the `AddCacheStack` call was added to.
A scope will be created and context resolved every time there is a cache refresh.

You can use this context to hold any of the other objects or properties you need for safe access in a background thread, avoiding the possibility of accessing disposed objects like database connections.
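
For instance, continuing the `UserContext` example from Getting Started, the context only needs to be resolvable from the same service collection; a scoped registration (an assumption that fits the per-refresh scope described above) is usually enough:

```csharp
// Cache Tower creates a scope and resolves UserContext from it for each background refresh.
services.AddScoped<UserContext>();
```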

|ℹ Need a custom context resolving solution? |
|:-|
|You can specify your own context activator via `builder.CacheContextActivator` by implementing a custom `ICacheContextActivator`. To see a complete example, see [this integration for SimpleInjector](https://github.com/mgoodfellow/CacheTower.ContextActivators.SimpleInjector)|

## 🏷 Named Cache Stacks

You might not always want a single large `CacheStack` shared between all your code - perhaps you want an in-memory cache with a Redis layer for one section and a file cache for another.
Cache Tower supports named `CacheStack` implementations via `ICacheStackAccessor`/`ICacheStackAccessor<TContext>`.

This follows a similar pattern to how `IHttpClientFactory` works, allowing you to fetch the specific `CacheStack` implementation you want within your own class.

```csharp
services.AddCacheStack("MyAwesomeCacheStack", (provider, builder) => builder
.AddMemoryCacheLayer()
.WithCleanupFrequency(TimeSpan.FromMinutes(5))
);

public class MyController
{
private readonly ICacheStack cacheStack;

public MyController(ICacheStackAccessor cacheStackAccessor)
{
cacheStack = cacheStackAccessor.GetCacheStack("MyAwesomeCacheStack");
}
}
```

## 🏗 Cache Tower Extensions

To allow more flexibility, Cache Tower uses an extension system to enhance functionality.
Some of these extensions rely on third party libraries and software to function correctly.

### Automatic Cleanup

> Bundled with Cache Tower

```csharp
builder.WithCleanupFrequency(TimeSpan.FromMinutes(5));
```

The cache layers themselves, for the most part, don't directly manage the co-ordination of when they need to delete expired data.
While the `RedisCacheLayer` does handle cache expiration directly via Redis, none of the other official cache layers do.
Unless you are only using the Redis cache layer, you will want to include this extension in your cache stack.

### Distributed Locking via Redis

```powershell
PM> Install-Package CacheTower.Extensions.Redis
```

```csharp
builder.WithRedisDistributedLocking(/* Your Redis connection */);
```

The `RedisLockExtension` uses Redis as a shared lock between multiple instances of your application.
Using Redis in this way can avoid cache stampedes where multiple different web servers are refreshing values at the same instant.

If you are only running one web server/instance of your application, you won't need this extension.

### Distributed Eviction via Redis

```powershell
PM> Install-Package CacheTower.Extensions.Redis
```

```csharp
builder.WithRedisRemoteEviction(/* Your Redis connection */);
```

The `RedisRemoteEvictionExtension` uses the pub/sub feature of Redis to co-ordinate cache invalidation across multiple instances of your application.
This works in the situation where one web server has refreshed a key and wants to let the other web servers know their data is now old.

## 🥇 Performance and Comparisons

Cache Tower has been built from the ground up for high performance and low memory consumption.
Across a number of benchmarks against other caching solutions, Cache Tower performs similarly to or better than the competition.

What Cache Tower makes up for in speed, it may lack in the variety of features common amongst other caching solutions.
It is important to weigh both the feature set and performance when deciding on a caching solution.

[Performance Comparisons to Cache Tower Alternatives](/docs/Comparison.md)

## 🧪 Advanced Usage

### Flushing the Cache

There are times where you want to clear all cache layers - whether to help with debugging an issue or force fresh data on subsequent calls to the cache.
This type of action is available in Cache Tower, though it is deliberately somewhat obscured to prevent accidental use.
Please only flush the cache if you know what you're doing and what it would mean!

If you have injected `ICacheStack` or `ICacheStack<TContext>` into your current method or class, you can cast it to `IFlushableCacheStack`.
This interface exposes the method `FlushAsync`.

```csharp
await (myCacheStack as IFlushableCacheStack).FlushAsync();
```

For the `MemoryCacheLayer`, the backing store is cleared.
For file cache layers, all cache files are removed.
For MongoDB, all documents are deleted in the cache collection.
For Redis, a `FlushDB` command is sent.

Combined with the `RedisRemoteEvictionExtension`, a call to `FlushAsync` will additionally be sent to all connected `CacheStack` instances.