Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/pandatecham/be-lib-distributed-cache
A robust caching solution for .NET applications using Redis, with support for flexible cache configuration, tagging, and health checks.
cache distributed-lock dotnet library nuget package pandatech redis
Last synced: 4 months ago
- Host: GitHub
- URL: https://github.com/pandatecham/be-lib-distributed-cache
- Owner: PandaTechAM
- License: mit
- Created: 2024-06-05T09:51:31.000Z (8 months ago)
- Default Branch: development
- Last Pushed: 2024-09-05T20:25:33.000Z (5 months ago)
- Last Synced: 2024-10-14T03:02:47.538Z (4 months ago)
- Topics: cache, distributed-lock, dotnet, library, nuget, package, pandatech, redis
- Language: C#
- Homepage:
- Size: 191 KB
- Stars: 1
- Watchers: 0
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: Readme.md
- License: LICENSE.txt
Awesome Lists containing this project
README
# Pandatech.DistributedCache
Pandatech.DistributedCache is a .NET library that provides an efficient, high-performance abstraction layer over
`StackExchange.Redis`. It builds on top of `StackExchange.Redis.Extensions.AspNetCore` and
`StackExchange.Redis.Extensions.MsgPack` to offer a robust, easy-to-use caching solution with advanced features such as
typed cache services, distributed locking, and business-logic rate limiting.

## Features
- **Typed Cache Service:** Supports strongly-typed caching with MessagePack serialization.
- **Distributed Locking:** Ensures data consistency with distributed locks.
- **Distributed Rate Limiting:** Prevents cache abuse with rate limiting based on business logic.
- **Key Isolation:** Modular monolith support by prefixing keys with assembly names.
- **Stampede Protection:** Protects against cache stampede in the `GetOrCreateAsync` method.
- **No Serializer Override:** Enforces MessagePack serialization for performance and readability.

## Installation
Add `Pandatech.DistributedCache` to your project using NuGet:
```bash
dotnet add package Pandatech.DistributedCache
```

## Usage
### 1. Configuration
In your `Program.cs`, configure the cache service:
```csharp
var builder = WebApplication.CreateBuilder(args);

builder.AddDistributedCache(options =>
{
   options.RedisConnectionString = "your_redis_connection_string"; // No default value and required
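   // Illustrative alternative (assumes a "Redis" entry under ConnectionStrings in appsettings.json):
   // options.RedisConnectionString = builder.Configuration.GetConnectionString("Redis")!;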
   options.ConnectRetry = 15; // Default is 10
   options.ConnectTimeout = TimeSpan.FromSeconds(10); // Default is 10 seconds
   options.SyncTimeout = TimeSpan.FromSeconds(5); // Default is 5 seconds
   options.DistributedLockDuration = TimeSpan.FromSeconds(5); // Default is 5 seconds
   options.DefaultExpiration = TimeSpan.FromMinutes(5); // Default is 15 minutes
});

var app = builder.Build();
```

#### Advanced Configuration
**Key Prefix for Isolation**
To ensure module-level isolation in modular monoliths, use the `KeyPrefixForIsolation` setting. This prevents
cross-class-library cache access.

```csharp
options.KeyPrefixForIsolation = KeyPrefix.AssemblyNamePrefix;
```

**Note:** Even if you don't use key prefixing, you still need to provide the class as a generic type (`T`) when using
`IRateLimitService`. The generic type `T` is used to retrieve the assembly name, which is important for key isolation. If
you choose not to prefix keys by assembly name, this type is still required but will be ignored in the actual
implementation.

### 2. Cached Entity Preparation
Create your cache entity/model, which you will use with the cache service:
```csharp
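// Cache models implement the library's ICacheEntity; [MessagePackObject] and [Key(n)] drive MessagePack serialization.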
[MessagePackObject]
public class TestCacheEntity : ICacheEntity
{
   [Key(0)] public string Name { get; set; } = "Bob";
   [Key(1)] public int Age { get; set; } = 15;
   [Key(2)] public DateTime CreatedAt { get; set; } = DateTime.Now;
}
```

### 3. Injecting ICacheService
Use `ICacheService<T>` in your services to interact with the cache:
```csharp
public class CacheTestsService(ICacheService<TestCacheEntity> cacheService)
{
   public async Task GetFromCache(CancellationToken token = default)
   {
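      // GetOrCreateAsync(key, factory, expiration, tags, token) returns the cached entry if present; otherwise it
      // runs the factory once (with stampede protection) and caches the result under the given tags.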
      await cacheService.GetOrCreateAsync("test",
         async _ => await GetFromPostgres(token),
         TimeSpan.FromMinutes(1),
         ["test"],
         token);

      await cacheService.GetOrCreateAsync("test2",
         async _ => await GetFromPostgres(token),
         TimeSpan.FromMinutes(1),
         ["vazgen"],
         token);

      await cacheService.GetOrCreateAsync("test3",
         async _ => await GetFromPostgres(token),
         TimeSpan.FromMinutes(1),
         ["test", "vazgen"],
         token);
   }

   public async Task DeleteCache(CancellationToken token = default)
   {
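      // Removes every entry tagged "test": in the example above that clears "test" and "test3" but not "test2".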
      await cacheService.RemoveByTagAsync("test", token);
   }

   public async Task<TestCacheEntity> GetFromPostgres(CancellationToken token)
   {
      Console.WriteLine("Fetching from PostgreSQL");
      await Task.Delay(500, token);
      return new TestCacheEntity();
   }
}
```

### 4. Interface Methods
```csharp
namespace DistributedCache.Services.Interfaces;///
/// Interface for cache service operations.
///
/// The type of the cache entity.
public interface ICacheService where T : class
{
///
/// Gets or creates a cache entry asynchronously.
///
/// The key of the cache entry.
/// A factory function to create the cache entry if it does not exist.
/// Optional expiration time for the cache entry.
/// Optional tags associated with the cache entry.
/// Cancellation token.
/// A task representing the asynchronous operation, with the cache entry as the result.
ValueTask GetOrCreateAsync(string key, Func> factory,
TimeSpan? expiration = null, IReadOnlyCollection? tags = null, CancellationToken token = default);///
/// Gets a cache entry asynchronously.
///
/// The key of the cache entry.
/// Cancellation token.
/// A task representing the asynchronous operation, with the cache entry as the result if found; otherwise, null.
ValueTask GetAsync(string key, CancellationToken token = default);///
/// Sets a cache entry asynchronously.
///
/// The key of the cache entry.
/// The value of the cache entry.
/// Optional expiration time for the cache entry.
/// Optional tags associated with the cache entry.
/// Cancellation token.
/// A task representing the asynchronous operation.
ValueTask SetAsync(string key, T value, TimeSpan? expiration = null, IReadOnlyCollection? tags = null,
CancellationToken token = default);///
/// Removes a cache entry by key asynchronously.
///
/// The key of the cache entry to remove.
/// Cancellation token.
/// A task representing the asynchronous operation.
ValueTask RemoveByKeyAsync(string key, CancellationToken token = default);///
/// Removes multiple cache entries by their keys asynchronously.
///
/// The keys of the cache entries to remove.
/// Cancellation token.
/// A task representing the asynchronous operation.
ValueTask RemoveByKeysAsync(IEnumerable keys, CancellationToken token = default);///
/// Removes cache entries associated with a tag asynchronously.
///
/// The tag associated with the cache entries to remove.
/// Cancellation token.
/// A task representing the asynchronous operation.
///
/// If multiple tags are specified, any entry matching any one of the tags will be removed. This means tags are treated as an "OR" condition.
///
ValueTask RemoveByTagAsync(string tag, CancellationToken token = default);///
/// Removes cache entries associated with multiple tags asynchronously.
///
/// The tags associated with the cache entries to remove.
/// Cancellation token.
/// A task representing the asynchronous operation.
///
/// If multiple tags are specified, any entry matching any one of the tags will be removed. This means tags are treated as an "OR" condition.
///
ValueTask RemoveByTagsAsync(IEnumerable tags, CancellationToken token = default);
}
```### 5. Rate Limiting
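For completeness, here is a minimal sketch of the remaining `SetAsync`/`GetAsync` members, reusing `TestCacheEntity` from section 2. The class name, methods, key, tag, and expiration below are illustrative only and not part of the library:

```csharp
// Illustrative consumer of the interface above; "profile:bob" and "profiles" are arbitrary example values.
public class ProfileCacheService(ICacheService<TestCacheEntity> cacheService)
{
   public async Task SaveAsync(TestCacheEntity entity, CancellationToken token = default)
   {
      // Store under an explicit key with a 10-minute expiration and a "profiles" tag.
      await cacheService.SetAsync("profile:bob", entity, TimeSpan.FromMinutes(10), ["profiles"], token);
   }

   public async Task<TestCacheEntity?> LoadAsync(CancellationToken token = default)
   {
      // Returns null when the key is absent or has expired.
      return await cacheService.GetAsync("profile:bob", token);
   }
}
```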
### 5. Rate Limiting

Implement rate limiting using `IRateLimitService` and `RateLimitConfiguration`.
**Define Rate Limiting Configuration**
```csharp
public enum ActionType //your business logic actions
{
   SmsForTfa = 1,
   EmailForTfa = 2
}

public static class RateLimitingConfigurations // your shared rate limiting configuration
{
   public static RateLimitConfiguration GetSmsConfig()
   {
      return new RateLimitConfiguration
      {
         ActionType = (int)ActionType.SmsForTfa,
         MaxAttempts = 2,
         TimeToLive = TimeSpan.FromSeconds(10)
      };
   }
}
```

**Implement Rate Limiting in the service**
```csharp
using DistributedCache.Dtos;
using DistributedCache.Services.Interfaces;

public class SendSmsService(IRateLimitService rateLimitService)
{
   public async Task SendSms(CancellationToken cancellationToken = default)
   {
      var phoneNumber = "1234567890";
      var rateLimitConfiguration = RateLimitingConfigurations.GetSmsConfig().SetIdentifiers(phoneNumber);

      var rateLimitState = await rateLimitService.RateLimitAsync(rateLimitConfiguration, cancellationToken);
   }
}
```

Based on the rate limit state, you can throw an exception, return HTTP 429 (Too Many Requests), or proceed with the business logic.
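Purely as an illustration of that decision point (not the library's API: `RateLimitOutcome` and `IsAllowed` below are hypothetical placeholders for whatever `RateLimitAsync` actually returns), the mapping to HTTP 429 could look like this:

```csharp
using Microsoft.AspNetCore.Http;

// Hypothetical stand-in for the real rate limit state returned by RateLimitAsync.
public sealed record RateLimitOutcome(bool IsAllowed);

public static class RateLimitResponses
{
   // Map a denied attempt to HTTP 429 Too Many Requests; return null to let the request proceed.
   public static IResult? ToRejection(RateLimitOutcome outcome) =>
      outcome.IsAllowed ? null : Results.StatusCode(StatusCodes.Status429TooManyRequests);
}
```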
## Enforced MessagePack Serialization
`Pandatech.DistributedCache` enforces the use of MessagePack serialization for several compelling reasons:
1. **Performance:** MessagePack is significantly faster compared to other serialization formats. For example, benchmarks
show that MessagePack can be up to 4 times faster than JSON and 1.5 times faster than Protobuf in terms of
serialization and deserialization speed.
2. **Compact Size:** MessagePack produces smaller payloads, which results in lower memory usage and faster data transfer
over the network. On average, MessagePack serialized data is about 50% smaller than JSON and 20-30% smaller than
Protobuf.
3. **Human Readability in Tools:** Many Redis clients, such as Another Redis Desktop Manager, can display MessagePack
serialized data as JSON, making it easier for developers to inspect and debug the cache content.
4. **Simplicity:** By enforcing a single serialization format, we avoid the complexity and potential issues that can
arise from supporting multiple serializers. This decision simplifies the implementation and ensures consistent
behavior across different parts of the application.

Given these benefits, a serializer override is not provided, as MessagePack meets the performance and usability needs effectively.
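To make point 3 concrete, here is a small standalone sketch (not part of the library) that round-trips the `TestCacheEntity` from section 2 using the MessagePack-CSharp package directly and prints the JSON projection that MessagePack-aware Redis GUIs typically show:

```csharp
using System;
using MessagePack;

var entity = new TestCacheEntity { Name = "Bob", Age = 15, CreatedAt = DateTime.UtcNow };

// Serialize with the same format the cache enforces and inspect the payload size.
byte[] payload = MessagePackSerializer.Serialize(entity);
Console.WriteLine($"Serialized size: {payload.Length} bytes");

// Many Redis clients render MessagePack payloads as this JSON view.
Console.WriteLine(MessagePackSerializer.ConvertToJson(payload));

// Deserialize back into the strongly-typed entity.
var roundTripped = MessagePackSerializer.Deserialize<TestCacheEntity>(payload);
Console.WriteLine($"{roundTripped.Name}, {roundTripped.Age}");
```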
**Benchmark Comparison**
-------------------------

| Format      | Serialization Speed   | Deserialization Speed | Serialized Size |
|-------------|-----------------------|-----------------------|-----------------|
| MessagePack | 4x faster than JSON | 3x faster than JSON | ~50% of JSON |
| Protobuf | 1.5x faster than JSON | 1.2x faster than JSON | ~70% of JSON |
| JSON        | Baseline              | Baseline              | Baseline        |

## Acknowledgements
Inspired by Microsoft's .NET 9 `HybridCache` and leveraging the power of `StackExchange.Redis`. `HybridCache` is in a
preview state and is not recommended for production use. The main difference is that `HybridCache` is too general and
uses L1 + L2 caching instead of only L2 caching.

Once `HybridCache` becomes stable, mature, and feature-rich, we will consider migrating to it with backward
compatibility.

## License
Pandatech.DistributedCache is licensed under the MIT License.