{"id":16482214,"url":"https://github.com/cajuncoding/lazycachehelpers","last_synced_at":"2025-08-14T04:24:13.987Z","repository":{"id":65591276,"uuid":"181783535","full_name":"cajuncoding/LazyCacheHelpers","owner":"cajuncoding","description":"An easy to use cache implementation for all layers of an application with support for both Sync \u0026 Async Lazy\u003cT\u003e initialization.","archived":false,"fork":false,"pushed_at":"2023-11-16T19:52:43.000Z","size":180,"stargazers_count":2,"open_issues_count":0,"forks_count":1,"subscribers_count":2,"default_branch":"main","last_synced_at":"2025-07-17T10:57:49.031Z","etag":null,"topics":["async","async-await","async-cache","blocking-cache","cache","caching","csharp","csharp-library","in-memory-caching","lazy","lazy-loading","self-populating-cache"],"latest_commit_sha":null,"homepage":"https://github.com/cajuncoding/LazyCacheHelpers","language":"C#","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/cajuncoding.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null}},"created_at":"2019-04-16T23:43:11.000Z","updated_at":"2022-11-16T19:43:07.000Z","dependencies_parsed_at":"2023-11-16T21:13:50.632Z","dependency_job_id":"f7b41f0c-b622-4db7-96a8-080fee675a42","html_url":"https://github.com/cajuncoding/LazyCacheHelpers","commit_stats":null,"previous_names":[],"tags_count":9,"template":false,"template_full_name":null,"purl":"pkg:github/cajuncoding/LazyCacheHelpers","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cajuncoding%2FLazyCacheHelpers","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/G
itHub/repositories/cajuncoding%2FLazyCacheHelpers/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cajuncoding%2FLazyCacheHelpers/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cajuncoding%2FLazyCacheHelpers/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/cajuncoding","download_url":"https://codeload.github.com/cajuncoding/LazyCacheHelpers/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/cajuncoding%2FLazyCacheHelpers/sbom","host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":270360673,"owners_count":24570760,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2022-07-04T15:15:14.044Z","status":"online","status_checked_at":"2025-08-14T02:00:10.309Z","response_time":75,"last_error":null,"robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":true,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["async","async-await","async-cache","blocking-cache","cache","caching","csharp","csharp-library","in-memory-caching","lazy","lazy-loading","self-populating-cache"],"created_at":"2024-10-11T13:10:01.990Z","updated_at":"2025-08-14T04:24:13.959Z","avatar_url":"https://github.com/cajuncoding.png","language":"C#","readme":"# LazyCacheHelpers\nA very lightweight Library for leveraging the power of Lazy\u003cT\u003e for caching at all layers of an application with support for both \nSync \u0026 Async Lazy operations to maximize server utilization and performance!\n\n### Give Star 🌟\n**If 
you like this project and/or use it, then please give it a Star 🌟 (c'mon it's free, and it'll help others find the project)!**\n\n### [Buy me a Coffee ☕](https://www.buymeacoffee.com/cajuncoding)\n*I'm happy to share with the community, but if you find this useful (e.g. for professional use), and are so inclined,\nthen I do love-me-some-coffee!*\n\n\u003ca href=\"https://www.buymeacoffee.com/cajuncoding\" target=\"_blank\"\u003e\n\u003cimg src=\"https://cdn.buymeacoffee.com/buttons/default-orange.png\" alt=\"Buy Me A Coffee\" height=\"41\" width=\"174\"\u003e\n\u003c/a\u003e \n\n## Overview\nThe use of `Lazy\u003cT\u003e` for loading/initializing of data facilitates a self-populating cache (also known as \na blocking cache), so that even if many requests for the same cached data are triggered at the exact same \ntime, no more than one thread/request (sync or async) will ever perform the work -- dramatically decreasing \nserver utilization under high load.\n\nLazyCacheHelpers also supports changing the underlying Cache Repository with different implementations via the \n`ILazyCacheRepository` interface. A default implementation based on the .NET `MemoryCache` is \nprovided (via the default ICacheRepository implementation, LazyDotNetMemoryCacheRepository) \nalong with greatly simplified support for self-populating (Lazy) initialization.\n\nThe default MemoryCache based implementation will work for the vast majority of medium or small projects; but this flexibility makes\nmigrating to distributed caches and other cache storage mechanisms easier in the future.\n\nTo clarify, what this means is that if many requests for the cached data are submitted at or near the same time \nthen one-and-only-one call (thread) will execute the long running process while all other requests will benefit from \nthe resulting loaded data immediately, once it is ready. 
For example, if the long running process takes 3 seconds to complete \nand 10 more requests come in after 2 seconds, then all of the new requests will benefit from the performance of \nthe self-populating/blocking cache after waiting for only 1 second! Yielding higher performance and lower server utilization!\n\nThis behavior becomes much more valuable as the load increases, and especially for processes\nthat can take exorbitant amounts of time (10 seconds, 30 seconds, etc.)!\n\nThis library provides a completely thread-safe cache with Lazy loading/initialization capability in an easy to use\n implementation that can work at all levels of an application (classes, controllers, repositories, \n data layer, business layer, etc.).\n\n#### LazyCacheConfig Static Class\nA set of static helpers that make working with configuration in various helpful scenarios much easier. \nIt provides a mechanism to read Cache TTL values from configuration safely and easily. It helps when you want\nto implement cache configurations that might fall back to general cache settings if very specific ones aren't defined, etc.\n\nThe key to using these is that you define (one time at app startup) the static configuration reader delegate function\nthat, when given a cache key `string`, returns the cache TTL in seconds (`string`; will be converted to `int` automatically).\nThis allows you to control where config values are read from without having to implement all of the other helpful\nlooping for fallback values, etc. 
manually.\n\n#### LazyCacheHelpers.ConfigurationManagement\nThis is an extension package for the LazyCacheHelpers Library that provides an easy to use bootstrap method to initialize\nthe LazyCacheConfig class to read cache configuration using ConfigurationManager.AppSettings.\n\nTo use the ConfigurationManager you just have to run this, in your application startup, to initialize the config reader func/delegate:\n`LazyCacheConfigurationManager.BootstrapConfigurationManager();`\n\n#### Breaking Change for Reading Config values:\nTo improve compatibility, the built-in helpers for processing AppSettings configuration values and the LazyCacheConfig class\nneed to be initialized in your application startup process (app root) by defining the config reader func, which\ncan be easily provided as a lambda to read the config from any source you like:\n```csharp\nLazyCacheConfig.BootstrapConfigValueReader(configKeyName =\u003e {\n\t\n\t... read your config value and return it as a string, or null if not found/undefined ...\n\n});\n```\n\nOR use the ConfigurationManager Bootstrap helper above if you still use (Deprecated) AppSettings files; \nit'll wire up a default reader for you (see below)!\n\n## Release Notes:\n### v1.3.2\n- Add support to specify a custom key comparer (e.g. StringComparer.OrdinalIgnoreCase) in LazyStaticInMemoryCache.\n\n### v1.3.1\n- Add support for Self Expiring cache results where the value factory may now return the CachePolicy/Cache TTL/etc. 
along with the Cache Result.\n  - This is ideal when the cache TTL is not known ahead of time in use cases such as external API results that also return a lifespan for the data \n      such as Auth API Tokens, etc.\n  - This should not be a breaking change as this support was added via net-new interfaces `ILazyCacheHandlerSelfExpiring\u003cTValue\u003e` \n\t  and `ILazySelfExpiringCacheResult\u003cTValue\u003e`.\n  - This can be easily invoked by not passing the CachePolicy initially and instead using the new convenience method(s) to return \n\t  your result from the value factory (e.g. `LazySelfExpiringCacheResult.From(result, secondsTTL)`), which will then invoke the new overload \n      as appropriate (*See new Example below*).\n- Added support to now easily inject/bootstrap the DefaultLazyCache static implementation with your own ILazyCacheRepository, eliminating the need to have your own\nstatic implementation if you don't want to duplicate it; though encapsulating in your own static facade is usually a good idea.\n- Implemented IDisposable support for existing LazyCacheHandler and LazyCacheRepositories to support better cleanup of resources.\n\n### v1.2\n- Add support for Clearing the Lazy Static In-memory Cache\n  - *The wrapper for the Lazy caching pattern for sync or async results based on the underlying ConcurrentDictionary\u003cLazy\u003cT\u003e\u003e.*\n\n### v1.1\n - Add support for Clearing the Cache and for getting the Cache Count; implemented for DefaultLazyCache as well as the Static In-memory caches.\n\n### v1.0.4\n- Restored LazyCacheConfig class capabilities for reading values dynamically from Configuration.\n- Added support for Bootstrapping a Configuration Reader Func (Delegate) so that all reading of config values from the keys is completely dynamic now.\n- Updated LazyCacheHelpers.ConfigurationManager library to now provide the Bootstrap method to initialize reading from AppSettings.\n\n### v1.0.3\n- Added new 
`LazyStaticInMemoryCache\u003c\u003e` class to make it easy to dynamically cache data, that rarely or never changes, in memory \nin a high performance blocking (self-populating) cache using Lazy\u003c\u003e as the backing mechanism. This works similarly to\na normal `ConcurrentDictionary` but automatically wraps and unwraps all delegates in a Lazy\u003c\u003e to greatly simplify and enable\nthe use of this pattern more often and in more places.\n   - It contains support for both Sync and Async value factories for those expensive I/O calls.\n   - Example use cases that benefit greatly from this are the often very expensive logic that loads data from \n   Reflection or reading values/configuration from Annotation Attributes; this helps mitigate the impact to runtime execution. \n\n### v1.0.2\n- Refactored as a .Net Standard v2.0 compatible Library for greater compatibility.\n- Removed dependency on `System.Configuration`; because of this there is a **breaking change** in the LazyCachePolicy helper overloads that dynamically read values from AppSettings; but it is isolated to those helper overloads only. 
\n  - The helpers can be restored by adding the LazyCacheHelpers.ConfigurationManagement extension package and renaming all calls to the LazyCachePolicy static helper to now use the LazyCachePolicyFromConfig static helper.\n- Now fully supported as a .Net Standard 2.0 library (sans Configuration reader helpers) whereby you can specify the timespan directly for Cache Policy initialization.\n\n### v1.0.0.1\n - Initial nuget release for .Net Framework.\n\n## Nuget Package\nTo use these behaviors in your project, add the [LazyCacheHelpers NuGet package](https://www.nuget.org/packages/LazyCacheHelpers/) to your project.\n\nTo use the powerful helpers for dynamically reading Cache Policy TTL values from AppSettings, add the [LazyCacheHelpers.ConfigurationManagement NuGet package](https://www.nuget.org/packages/LazyCacheHelpers.ConfigurationManager/) to your project.\n - Extension package for the LazyCacheHelpers Library to provide easy to use helpers to read cache configuration\nvalues from App.Config or Web.config files using System.Configuration; making things like enabling/disabling and dynamic fallback from specialized to generalized config values much easier to implement.\n\n## Usage of LazyCache\u003c\u003e with Cache Policies for Data that changes:\nIt's as easy as . . .\n\n_**Cache Keys as Strings vs Objects:**_\n\n - *NOTE:* The following examples are very simple and therefore use Strings to construct cache keys. However, this\ncan quickly result in duplication of string values. 
While you can implement your own helpers to create the keys\nand eliminate the duplication, it's a good practice to have `Cache Params` objects that implement the \n`ILazyCacheKey` (meaning it can generate a key) and the `ILazyCachePolicy` interfaces (meaning it can generate a cache policy);\nit's common that the same cache params class would implement both and be passed in for both params.\n\n\n### Synchronous Caching Example:\n```csharp\nprivate static readonly TimeSpan CacheFiveMinutesTTL = TimeSpan.FromMinutes(5);\n\nprivate ComplexData GetComplexData(string variable)\n{\n\treturn DefaultLazyCache.GetOrAddFromCache($\"CacheKey::{variable}\", \n\t\t() =\u003e {\n\t\t\t//Do any work you want here, or call other services/helpers/etc...\n\t\t\treturn BuildVeryComplexData(variable);\n\t\t},\n\t\tLazyCachePolicy.NewAbsoluteExpirationPolicy(CacheFiveMinutesTTL)\n\t);\n}\n```\n\n### Asynchronous Caching Example:\n```csharp\nprivate static readonly TimeSpan CacheFiveMinutesTTL = TimeSpan.FromMinutes(5);\n\nprivate async Task\u003cComplexData\u003e GetComplexDataAsync(string variable)\n{\n\treturn await DefaultLazyCache.GetOrAddFromCache($\"CacheKey::{variable}\", \n\t\tasync () =\u003e {\n\t\t\t//Do any Async work you want here, or call other services/helpers/etc...\n\t\t\treturn await BuildVeryComplexDataAsync(variable);\n\t\t},\n\t\tLazyCachePolicy.NewAbsoluteExpirationPolicy(CacheFiveMinutesTTL)\n\t);\n}\n```\n\n### Example of Caching \"Self-expiring\" Results:\nWhen the logic that generates the cached value also yields the optimal cache expiration time, you can now return\nthat along with the cache result to optimize the caching policy for these \"self-expiring\" cache results:\n\nNOTE: The syntax is similar for both sync \u0026 async approaches but this example is Async (as most use cases for this will\nlikely be an async I/O request).\n\n```csharp\nprivate async Task\u003cComplexData\u003e GetComplexDataAsync(string variable)\n{\n\treturn await 
DefaultLazyCache.GetOrAddFromCache($\"CacheKey::{variable}\", \n\t\tasync () =\u003e {\n\t\t\t//Do any Async work you want here, or call other services/helpers/etc...\n\t\t\tvar complexResult = await BuildVeryComplexDataAsync(variable);\n\n\t\t\t//Assuming that our complex result also knows how long the data is valid for (e.g. its TTL)\n\t\t\tvar secondsTTL = complexResult.GetDataTTLSeconds();\n\t\t\t\n\t\t\t//Either new up or use the convenience methods to create \u0026 return \n\t\t\t//\ta valid ILazySelfExpiringCacheResult\u003cTValue\u003e...\n\t\t\treturn LazySelfExpiringCacheResult.From(complexResult, secondsTTL);\n\t\t}\n\t);\n}\n```\n\n## Usage of LazyStaticInMemoryCache\u003c\u003e for static (non-expiring) in-memory caching of data that rarely or never changes:\n\nThe new `LazyStaticInMemoryCache\u003c\u003e` class makes it much easier to implement a lazy loading, blocking, in-memory cache of \ndata that rarely or never changes. This enables the use of caching patterns much more often with less code to maintain,\nwhile also making the code easier to reason about.  It also contains support for both Sync and Async value factories\nfor those expensive I/O processes that initialize data that rarely or never changes.\n\nNOTE: The significant difference between this and the above more robust caching feature is that this does not automatically \nprovide for any reclaiming of resources by garbage collection, etc. unless manually implemented via `WeakReference` yourself.\n\nNOTE: It supports basic removal, but the `LazyStaticInMemoryCache\u003c\u003e` provides a pattern (of Lazy + ConcurrentDictionary) that is \nbest used for data that never changes once it is loaded/initialized (e.g. Reflection Results, Annotation Attribute cache, etc.).  
\nIn almost all cases for data that changes over its life, the LazyCache\u003c\u003e above with support for cache expiration policy is the \nbetter pattern to use along with its intrinsic support of garbage collection pressure to reclaim resources.\n\nA key example of the value of this is the, often expensive, loading of data from Reflection or reading values/configuration \nfrom Annotation Attributes; whereby this pattern mitigates negative performance impacts at runtime.\n\n```csharp\npublic class AttributeConfigReader\n{\n\t[AttributeUsage(AttributeTargets.Class)]\n\tprivate class ConfigAttribute : Attribute\n\t{\n\t\t//. . . implement your design time configuration properties . . . \n\t}\n\n\t//By making the cache static it is now a global and thread-safe blocking cache; enabling only \n\t//  one thread ever to complete the work, while any/all other threads/requests can benefit \n\t//  immediately when the work is completed!\n\t//NOTE: We are able to simply use the Class Type as the cache lookup Key.\n\tprivate static LazyStaticInMemoryCache\u003cType, Attribute\u003e _lazyAttribConfigCache = new LazyStaticInMemoryCache\u003cType, Attribute\u003e();\n\n\tpublic ConfigAttribute ReadConfigAttributeForClass\u003cT\u003e(T classWithConfig) where T : class\n\t{\n\t\t//Now using the static cache is very simple, following the same pattern as a ConcurrentDictionary\u003c\u003e\n\t\t// but without the need to apply the Lazy\u003c\u003e wrapper manually every time the pattern is implemented!\n\t\t//NOTE: The process in the value factory may implement any expensive processing including but not \n\t\t//      limited to the use of Reflection to get access to values, an Attribute, or any \n\t\t//      other expensive operation...\n\t\t//NOTE: Because this is a Lazy loaded blocking cache you don't provide the value, you instead\n\t\t//      provide a value factory method that will be executed to return and initialize the value.\n\t\t//      The key concept here is that the 
logic will only ever be executed at-most one time, no matter\n\t\t//      how many or how fast multiple (e.g. hundreds/thousands) threads/requests come in for that same data!\n\t\t//NOTE: Exception handling is critical here -- because Lazy\u003c\u003e will cache the Exception -- and \n\t\t//\t\tthis class ensures that exceptions are never cached!\n\t\tvar cacheResult = _lazyAttribConfigCache.GetOrAdd(typeof(T), (typeKey) =\u003e\n\t\t{\n\t\t\t//NOTE: If an Exception occurs then the result will not be cached, only valid values\n\t\t\t//      will be cached (e.g. a safe response of null will be cached).\n\t\t\tvar configAttribute = GetConfigAttributeInternal(typeKey);\n\t\t\treturn configAttribute;\n\t\t});\n\n\t\treturn cacheResult as ConfigAttribute;\n\t}\n\n\t//Helper method to make the above code more readable, but as an internal private method\n\t// is not used outside the cache initialization/loading process.\n\t//NOTE: This is the expensive work that we would not want to run on every call to get a Class's\n\t//      configuration due to the Reflection invocations at runtime.\n\tprivate ConfigAttribute GetConfigAttributeInternal(Type classType)\n\t{\n\t\tvar configAttribute = Attribute.GetCustomAttribute(classType, typeof(ConfigAttribute)) as ConfigAttribute;\n\t\treturn configAttribute;\n\t}\n}\n\n```\n\n\n```\n/*\nMIT License\n\nCopyright (c) 2018\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY 
KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.\n*/\n```\n","funding_links":["https://www.buymeacoffee.com/cajuncoding"],"categories":[],"sub_categories":[],"project_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcajuncoding%2Flazycachehelpers","html_url":"https://awesome.ecosyste.ms/projects/github.com%2Fcajuncoding%2Flazycachehelpers","lists_url":"https://awesome.ecosyste.ms/api/v1/projects/github.com%2Fcajuncoding%2Flazycachehelpers/lists"}