# cache
![Coverage](https://img.shields.io/badge/Coverage-100.0%25-brightgreen)

Cache is a thread-safe, generic, sharded, in-memory LRU cache.
This is achieved by partitioning values across many smaller LRU (least recently used) caches and interacting with those caches over channels. Each smaller cache maintains access to its own elements and communicates information back to the Cache object, which then responds to the original caller.

# behavior

### LRU backing caches

```go
type lruCache[K comparable, V any] struct {
    table map[K]*list.Node[KVPair[K, V]]
    list  *list.List[KVPair[K, V]]
    ...
    client  *client[K, V]
    evictFn func(k K, v V)
}
```

The LRU backing caches behave exactly like a normal LRU cache and are composed of a doubly linked list plus a map for key lookups. When an entry is added or accessed, it is pushed to the front of the list; when enough items are added to the cache, the oldest items (the back of the list) are evicted to make room for more recently used entries. The list implementation itself is a version ported from the Go standard library that uses type parameters instead of runtime interfaces; it can be found in the `/list` directory.

Each `*lruCache` spawns a single goroutine when `*lruCache.serve(ctx)` is called.

### client

```go
type client[K comparable, V any] struct {
    // GetChannel is a channel for retrieving values from the cache with which this client is associated
    GetChannel *RequestChannel[Request[K], GetResponse[K, V]]
    // PutChannel is a channel for placing values into the cache with which this client is associated
    PutChannel *RequestChannel[Request[KVPair[K, V]], struct{}]
    ...
}
```

The client abstraction contains a collection of channels by which it communicates with `*lruCache` objects.
This allows each `*lruCache` to run a single goroutine on which it listens for requests over these channels, processes them, and sends responses, without the need for any locking.

### Cache

```go
type Cache[K comparable, V any] struct {
    caches []*cache[K, V]
}
```

The Cache is the main interface to the underlying caches. This is the object you want to use if you have many objects being accessed or mutated across different goroutines. It spreads contention across LRU partitions and uses Go channels instead of mutexes for consistency. The underlying client for each cache is exposed for any fine-tuning that must be done: if your workload ends up pinning a few objects to one particular LRU because of how the hashing works out, leaving some LRUs full while others sit underutilized, you can manually place entries into the underutilized caches. You can also resize them; however, manually adding elements is not recommended while the top-level Cache interface is still in use.

### Note
All structures in this package are optimized for alignment.

# examples

For additional examples, see `cache_test.go` and `cache_benchmark_test.go`.

### Single Cache Partition

```go
package main

import (
    "context"
    "fmt"

    "github.com/alistanis/cache"
)

// Simple example shown with a concurrency of 1 in order to
// illustrate how the smaller LRU caches work.
func main() {
    ctx, cancel := context.WithCancel(context.Background())
    concurrency := 1
    lruCacheLimit := 5
    c := cache.New[int, int](ctx, lruCacheLimit, concurrency)
    defer c.Wait()

    c.Put(42, 42)
    c.Put(1, 1)
    c.Get(42)
    c.Put(0, 0)
    c.Put(2, 2)
    c.Put(3, 3)
    c.Get(42)
    // evicts 1, the least recently used entry
    c.Put(4, 4)

    c.Each(func(key int, val int) {
        fmt.Println(key)
    })

    cancel()
    // Output:
    // 4
    // 42
    // 3
    // 2
    // 0
}
```

### Many Cache Partitions

```go
package main

import (
    "context"
    "fmt"

    "github.com/alistanis/cache"
)

// General example for using the Cache object. Since elements are spread across many partitions,
// order cannot be guaranteed, and items will not be evicted in pure LRU terms; it is possible that
// some partitions may see more traffic than others and may be more eviction-heavy, but generally,
// access patterns amortize evenly.
func main() {
    ctx, cancel := context.WithCancel(context.Background())
    concurrency := 10 // use runtime.NumCPU() instead of 10 for actual use
    c := cache.New[int, int](ctx, 6, concurrency)
    defer c.Wait()

    fmt.Println(c.Meta().Len())
    fmt.Println(c.Meta().Cap())
    finished := make(chan struct{})
    go func() {
        for i := 0; i < 4*concurrency; i++ {
            c.Put(i, i)
        }
        finished <- struct{}{}
    }()

    go func() {
        for i := 8 * concurrency; i > 3*concurrency; i-- {
            c.Put(i, i)
        }
        finished <- struct{}{}
    }()

    <-finished
    <-finished

    for i := 0; i < 8*concurrency; i++ {
        v, found := c.Get(i)
        if !found {
            // get the value from a backing store, e.g.:
            // res := db.Query(...)
            // v = getValFromRes(res)
            // then put it back into the cache:
            // c.Put(i, v)
            v = 0
        } else if i != v {
            panic("uh oh")
        }
    }

    // we've put enough values into the cache that 10 partitions are filled with 6 elements each
    fmt.Println(c.Meta().Len())
    fmt.Println(c.Meta().Cap())
    // Output:
    // 0
    // 60
    // 60
    // 60
    cancel()
}
```

### Cache With Eviction Function/Cleanup

```go
package main

import (
    "context"
    "log"
    "os"

    "github.com/alistanis/cache"
)

func main() {
    ctx, cancel := context.WithCancel(context.Background())
    errC := make(chan error)

    // concurrency/partitions of 1 to guarantee LRU order,
    // size of 1 in order to demonstrate eviction.
    // The types of the cache elements can be inferred from the arguments to the eviction function.
    c := cache.WithEvictionFunction(ctx, 1, 1, func(s string, f *os.File) {
        _, err := f.Stat()
        if err != nil {
            errC <- err
            return
        }

        log.Printf("Closing file at path %s, fd: %d", s, f.Fd())

        errC <- f.Close()
    })

    defer c.Wait()
    defer cancel()

    d, err := os.MkdirTemp("", "")
    if err != nil {
        log.Fatal(err)
    }

    // clean up temp resources after main exits
    defer func(path string) {
        if err := os.RemoveAll(path); err != nil {
            log.Println(err)
        }
    }(d)

    // exit channel that will block until we're
    // finished collecting any/all errors
    exit := make(chan struct{})
    go func() {
        for e := range errC {
            if e != nil {
                log.Println(e)
            }
        }
        // signal that we're finished and can exit safely
        exit <- struct{}{}
    }()

    f, err := os.CreateTemp(d, "")
    if err != nil {
        log.Println(err)
        return
    }

    // first entry in the LRU
    c.Put(f.Name(), f)

    f2, err := os.CreateTemp(d, "")
    if err != nil {
        log.Println(err)
        return
    }

    // place f2 in the cache, evicting f and causing the eviction
    // function to fire, closing the file and logging:
    // 2022/04/13 07:31:47 Closing file at path /var/folders/q3/dt78p91s1b562lmq7qstllv00000gn/T/1705161844/1443821512, fd: 6
    c.Put(f2.Name(), f2)

    // now forcibly evict f2
    evicted := c.Evict()

    // 2022/04/13 07:31:47 Closing file at path /var/folders/q3/dt78p91s1b562lmq7qstllv00000gn/T/1705161844/767977656, fd: 7
    log.Println(evicted) // 1

    f, err = os.CreateTemp(d, "")
    if err != nil {
        log.Println(err)
        return
    }

    c.Put(f.Name(), f)
    // evict f again by resizing
    log.Println(c.Resize(0)) // 1

    // we're finished, so we can close the error channel
    close(errC)
    // wait until errors are processed and exit
    <-exit
}
```

# Benchmarks

### MacBook Air (M1, 2020), 16 GB RAM

```
go test -v -benchmem ./... -bench . -run=bench

goos: darwin
goarch: arm64
pkg: github.com/alistanis/cache
BenchmarkCache_IntInt_SingleThread
BenchmarkCache_IntInt_SingleThread/Put
BenchmarkCache_IntInt_SingleThread/Put-8         1442264               824.1 ns/op           110 B/op          6 allocs/op
BenchmarkCache_IntInt_SingleThread/Get
BenchmarkCache_IntInt_SingleThread/Get-8         1814316               662.7 ns/op            47 B/op          4 allocs/op
BenchmarkCache_IntInt_ParallelPut
BenchmarkCache_IntInt_ParallelPut-8              6645994               183.5 ns/op           110 B/op          6 allocs/op
BenchmarkCache_IntInt_ParallelGet
BenchmarkCache_IntInt_ParallelGet-8              8311953               138.8 ns/op            48 B/op          5 allocs/op
BenchmarkCache_StringString_SingleThread
BenchmarkCache_StringString_SingleThread/Put
BenchmarkCache_StringString_SingleThread/Put-8           1000000              1073 ns/op             209 B/op          9 allocs/op
BenchmarkCache_StringString_SingleThread/Get
BenchmarkCache_StringString_SingleThread/Get-8           1444695               828.3 ns/op            87 B/op          5 allocs/op
BenchmarkCache_StringString_ParallelPut
BenchmarkCache_StringString_ParallelPut-8                4905408               238.7 ns/op           209 B/op          9 allocs/op
BenchmarkCache_StringString_ParallelGet
BenchmarkCache_StringString_ParallelGet-8                6977521               170.2 ns/op            88 B/op          6 allocs/op
PASS
ok      github.com/alistanis/cache      12.475s
PASS
ok      github.com/alistanis/cache/list 0.093s
```

### MacBook Pro (M1 Pro, 16-inch, 2021), 16 GB RAM

```
go test -v -benchmem ./... -bench . -run=bench
goos: darwin
goarch: arm64
pkg: github.com/alistanis/cache
BenchmarkCache_IntInt_SingleThread
BenchmarkCache_IntInt_SingleThread/Put
BenchmarkCache_IntInt_SingleThread/Put-10        1431043               825.7 ns/op           110 B/op          6 allocs/op
BenchmarkCache_IntInt_SingleThread/Get
BenchmarkCache_IntInt_SingleThread/Get-10        1772635               673.3 ns/op            47 B/op          4 allocs/op
BenchmarkCache_IntInt_ParallelPut
BenchmarkCache_IntInt_ParallelPut-10             6866359               179.5 ns/op           110 B/op          6 allocs/op
BenchmarkCache_IntInt_ParallelGet
BenchmarkCache_IntInt_ParallelGet-10             8667046               138.0 ns/op            48 B/op          5 allocs/op
BenchmarkCache_StringString_SingleThread
BenchmarkCache_StringString_SingleThread/Put
BenchmarkCache_StringString_SingleThread/Put-10                  1000000              1094 ns/op             209 B/op          9 allocs/op
BenchmarkCache_StringString_SingleThread/Get
BenchmarkCache_StringString_SingleThread/Get-10                  1455924               822.9 ns/op            87 B/op          5 allocs/op
BenchmarkCache_StringString_ParallelPut
BenchmarkCache_StringString_ParallelPut-10                       4883151               280.0 ns/op           209 B/op          9 allocs/op
BenchmarkCache_StringString_ParallelGet
BenchmarkCache_StringString_ParallelGet-10                       6611814               190.0 ns/op            88 B/op          6 allocs/op
PASS
ok      github.com/alistanis/cache      13.023s
PASS
ok      github.com/alistanis/cache/list 0.156s
```

### MacBook Pro (Intel(R) Core(TM) i9-9980HK CPU @ 2.40GHz, 16-inch, 2019), 64 GB RAM

```
go test -v -benchmem ./... -bench . -run=bench
goos: darwin
goarch: amd64
pkg: github.com/alistanis/cache
cpu: Intel(R) Core(TM) i9-9980HK CPU @ 2.40GHz
BenchmarkCache_IntInt_SingleThread
BenchmarkCache_IntInt_SingleThread/Put
BenchmarkCache_IntInt_SingleThread/Put-16         794200              1452 ns/op             110 B/op          6 allocs/op
BenchmarkCache_IntInt_SingleThread/Get
BenchmarkCache_IntInt_SingleThread/Get-16         976036              1184 ns/op              47 B/op          4 allocs/op
BenchmarkCache_IntInt_ParallelPut
BenchmarkCache_IntInt_ParallelPut-16             6931814               177.8 ns/op           111 B/op          6 allocs/op
BenchmarkCache_IntInt_ParallelGet
BenchmarkCache_IntInt_ParallelGet-16             8706753               138.9 ns/op            48 B/op          5 allocs/op
BenchmarkCache_StringString_SingleThread
BenchmarkCache_StringString_SingleThread/Put
BenchmarkCache_StringString_SingleThread/Put-16                   586318              2015 ns/op             209 B/op          9 allocs/op
BenchmarkCache_StringString_SingleThread/Get
BenchmarkCache_StringString_SingleThread/Get-16                   860658              1434 ns/op              87 B/op          6 allocs/op
BenchmarkCache_StringString_ParallelPut
BenchmarkCache_StringString_ParallelPut-16                       5286390               227.2 ns/op           209 B/op          9 allocs/op
BenchmarkCache_StringString_ParallelGet
BenchmarkCache_StringString_ParallelGet-16                       6639519               162.4 ns/op            88 B/op          6 allocs/op
PASS
ok      github.com/alistanis/cache      10.423s
PASS
ok      github.com/alistanis/cache/list 0.111s
```

### Linux, Intel(R) Core(TM) i7-10750H CPU @ 2.60GHz

```
go test -v -benchmem ./... -bench . -run=bench
goos: linux
goarch: amd64
pkg: github.com/alistanis/cache
cpu: Intel(R) Core(TM) i7-10750H CPU @ 2.60GHz
BenchmarkCache_IntInt_SingleThread
BenchmarkCache_IntInt_SingleThread/Put
BenchmarkCache_IntInt_SingleThread/Put-12        1000000              1062 ns/op             110 B/op          6 allocs/op
BenchmarkCache_IntInt_SingleThread/Get
BenchmarkCache_IntInt_SingleThread/Get-12        1349502               886.8 ns/op            47 B/op          4 allocs/op
BenchmarkCache_IntInt_ParallelPut
BenchmarkCache_IntInt_ParallelPut-12             6455076               197.1 ns/op           110 B/op          6 allocs/op
BenchmarkCache_IntInt_ParallelGet
BenchmarkCache_IntInt_ParallelGet-12             6827888               168.3 ns/op            48 B/op          5 allocs/op
BenchmarkCache_StringString_SingleThread
BenchmarkCache_StringString_SingleThread/Put
BenchmarkCache_StringString_SingleThread/Put-12                   842470              1446 ns/op             209 B/op          9 allocs/op
BenchmarkCache_StringString_SingleThread/Get
BenchmarkCache_StringString_SingleThread/Get-12                  1000000              1075 ns/op              87 B/op          6 allocs/op
BenchmarkCache_StringString_ParallelPut
BenchmarkCache_StringString_ParallelPut-12                       4743643               269.0 ns/op           209 B/op          9 allocs/op
BenchmarkCache_StringString_ParallelGet
BenchmarkCache_StringString_ParallelGet-12                       5551136               206.4 ns/op            88 B/op          6 allocs/op
PASS
ok      github.com/alistanis/cache      11.210s
PASS
ok      github.com/alistanis/cache/list 0.002s
```