https://github.com/gbrlsnchs/filecache
Fast in-memory file caching for Go :zap:
- Host: GitHub
- URL: https://github.com/gbrlsnchs/filecache
- Owner: gbrlsnchs
- License: MIT
- Created: 2018-08-23T23:49:14.000Z (almost 7 years ago)
- Default Branch: master
- Last Pushed: 2018-10-24T18:09:58.000Z (over 6 years ago)
- Last Synced: 2025-02-07T13:54:06.390Z (5 months ago)
- Topics: cache, context-aware, fast, file, in-memory, in-memory-caching, radix-tree
- Language: Go
- Homepage:
- Size: 45.9 KB
- Stars: 3
- Watchers: 3
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- Changelog: CHANGELOG.md
- License: LICENSE
README
# filecache (In-memory file caching using Go)
[Build Status](https://travis-ci.org/gbrlsnchs/filecache)
[Sourcegraph](https://sourcegraph.com/github.com/gbrlsnchs/filecache?badge)
[GoDoc](https://godoc.org/github.com/gbrlsnchs/filecache)
[Go 1.10](https://golang.org/doc/go1.10)

## About
This package recursively walks through a directory and caches every file whose name matches a regexp into a [radix tree](https://en.wikipedia.org/wiki/Radix_tree). Since it spawns one goroutine for each file / directory lookup, it is also context-aware, allowing the whole process to return early when the context is done.
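
To get a feel for that flow end to end, here is a minimal sketch that uses only the calls shown in the usage sections below (`ReadDir`, `Get`, `Len`, `Size`); the directory and file names are placeholders.

```go
package main

import (
	"log"

	"github.com/gbrlsnchs/filecache"
)

func main() {
	// Recursively walk "templates" and cache only files whose names end in ".tmpl".
	c, err := filecache.ReadDir("templates", `\.tmpl$`)
	if err != nil {
		log.Fatal(err)
	}

	// Look a cached file up by name, as in the examples below.
	log.Print(c.Get("welcome.tmpl"))
	log.Printf("%d files cached, %d bytes total", c.Len(), c.Size())
}
```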
## Usage
Full documentation [here](https://godoc.org/github.com/gbrlsnchs/filecache).

### Installing
#### Go 1.10
`vgo get -u github.com/gbrlsnchs/filecache`
#### Go 1.11 or after
`go get -u github.com/gbrlsnchs/filecache`

### Importing
```go
import (
// ..."github.com/gbrlsnchs/filecache"
)
```

### Reading all files in a directory
```go
c, err := filecache.ReadDir("foobar", "")
if err != nil {
	// If err != nil, the directory "foobar" doesn't exist, or one of the files
	// inside it was deleted during the read.
}
txt := c.Get("bazqux.txt")
log.Print(txt)
```

### Reading specific files in a directory
```go
c, err := filecache.ReadDir("foobar", `\.sql$`)
if err != nil {
	// ...
}
q := c.Get("bazqux.sql")
log.Print(q)
log.Print(c.Len())  // number of cached files
log.Print(c.Size()) // total size in bytes
```

### Lazy-reading a directory
```go
c := filecache.New("foobar")

// do stuff...
if err := c.Load(`\.log$`); err != nil {
	// ...
}
```

### Setting a custom goroutine limit
By default, this package spawns one goroutine for each file inside each directory, and currently caps the number of concurrent goroutines at `runtime.NumCPU()`. However, it is possible to set a custom limit by using the `SetSemaphoreSize` method.
```go
c := filecache.New("foobar")
c.SetSemaphoreSize(100)
if err := c.Load(`\.log$`); err != nil {
	// ...
}
```

## Contributing
### How to help
- For bugs and opinions, please [open an issue](https://github.com/gbrlsnchs/filecache/issues/new)
- For pushing changes, please [open a pull request](https://github.com/gbrlsnchs/filecache/compare)