Ecosyste.ms: Awesome
An open API service indexing awesome lists of open source software.
https://github.com/maraisr/dldr
🧩 A tiny/fast dataloader implementation
- Host: GitHub
- URL: https://github.com/maraisr/dldr
- Owner: maraisr
- License: MIT
- Created: 2023-03-13T05:01:11.000Z (over 1 year ago)
- Default Branch: main
- Last Pushed: 2024-10-20T23:36:06.000Z (21 days ago)
- Last Synced: 2024-10-21T03:24:54.964Z (21 days ago)
- Topics: batch, dataloader
- Language: TypeScript
- Homepage:
- Size: 140 KB
- Stars: 29
- Watchers: 3
- Forks: 1
- Open Issues: 0
Metadata Files:
- Readme: readme.md
- License: license
README
# dldr [![licenses](https://licenses.dev/b/npm/dldr?style=dark)](https://licenses.dev/npm/dldr)
**A tiny utility for batching and caching operations**
This is free-to-use software, but if you do like it, consider supporting me ❤️
[![sponsor me](https://badgen.net/badge/icon/sponsor?icon=github&label&color=gray)](https://github.com/sponsors/maraisr)
[![buy me a coffee](https://badgen.net/badge/icon/buymeacoffee?icon=buymeacoffee&label&color=gray)](https://www.buymeacoffee.com/marais)

## ⚙️ Install
- **npm** — available as [`dldr`](https://www.npmjs.com/package/dldr)
- **JSR** — available as [`@mr/dataloader`](https://jsr.io/@mr/dataloader)

## 🚀 Usage
The default module will batch calls to your provided `loadFn` within the current tick.
Under the hood we schedule a function with
[`queueMicrotask`](https://developer.mozilla.org/en-US/docs/Web/API/queueMicrotask). That then calls
your `loadFn` with the unique keys that have been requested.

```ts
import { load } from 'dldr';

// ⬇️ define some arbitrary load method that accepts an array of keys
const getPosts = (keys) => sql`SELECT id, name FROM posts WHERE id IN (${keys})`;

// .. for convenience, you could bind
const loadPost = load.bind(null, getPosts);

// ⬇️ demo some collection that is built up over time.
const posts = [
	load(getPosts, '123'),
	loadPost('123'), // functionally equivalent to the above
	load(getPosts, '456'),
];

// ...
posts.push(load(getPosts, '789'));

// ⬇️ batch the load calls, and wait for them to resolve
const loaded = await Promise.all(posts);

expect(getPosts).toHaveBeenCalledWith(['123', '456', '789']);
expect(loaded).toEqual([
	{ id: '123', name: '123' },
	{ id: '123', name: '123' },
	{ id: '456', name: '456' },
	{ id: '789', name: '789' },
]);
```

### GraphQL Resolver Example
```ts
import { load } from 'dldr';
import { buildSchema, graphql } from 'graphql';

const schema = buildSchema(`
	type Query {
		me(name: String!): String!
	}
`);

const operation = `{
	a: me(name: "John")
	b: me(name: "Jane")
}`;

const results = await graphql({
	schema,
	source: operation,
	contextValue: {
		getUser: load.bind(null, async (names) => {
			// Assume you're calling out to a db or something
			const result = names.map((name) => name);
			// let's pretend this is a promise
			return Promise.resolve(result);
		}),
	},
	rootValue: {
		me: ({ name }, ctx) => {
			return ctx.getUser(name);
		},
	},
});
```

### Caching
Once a key has been loaded, it will be cached for all future calls.
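The batching described under Usage plus this caching behavior can be sketched in a few lines. This is a hypothetical illustration of the technique only, not dldr's actual source; `makeLoader` and `loadUpper` are invented names:

```typescript
// Hypothetical sketch of the technique (not dldr's implementation): collect keys
// during the current tick, flush them in one loadFn call via queueMicrotask,
// and cache each key's promise so repeat lookups do no further work.
type LoadFn<T> = (keys: string[]) => Promise<T[]>;
type Task<T> = { key: string; resolve: (v: T) => void; reject: (e: unknown) => void };

function makeLoader<T>(loadFn: LoadFn<T>, cache = new Map<string, Promise<T>>()) {
	let batch: Task<T>[] | null = null;

	return function load(key: string): Promise<T> {
		const hit = cache.get(key);
		if (hit) return hit; // cached key: reuse the existing promise

		if (!batch) {
			// first new key this tick: schedule one flush for the whole batch
			const tasks: Task<T>[] = (batch = []);
			queueMicrotask(() => {
				batch = null; // the next tick starts a fresh batch
				loadFn(tasks.map((t) => t.key)).then(
					(values) => tasks.forEach((t, i) => t.resolve(values[i])),
					(err) => tasks.forEach((t) => t.reject(err)),
				);
			});
		}
		const p = new Promise<T>((resolve, reject) => batch!.push({ key, resolve, reject }));
		cache.set(key, p);
		return p;
	};
}

// both lookups of 'a' resolve from one loadFn call with keys ['a', 'b']
const loadUpper = makeLoader(async (keys) => keys.map((k) => k.toUpperCase()));
```

dldr's real API is shown in the example below; the sketch only motivates why a cached key never triggers a second `loadFn` call.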
```ts
import { load } from 'dldr/cache';
import { getPosts } from './example';

// operates the same as the above, but will cache the results of the load method
const cache = new Map();
const loadPost = load.bind(null, getPosts, cache);
// note: the cache is optional, and will be created if not provided

const loaded = await Promise.all([
	load(getPosts, cache, '123'),
	loadPost('123'), // will be cached, and functionally equivalent to the above
	loadPost('456'),
]);

expect(getPosts).toHaveBeenCalledTimes(1);
expect(getPosts).toHaveBeenCalledWith(['123', '456']);
expect(loaded).toEqual([
	{ id: '123', name: '123' },
	{ id: '123', name: '123' },
	{ id: '456', name: '456' },
]);

// ⬇️ the cache will be used for subsequent calls
const post = await loadPost('123');

expect(getPosts).toHaveBeenCalledTimes(1); // still once
expect(post).toEqual({ id: '123', name: '123' });
```

### API
#### Module: `dldr`
The main entry point to start batching your calls.
```ts
function load<T>(
	loadFn: (keys: string[]) => Promise<(T | Error)[]>,
	key: string,
	identityKey?: string,
): Promise<T>;
```

> **Note** Might be worth calling `.bind` if you don't want to pass your loader everywhere.
>
> ```js
> const userLoader = load.bind(null, getUsers);
>
> await userLoader('123');
> ```

#### Module: `dldr/cache`
A submodule that will cache the results of your `loadFn` between ticks.
```ts
function load<T>(
	loadFn: (keys: string[]) => Promise<(T | Error)[]>,
	cache: MapLike<string, Promise<T>> | undefined,
	key: string,
	identityKey?: string,
): Promise<T>;
```

> A default `Map`-based `cache` will be used if you don't provide one.
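Because any `Map`-like object is accepted, one common pattern is creating a fresh cache per unit of work (for example, per request) so entries never leak between users. The sketch below is hypothetical: it stubs a simplified `load` (per-key, no batching) purely so the example is self-contained, and `getUsers`/`handleRequest` are invented names:

```typescript
// Simplified stand-in for dldr/cache's load(loadFn, cache, key): no batching,
// just the per-key promise cache, so this sketch runs without dldr installed.
function load<T>(
	loadFn: (keys: string[]) => Promise<T[]>,
	cache: Map<string, Promise<T>>,
	key: string,
): Promise<T> {
	let p = cache.get(key);
	if (!p) {
		p = loadFn([key]).then(([v]) => v);
		cache.set(key, p);
	}
	return p;
}

// hypothetical loader: resolves each id to a user-like object
const getUsers = async (ids: string[]) => ids.map((id) => ({ id, name: `user-${id}` }));

// a fresh Map per request scopes cached entries to that request only
function handleRequest(ids: string[]) {
	const cache = new Map<string, Promise<{ id: string; name: string }>>();
	return Promise.all(ids.map((id) => load(getUsers, cache, id)));
}
```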
**_Self managed cache_**
We explicitly do not handle mutations, so if you wish to retrieve fresh entries, or have a primed
cache, we recommend you manage that yourself. All we require is a `Map`-like object.

Commonly an LRU cache is used; we recommend [`tmp-cache`](https://github.com/lukeed/tmp-cache).
Example
```ts
import LRU from 'tmp-cache';
import { load } from 'dldr/cache';

const loadUser = load.bind(null, getUsers, new LRU(100));
```

## License
MIT © [Marais Rossouw](https://marais.io)