batch-boy
Use batch-boy to batch asynchronous function calls. It is a very lightweight batching and caching utility that lumps together load requests that occur close together in time.
An implementation of Dataloader. (For a major difference between batch-boy and Dataloader, see the section at the end of this document.)
Installation
```
npm install batch-boy --save
```
Usage
```javascript
const Batcher = require('batch-boy');

const batchingFunction = async keys => {
  // `database.query(...)` stands in for whatever your real data source is
  const data = await database.query('SELECT * FROM users WHERE id = ANY($1)', [keys]);
  const dataAsObject = data.reduce((acc, user) => {
    acc[user.id] = user;
    return acc;
  }, {});
  return keys.map(key => dataAsObject[key] || null);
};
const batcher = new Batcher(batchingFunction);

const funcThatNeedsData = async () => {
  //...important logic
  const itemOne = await batcher.load(1);
  //...more logic with itemOne
};
const anotherFuncThatNeedsData = async () => {
  //...stuff
  const itemTwo = await batcher.load(2);
  //...stuff to do with itemTwo
};
// Using the same instance of Batcher to load data into functions that
// occur within the same event loop results in only one
// call to the database.
```
The above is the typical use case for batch-boy.
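To make the batching idea concrete, here is a minimal, self-contained sketch (an illustration only, not batch-boy's actual implementation): calls to `load` made during the same run of the event loop are collected and served by a single call to the batching function.

```javascript
// Illustration only -- a toy version of the batching idea, NOT
// batch-boy's actual implementation.
class TinyBatcher {
  constructor(batchingFunction) {
    this.batchingFunction = batchingFunction;
    this.queue = [];        // pending { key, resolve, reject } jobs
    this.scheduled = false; // is a flush already scheduled?
  }

  load(key) {
    return new Promise((resolve, reject) => {
      this.queue.push({ key, resolve, reject });
      if (!this.scheduled) {
        this.scheduled = true;
        // flush once the current synchronous work has finished
        process.nextTick(() => this.flush());
      }
    });
  }

  async flush() {
    const batch = this.queue;
    this.queue = [];
    this.scheduled = false;
    try {
      const values = await this.batchingFunction(batch.map(job => job.key));
      batch.forEach((job, i) => job.resolve(values[i]));
    } catch (err) {
      batch.forEach(job => job.reject(err));
    }
  }
}

// Two loads in the same tick result in ONE call to the (fake) database.
let dbCalls = 0;
const userBatcher = new TinyBatcher(async keys => {
  dbCalls += 1;
  return keys.map(key => ({ id: key }));
});

Promise.all([userBatcher.load(1), userBatcher.load(2)])
  .then(([a, b]) => console.log(dbCalls, a.id, b.id)); // 1 1 2
```

The key design point: scheduling the flush with `process.nextTick` lets every synchronous caller enqueue its key before the batch is dispatched.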
Note: Any kind of processing or other logic can occur in the batching function. The only requirements are that it accepts an array of keys and returns a promise that resolves to an array of values.

We are using `keys.map` at the return of the function because Batcher assumes that the values returned correspond to the keys passed in. Not only that, but corresponding data for a specific key is never guaranteed, so we must control for that by returning some kind of value for keys that don't have corresponding data. We use `data.reduce` before that to place the data into an object so that `map` can do its thing.

If you want keys that don't get mapped to corresponding data to be refetched the next time `batcher.load` is called, simply assign the key a falsy value. Otherwise, give such a key an object or some other truthy value; then the data can only be refetched with `batcher.reload`.
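The reduce-then-map pattern described above can be sketched as follows; `fetchUsersFromDb` is a hypothetical stand-in for a real data source that may return rows out of order and may omit rows for keys with no data.

```javascript
// Hypothetical data source: returns rows in arbitrary order and has
// no row for key 3.
const fetchUsersFromDb = async keys =>
  [{ id: 2, name: 'Ada' }, { id: 1, name: 'Tim' }]
    .filter(user => keys.includes(user.id));

const batchingFunction = async keys => {
  const data = await fetchUsersFromDb(keys);
  // reduce: index the rows by key so the lookup below is O(1)
  const dataAsObject = data.reduce((acc, row) => {
    acc[row.id] = row;
    return acc;
  }, {});
  // map: return values in the SAME order as the keys passed in,
  // using null (falsy) for keys with no data so they can be
  // refetched the next time batcher.load is called
  return keys.map(key => dataAsObject[key] || null);
};

batchingFunction([1, 2, 3]).then(result => {
  console.log(result);
  // [ { id: 1, name: 'Tim' }, { id: 2, name: 'Ada' }, null ]
});
```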
API
`batcher.load(key: string | number) : Promise<any>`

```javascript
const user = await batcher.load(1);
```

Returns a promise for a value.
`batcher.loadMany(keys: key[]) : Promise<any[]>`

```javascript
const users = await batcher.loadMany([1, 2, 3]);
```

Returns a promise for an array of values.
`batcher.prime(key: string | number, value: any) : Promise<any>`

```javascript
const billy = await batcherByUsername.load('billy');
batcherByUserId.prime(billy.id, billy);
//...
const user2 = await batcherByUserId.load(billy.id); // billy is already there!
```

Primes the cache of the batcher instance with the key and value and returns a promise for that value.
`batcher.getFromCache(key: string | number) : Promise<any> | null`

```javascript
const dataFromCache = await batcher.getFromCache(1);
```

Returns a promise for a value in the batcher's cache. Returns null if a value for the provided key is not found or is falsy. Does not refetch the data.
`batcher.reload(key: string | number) : Promise<any>`

```javascript
const refetchedItem = await batcher.reload(1);
```

Refetches data that is in the cache. Returns a promise for said data.
`batcher.reloadMany(keys: key[]) : Promise<any[]>`

```javascript
const refetchedItems = await batcher.reloadMany([1, 2, 3]);
```

Accepts an array of keys and refetches the data for each. Returns a promise for an array of values.
`batcher.clearCache() : Batcher`

```javascript
batcher.clearCache();
```

Clears the cache of the batcher. Returns the batcher for method chaining.
`batcher.clearKey(key: string | number) : Batcher`

```javascript
batcher.clearKey(1);
```

Clears a specific key from the batcher. Returns the batcher for method chaining.
`batcher.clearKeys(keys: key[]) : Batcher`

```javascript
batcher.clearKeys([1, 2, 3]);
```

Clears multiple keys from the batcher. Returns the batcher for method chaining.
Options
options.ongoingJobsEnableQueueing

```javascript
const batcher = new Batcher(batchingFunction, { ongoingJobsEnableQueueing: false });
```

Defaults to `true`. When set to `false`, the batcher does not wait for the previous job to finish before dispatching the next job. See below for an in-depth explanation.
options.shouldBatch

```javascript
const batcher = new Batcher(batchingFunction, { shouldBatch: false });
```

Defaults to `true`. If `false`, the batcher will NOT batch calls to the load methods. This also disables the `ongoingJobsEnableQueueing` behavior.
options.shouldCache

```javascript
const batcher = new Batcher(batchingFunction, { shouldCache: false });
```

Defaults to `true`. If `false`, the batcher will NOT cache calls to the load methods.
Patterns
It is suggested that `Batcher` be used on a per-request basis, because caching data at the application level can have problematic effects if left unmanaged. However, it is possible to use a single instance of each batcher if `shouldCache` is set to `false` or if only the reload methods are used. This makes it possible to use batch-boy for batching data fetching only, without caching.
The reload methods are something I thought was missing from the Dataloader API: a direct way to refetch data that is already in the cache. They are primarily intended for cases in which it is known that the data has changed or will change:

```javascript
const user = await batcher.load(1);
// ...somewhere else in our code, the underlying record changes:
await db.query('UPDATE ...');
const updatedUser = await batcher.reload(1); // updated user!
```
It is worth noting that this can also be achieved by calling `batcher.clearKey` before a `batcher.load`. But why write extra code when this convenient API has got you covered...
A major difference from Dataloader
While one batch is being processed, by default, requests for another batch on the same batcher will not be run until the previous batch has returned. Note that while a batch is being processed, the batcher queues calls to `batcher.load` that occur within the timeframe of the currently executing process, not just those made during the same event loop. This behavior may sometimes be undesired, so control is given to the developer with `options.ongoingJobsEnableQueueing`. By default this option is `true`, resulting in the behavior described above; setting it to `false` results in an execution pattern similar to Dataloader's.
The library's test suite demonstrates this behavior.