node-async-decorators

0.2.7 • Public • Published





Async decorators for batching, caching, or concurrency control.
Report Bug · Request Feature



Getting Started

npm install -S node-async-decorators

(back to top)

Batchfy

Batching is a pattern that avoids calling the same process again while the first call is still pending. Subsequent calls are added to a pool and are all resolved together when the first call finishes. Each pool has a unique identifier generated from the context of the call, and all calls with the same context (the same pool) join the same batch process.


Example

import { batchfy } from "node-async-decorators";
const sum = (number1: number, number2: number): Promise<number> => {
  return new Promise((resolve) => {
    resolve(number1 + number2);
  });
};
const batchSum = batchfy(sum);

batchSum(3, 5).then((result) => {
  /*8*/
}); // calls the real sum
batchSum(3, 5).then((result) => {
  /*8*/
}); // waits for the first call's result
batchSum(4, 7).then((result) => {
  /*11*/
}); // calls the real sum again because the context is different
// ...
batchSum(3, 5).then((result) => {
  /*8*/
}); // calls the real sum again because the first call has finished

Usage

Use batchfy directly with the default configuration.

import { batchfy, batchfyObject } from "node-async-decorators";

const myBatchedAsyncFunc = batchfy(myAsyncFunc);

batchfyObject(myInstance, "myInstanceAsyncMethod"); //modifies 'myInstance'.

Use batchfy directly applying custom options.
import { batchfy, batchfyObject } from "node-async-decorators";

const myBatchedAsyncFunc = batchfy(myAsyncFunc, options);

batchfyObject(myInstance, "myInstanceAsyncMethod", options); //modifies 'myInstance'.

Use batchfy with a modified default configuration.
import { batchWithDefaultOptions } from "node-async-decorators";

const { batchfy, batchfyObject } = batchWithDefaultOptions(defaultOptions);

const myBatchedAsyncFunc = batchfy(myAsyncFunc, options);

batchfyObject(myInstance, "myInstanceAsyncMethod", options); //modifies 'myInstance'.

This is the default configuration.
import { BatchOptions } from "node-async-decorators";

const defaultOptions: BatchOptions = {
  /**
   * Promises' storage.
   */
  storage: (): BatchStorage => {
    return new LocalBatchStorage();
  },

  /**
   * Handler for errors that cannot be raised to the caller.
   */
  onError: (error: unknown) => {
    console.error(error);
  },

  /**
   * By default, all the parameters passed to the original async function
   * are used to identify a unique request.
   */
  context: (params: BatchInput): Context => {
    return params;
  },

  /**
   * Predefined function to generate the unique request identifier.
   */
  contextKey: (context: Context): Key => {
    return hash(context);
  },
};

(back to top)

Cachefy

Caching is a pattern that avoids calling the same process again until the stored result has expired. Each call has a unique identifier generated from the context of the call, and all calls with the same context receive the result produced by the first call in that context.


Example

import { cachefy } from "node-async-decorators";
const sum = (number1: number, number2: number): Promise<number> => {
  return new Promise((resolve) => {
    resolve(number1 + number2);
  });
};
const cacheSum = cachefy(sum, { ttl: 10000 });

cacheSum(3, 5).then((result) => {
  /*8*/
}); // calls the real sum
cacheSum(3, 5).then((result) => {
  /*8*/
}); // waits for the first call's result
cacheSum(4, 7).then((result) => {
  /*11*/
}); // calls the real sum again because the context is different
// ...
cacheSum(3, 5).then((result) => {
  /*8*/
}); // gets the result from the cache storage

Usage

Use cachefy directly with the default configuration.

import { cachefy, cachefyObject } from "node-async-decorators";

const myCachedAsyncFunc = cachefy(myAsyncFunc, { ttl: 1000 });

cachefyObject(myInstance, "myInstanceAsyncMethod", { ttl: 1000 }); //modifies 'myInstance'.

Use cachefy directly applying custom options.
import { cachefy, cachefyObject } from "node-async-decorators";

const myCachedAsyncFunc = cachefy(myAsyncFunc, { ttl: 1000, ...options });

cachefyObject(myInstance, "myInstanceAsyncMethod", { ttl: 1000, ...options }); //modifies 'myInstance'.

Use cachefy with a modified default configuration.
import { cacheWithDefaultOptions } from "node-async-decorators";

const { cachefy, cachefyObject } = cacheWithDefaultOptions({
  ttl: 1000,
  ...defaultOptions,
});

const myCachedAsyncFunc = cachefy(myAsyncFunc, { ttl: 1000, ...options });

cachefyObject(myInstance, "myInstanceAsyncMethod", { ttl: 1000, ...options }); //modifies 'myInstance'.

This is the default configuration.
import { CacheOptions } from "node-async-decorators";

const defaultOptions: CacheOptions = {
  /**
   * Time in milliseconds until the cached result expires.
   */
  ttl: 1000,

  /**
   * Promises' storage.
   */
  storage: (): CacheStorage => {
    return new LocalCacheStorage();
  },

  /**
   * Handler for errors that cannot be raised to the caller.
   */
  onError: (error: unknown) => {
    console.error(error);
  },

  /**
   * By default, all the parameters passed to the original async function
   * are used to identify a unique request.
   */
  context: (params: CacheInput): Context => {
    return params;
  },

  /**
   * Predefined function to generate the unique request identifier.
   */
  contextKey: (context: Context): Key => {
    return hash(context);
  },
};

Redis

Redis adapter built on https://www.npmjs.com/package/redis version 4. It uses a Redis database to store the cached results. Because Redis is a shared database, by default the RedisCacheStorage object creates a unique space identifier to isolate the decorated function/object. You can supply a custom space id to share cached results between multiple functions/instances.


import { createClient } from "redis";
import {
  cacheWithDefaultOptions,
  RedisCacheStorage,
} from "node-async-decorators";

const redisClient = createClient({
  password: process.env.REDIS_PASSWORD || "",
  socket: {
    host: process.env.REDIS_HOST || "127.0.0.1",
    port: Number(process.env.REDIS_PORT) || 6379,
  },
});

await redisClient.connect();

const { cachefy, cachefyObject } = cacheWithDefaultOptions({
  ttl: 1000,
  storage: () =>
    new RedisCacheStorage({
      redisClient,
      spaceId, // use spaceId property to define a shared space
    }),
});

const myCachedAsyncFunc = cachefy(myAsyncFunc, { ttl: 1000, ...options });

cachefyObject(myInstance, "myInstanceAsyncMethod", { ttl: 1000, ...options }); //modifies 'myInstance'.

await redisClient.disconnect();

(back to top)

Parallelify

Concurrency control is a pattern that limits how many executions of the same process run at the same time. Additional calls are added to a queue and executed in order; the number of calls executed in parallel is defined by a concurrency parameter. Each call has a unique identifier generated from the context of the call, and all calls with the same context are included in the same queue.


Example

import { parallelify } from "node-async-decorators";
const sum = (number1: number, number2: number): Promise<number> => {
  return new Promise((resolve) => {
    resolve(number1 + number2);
  });
};
const parallelSum = parallelify(sum, { concurrency: 1 });

parallelSum(3, 5).then((result) => {
  /*8*/
}); // calls the real sum
parallelSum(3, 5).then((result) => {
  /*8*/
}); // calls the real sum when the first call has finished
parallelSum(4, 7).then((result) => {
  /*11*/
}); // calls the real sum again because the context is different
// ...
parallelSum(3, 5).then((result) => {
  /*8*/
}); // calls the real sum when the rest of the (3, 5) calls have finished
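For the special case of concurrency 1, the queueing behaviour above can be sketched in plain TypeScript: each context keeps a queue tail, and every new call is chained after the previous one. This is an illustration only, not the library's implementation (`serializePerContext`, `task`, and `serialTask` are hypothetical names, and a real concurrency-N limiter needs an actual task queue).

```typescript
// Minimal sketch of concurrency control with concurrency 1: calls with the
// same context key run strictly one after another, in arrival order.
type AsyncFn<A extends unknown[], R> = (...args: A) => Promise<R>;

function serializePerContext<A extends unknown[], R>(fn: AsyncFn<A, R>): AsyncFn<A, R> {
  const tails = new Map<string, Promise<unknown>>(); // context key -> queue tail
  return (...args: A) => {
    const key = JSON.stringify(args); // stand-in for the library's context hash
    const tail = tails.get(key) ?? Promise.resolve();
    const run = tail.then(() => fn(...args)); // start after the previous call settles
    tails.set(key, run.catch(() => undefined)); // keep the queue alive on errors
    return run;
  };
}

// Usage: two overlapping calls with the same context execute in series.
const order: string[] = [];
const task = (label: string): Promise<string> =>
  new Promise((resolve) => {
    order.push(`start:${label}`);
    setTimeout(() => {
      order.push(`end:${label}`);
      resolve(label);
    }, 10);
  });
const serialTask = serializePerContext(task);
```

Chaining on a caught copy of the tail means one rejected call does not poison the queue for later calls with the same context.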

Usage

Use parallelify directly with the default configuration.

import { parallelify, parallelifyObject } from "node-async-decorators";

const myParallelAsyncFunc = parallelify(myAsyncFunc, { concurrency: 1 });

parallelifyObject(myInstance, "myInstanceAsyncMethod", { concurrency: 1 }); //modifies 'myInstance'.

Use parallelify directly applying custom options.
import { parallelify, parallelifyObject } from "node-async-decorators";

const myParallelAsyncFunc = parallelify(myAsyncFunc, {
  concurrency: 1,
  ...options,
});

parallelifyObject(myInstance, "myInstanceAsyncMethod", {
  concurrency: 1,
  ...options,
}); //modifies 'myInstance'.

Use parallelify with a modified default configuration.
import { parallelWithDefaultOptions } from "node-async-decorators";

const { parallelify, parallelifyObject } = parallelWithDefaultOptions({
  concurrency: 1,
  ...defaultOptions,
});

const myParallelAsyncFunc = parallelify(myAsyncFunc, {
  concurrency: 1,
  ...options,
});

parallelifyObject(myInstance, "myInstanceAsyncMethod", {
  concurrency: 1,
  ...options,
}); //modifies 'myInstance'.

This is the default configuration.
import { ParallelOptions } from "node-async-decorators";

const defaultOptions: ParallelOptions = {
  /**
   * Number of parallel executions for the same context
   */
  concurrency: 1,

  /**
   * Promises' storage.
   */
  storage: (): TaskQueueRunnerStorage => {
    return new LocalTaskQueueRunnerStorage();
  },

  /**
   * Handler for errors that cannot be raised to the caller.
   */
  onError: (error: unknown) => {
    console.error(error);
  },

  /**
   * By default, all the parameters passed to the original async function
   * are used to identify a unique request.
   */
  context: (params: ParallelInput): Context => {
    return params;
  },

  /**
   * Predefined function to generate the unique request identifier.
   */
  contextKey: (context: Context): Key => {
    return hash(context);
  },
};

(back to top)

Utils

Some utilities derived from this library's main functionality.

Execute once

Executes the async function once per context. The context is given explicitly as an array; the parameters of the target sum function do not count toward the context.

import { buildOnce } from "node-async-decorators";

const sum = (number1: number, number2: number): Promise<number> => {
  return new Promise((resolve) => {
    resolve(number1 + number2);
  });
};

const once = buildOnce();

once(() => sum(2, 5), ["context-1"]).then((result) => {
  /*7*/
}); // calls the real sum

once(() => sum(2, 5), ["context-1"]).then((result) => {
  /*7*/
}); // returns the same result for the context ["context-1"] without calling the real sum again

once(() => sum(2, 5), ["context-2"]).then((result) => {
  /*7*/
}); // calls the real sum again because the context ["context-2"] is different

Execute in parallel

Executes an array of async tasks/functions. The number of tasks executed in parallel is defined by a concurrency parameter. The result is an array containing all the results in the same order as the tasks/functions, similar to a Promise.all result.

import { executeInParallel } from "node-async-decorators";

const sum = (number1: number, number2: number): Promise<number> => {
  return new Promise((resolve) => {
    resolve(number1 + number2);
  });
};

const concurrency = 2;

const tasks = [
  () => sum(1, 2), //First execution. Running tasks 1
  () => sum(2, 3), //Second execution. Running tasks 2
  () => sum(3, 4), //Third execution. Running tasks 2 (one of the last executions has finished)
  () => sum(4, 5), //Fourth execution. Running tasks 2 (one of the last executions has finished)
  () => sum(5, 6), //Fifth execution. Running tasks 2 (one of the last executions has finished)
];

const results = await executeInParallel(tasks, concurrency); // [3,5,7,9,11]

Execute in batch

Executes an array of async tasks/functions. The array is split into sub-arrays whose size is defined by a parameter. All the tasks inside each sub-array are executed together, and the next batch starts when the previous one has finished. The result is an array containing all the results in the same order as the tasks/functions, similar to a Promise.all result.

import { executeInBatch } from "node-async-decorators";

const sum = (number1: number, number2: number): Promise<number> => {
  return new Promise((resolve) => {
    resolve(number1 + number2);
  });
};

const batchSize = 2;

const tasks = [
  () => sum(1, 2), //First execution
  () => sum(2, 3), //First execution
  () => sum(3, 4), //Second execution (First batch has finished)
  () => sum(4, 5), //Second execution (First batch has finished)
  () => sum(5, 6), //Third execution (Second batch has finished)
];

const results = await executeInBatch(tasks, batchSize); // [3,5,7,9,11]
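The chunking behaviour can be sketched in a few lines of plain TypeScript. This is a minimal illustration, not the library's implementation; `runInBatches` is a hypothetical name.

```typescript
// Minimal sketch of batch execution: split the task list into chunks of
// `size`, run each chunk together with Promise.all, and concatenate the
// results in the original task order.
async function runInBatches<R>(
  tasks: Array<() => Promise<R>>,
  size: number
): Promise<R[]> {
  const results: R[] = [];
  for (let i = 0; i < tasks.length; i += size) {
    const chunk = tasks.slice(i, i + size); // next sub-array of tasks
    results.push(...(await Promise.all(chunk.map((task) => task())))); // run together
  }
  return results; // same order as the input tasks
}
```

Because each chunk is awaited before the next one starts, at most `size` tasks are ever running at once, though a single slow task holds up its whole batch.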

(back to top)

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!

  1. Fork the Project
  2. Create your Feature Branch (git checkout -b feature/AmazingFeature)
  3. Commit your Changes (git commit -m 'Add some AmazingFeature')
  4. Push to the Branch (git push origin feature/AmazingFeature)
  5. Open a Pull Request

(back to top)

License

Distributed under the MIT License. See LICENSE.txt for more information.

(back to top)


Collaborators

  • asier_lopez