Concurrently process batch records with partial failure support.
> [!WARNING]
> This is an ESM-only package. Before installing, make sure that your project's configuration supports ECMAScript modules.
```sh
pnpm add @driimus/lambda-batch-processor
```
For types to work as expected, `@types/aws-lambda` must be installed:

```sh
pnpm add --save-dev @types/aws-lambda
```
> [!WARNING]
> `ReportBatchItemFailures` must be enabled to allow retrying failed messages.
```ts
import { SQSBatchProcessor } from '@driimus/lambda-batch-processor';

const processor = new SQSBatchProcessor(async (record) => {
  /** do stuff */
});

export const handler = processor.process;
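For context on what the processor produces: when `ReportBatchItemFailures` is enabled, Lambda expects the handler to return a partial-failure report listing the identifiers of the records that failed, and only those records are retried. A minimal sketch of that AWS-defined response shape (this illustrates the Lambda contract, not this package's internals; `toBatchResponse` is a hypothetical helper):

```ts
// Shape of the partial-failure response Lambda expects from an SQS-triggered
// handler when ReportBatchItemFailures is enabled (defined by AWS, matching
// the SQSBatchResponse type in @types/aws-lambda).
interface SQSBatchResponse {
  batchItemFailures: { itemIdentifier: string }[];
}

// Hypothetical helper: report the message IDs of failed records so that
// only those messages become visible on the queue again for retry.
function toBatchResponse(failedMessageIds: string[]): SQSBatchResponse {
  return {
    batchItemFailures: failedMessageIds.map((itemIdentifier) => ({ itemIdentifier })),
  };
}
```

Returning an empty `batchItemFailures` array tells Lambda the whole batch succeeded; returning every message ID fails the whole batch.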
Supported event sources:

- DynamoDB Streams

  ```ts
  import { DynamoDBBatchProcessor } from '@driimus/lambda-batch-processor';
  ```

- Kinesis Data Streams

  ```ts
  import { KinesisBatchProcessor } from '@driimus/lambda-batch-processor';
  ```

- SQS

  ```ts
  import { SQSBatchProcessor } from '@driimus/lambda-batch-processor';
  ```
Exceptions that occur during batch processing can be treated as permanent failures. This feature is inspired by AWS Lambda Powertools for Java, with one key difference: by default, messages that trigger permanent failures will not be reported. In the case of SQS messages, the result will be their deletion from the queue.

To send such SQS messages to a dead-letter queue instead, you can use `@driimus/sqs-permanent-failure-dlq`.
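To illustrate the distinction between permanent and retryable failures, here is a sketch of a record handler that classifies errors. The `PermanentFailure` class name below is hypothetical; check the package's exports for its actual permanent-failure type:

```ts
// Hypothetical error class for illustration only - the package may expose
// its own type for marking an error as non-retryable.
class PermanentFailure extends Error {}

async function handleRecord(body: string): Promise<void> {
  let payload: unknown;
  try {
    payload = JSON.parse(body);
  } catch (err) {
    // A malformed message will never succeed no matter how many times it is
    // redelivered, so surface it as a permanent failure instead of letting
    // it be reported (and retried) as a transient one.
    throw new PermanentFailure(`unparseable message: ${(err as Error).message}`);
  }
  // ... process payload; transient errors (timeouts, throttling) are thrown
  // as-is so the record is reported and retried ...
  void payload;
}
```

With this split, transient errors flow into the batch's failure report for retry, while permanent ones are handled according to the behaviour described above (deletion, or DLQ redrive via `@driimus/sqs-permanent-failure-dlq`).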
Logging can be enabled by providing a logger compatible with the `Logger` interface, which is modelled after pino's function signatures.

> [!NOTE]
> The provided logger should support serialising `AggregateError` objects.
Don't need special handling for non-retryable errors? Make sure to check out the batch processing implementation in AWS Lambda Powertools for TypeScript, which added ESM support in v2.