Extend and supercharge your DynamoDB DocumentClient with promises, retries, and more.
API Documentation (minus the callbacks, of course)
## Installation

```
npm i dynamo-plus
```
```javascript
const { DynamoPlus } = require('dynamo-plus')

const documentClient = DynamoPlus({
  region: 'eu-west-1',
})

const regularDynamoParams = {
  TableName: 'myTable',
  Key: {
    myKey: '1337'
  }
}

const data = await documentClient.get(regularDynamoParams)
```
## Features

- automatically appends `.promise()`
- automatically enables HTTP keep-alive
- improves stack traces to help identify user errors
- retries and backs off when you get throttled
- new method for performing `batchGet` requests in chunks
- new methods for performing `batchWrite` requests in chunks
- new methods for `query` operations
- new methods for `scan` operations
## Promises by default

The DynamoPlus client automatically appends `.promise()` for you, making all methods `await`able by default.

When the client is instantiated, the original methods are prefixed and remain accessible as e.g. `original_${method}`.
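The wrapping can be pictured roughly like the sketch below. This is illustrative only, not the library's actual internals; `promisifyMethod` and the mock client are our own names.

```javascript
// Illustrative sketch: keep the original callback-style method under an
// `original_` prefix and replace it with a version that returns the
// promise directly.
function promisifyMethod (client, method) {
  client[`original_${method}`] = client[method].bind(client)
  client[method] = (params) => client[`original_${method}`](params).promise()
}

// A stand-in for AWS.DynamoDB.DocumentClient.get(), which returns a
// request object exposing .promise()
const mockClient = {
  get: (params) => ({
    promise: () => Promise.resolve({ Item: { myKey: params.Key.myKey } })
  })
}

promisifyMethod(mockClient, 'get')
```

After this, `await mockClient.get(params)` resolves with the data directly, while `mockClient.original_get(params).promise()` still behaves like the untouched SDK method.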
## HTTP keep-alive by default

Setting up TCP connections is slow and costly, especially when you need to perform multiple operations in a row. Enabling HTTP keep-alive can reduce latency by up to 70%, but the v2 SDK doesn't enable it by default. DynamoPlus takes care of the boilerplate for you.
## Retries and backoff

Whenever an operation fails with a retryable error such as `LimitExceededException`, it is transparently retried behind the scenes so that you don't have to worry about it. For information about retryable exceptions, see https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Programming.Errors.html#Programming.Errors.MessagesAndCodes

If you want to apply a delay from the first retry, set `lastBackOff` to a millisecond value in the query params.
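Conceptually, the retry loop looks something like the sketch below. This is illustrative only; the actual delays, retry limits, and error handling DynamoPlus uses are not specified here.

```javascript
// Illustrative exponential backoff: each retry doubles the previous delay,
// starting from an initial value when no lastBackOff is given.
function nextBackOff (lastBackOff, initial = 50) {
  return lastBackOff ? lastBackOff * 2 : initial
}

const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms))

// Retry a failing async operation with growing delays, giving up after
// maxRetries attempts or on a non-retryable error.
async function withRetries (operation, params = {}, maxRetries = 5) {
  let lastBackOff = params.lastBackOff || 0
  for (let attempt = 0; ; attempt++) {
    try {
      return await operation(params)
    } catch (err) {
      if (!err.retryable || attempt >= maxRetries) throw err
      lastBackOff = nextBackOff(lastBackOff)
      await delay(lastBackOff)
    }
  }
}
```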
## New method for performing batchGet requests in chunks

The built-in `batchGet` method can be used for fetching multiple documents by their primary keys, but it requires you to handle chunking and unprocessed items yourself. DynamoPlus adds a `getAll` method that does the heavy lifting for you.
### getAll(params)

It's like `batchGet`, but with the simple syntax of `get`.

- params (Object)
  - TableName
  - Keys - An array of key objects equivalent to `Key` in `get()`.
  - BatchSize - Optional custom batch size. Defaults to 100, which is the maximum value permitted by DynamoDB.

getAll() resolves with all of the requested documents once every batch has completed.
```javascript
const params = {
  TableName: 'users',
  Keys: [{ userId: '1' }, { userId: '2' }, /* ... */ { userId: '999' }]
}

const response = await documentClient.getAll(params)
// response now contains ALL documents, not just the first 100
```
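The chunking itself boils down to slicing the key array into groups of at most `BatchSize`. A simplified sketch (the real method also merges the responses and retries unprocessed keys):

```javascript
// Split an array into chunks of at most `size` elements. DynamoDB caps
// batchGet at 100 keys per request, hence the default.
function chunk (array, size = 100) {
  const chunks = []
  for (let i = 0; i < array.length; i += size) {
    chunks.push(array.slice(i, i + size))
  }
  return chunks
}
```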
New methods for performing batchWrite requests in chunks
batchWrite is neat for inserting multiple documents at once, but it requires you to handle chunking and unprocessed items yourself, while also using it's own somewhat unique syntax. We've added deleteAll() and putAll() to do the heavy lifting for you.
### deleteAll(params)

`batchWrite` deletions, but with the simple syntax of `delete`.

- params (Object)
  - TableName
  - Keys - An array of key objects equivalent to `Key` in `delete()`.
  - BatchSize - Optional custom batch size. Defaults to 25, which is the maximum value permitted by DynamoDB.

deleteAll() does not return any data once it resolves.
```javascript
const params = {
  TableName: 'Woop woop!',
  Keys: [{ userId: '123' }, { userId: 'abc' }]
}

await documentClient.deleteAll(params)
```
### putAll(params)

`batchWrite` upserts, but with the simple syntax of `put`.

- params (Object)
  - TableName
  - Items - An array of documents equivalent to `Item` in `put()`.
  - BatchSize - Optional custom batch size. Defaults to 25, which is the maximum value permitted by DynamoDB.

putAll() does not return any data once it resolves.
```javascript
const params = {
  TableName: 'Woop woop!',
  Items: [{ a: 'b' }, { c: 'd' }]
}

await documentClient.putAll(params)
```
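The "somewhat unique syntax" that putAll hides is batchWrite's nested RequestItems structure. Shaping one chunk of documents looks roughly like this (a sketch of the translation; `toPutRequests` is our illustrative helper, not a DynamoPlus API):

```javascript
// Translate a chunk of plain documents into the RequestItems structure
// that batchWrite() expects for put operations.
function toPutRequests (tableName, items) {
  return {
    RequestItems: {
      [tableName]: items.map((item) => ({ PutRequest: { Item: item } }))
    }
  }
}
```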
## New methods for query()

query() has new sibling methods that automatically paginate through result sets for you.
### queryAll(params)

Resolves with the entire array of matching items.

```javascript
const params = {
  TableName: 'items',
  IndexName: 'articleNo-index',
  KeyConditionExpression: 'articleNo = :val',
  ExpressionAttributeValues: { ':val': articleNo }
}

const response = await documentClient.queryAll(params)
// response now contains ALL items with the articleNo, not just the first 1MB
```
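Under the hood, pagination means feeding each response's LastEvaluatedKey back in as ExclusiveStartKey until none is returned. A simplified sketch of what queryAll automates, where `query` stands in for the underlying SDK call:

```javascript
// Keep querying until DynamoDB stops returning a LastEvaluatedKey,
// collecting the items from every page along the way.
async function queryAllPages (query, params) {
  const items = []
  let ExclusiveStartKey
  do {
    const page = await query({ ...params, ExclusiveStartKey })
    items.push(...page.Items)
    ExclusiveStartKey = page.LastEvaluatedKey
  } while (ExclusiveStartKey)
  return items
}
```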
### queryStream(params)

Like scanStream, but for queries.

### queryStreamSync(params)

Like scanStreamSync, but for queries.
## New methods for scan()

We've supercharged scan() for those times when you want to recurse through entire tables.
### scanAll(params)

Resolves with the entire array of matching items.

```javascript
// `Year` is a DynamoDB reserved word, so it has to be aliased
// via ExpressionAttributeNames.
const params = {
  TableName: 'MyTable',
  FilterExpression: '#yr = :this_year',
  ExpressionAttributeNames: { '#yr': 'Year' },
  ExpressionAttributeValues: { ':this_year': 2015 }
}

const response = await documentClient.scanAll(params)
// response now contains ALL documents from 2015, not just the first 1MB
```
### scanStream(params[, parallelScans])

An EventEmitter-driven approach to recursing your tables. This is a powerful tool when you have datasets that are too large to keep in memory all at once.

To spread the workload across your table partitions, you can define a number of `parallelScans`. DynamoPlus will automatically keep track of the queries and emit a single `done` event once they all complete.

Note: scanStream() does not care whether your event listeners finish before it requests the next batch. (It will, however, respect throttling exceptions from DynamoDB.) If you want to control the pace, see scanStreamSync().

- params - AWS.DynamoDB.DocumentClient.scan() parameters
- parallelScans - integer|array (Default: 1) - The number of segments to split the scan operation into. Also accepts an array of individual segment options, such as LastEvaluatedKey, in which case the array's length determines the number of segments.
The returned EventEmitter emits the following events:

- data - Raw response from each scan
- items - An array with documents
- done - Emitted once there are no more documents to scan
- error - Emitted on scan errors
```javascript
const params = {
  TableName: 'MyTable'
}

const emitter = documentClient.scanStream(params)

emitter.on('items', async (items) => {
  console.log(items)
})
```
### scanStreamSync(params[, parallelScans])

Like scanStream(), but will not proceed to request the next batch until all event listeners have returned a value (or resolved, if they return a Promise).

- params - AWS.DynamoDB.DocumentClient.scan() parameters
- parallelScans - integer|array (Default: 1) - The number of segments to split the scan operation into. Also accepts an array of individual segment options, such as LastEvaluatedKey, in which case the array's length determines the number of segments.

The returned EventEmitter emits the following events:

- data - Raw response from each scan
- items - An array with documents
- done - Emitted once there are no more documents to scan
- error - Emitted on scan errors
```javascript
const params = {
  TableName: 'MyTable'
}

const emitter = documentClient.scanStreamSync(params)

emitter.on('items', async (items) => {
  // Do something async with the documents
  return Promise.all(items.map((item) => sendItemToSantaClaus(item)))
  // Once the Promise.all resolves, scanStreamSync() will automatically request the next batch.
})
```
## FAQ

**I'm getting errors that the `aws-sdk` module isn't installed.**

`aws-sdk` is set as a dev-dependency since it is pretty large and installed by default on AWS Lambda.
**I need to use the regular client methods for some edge case.**

They are all available with an `original_` prefix:

```javascript
const { DynamoPlus } = require('dynamo-plus')
const documentClient = DynamoPlus()

documentClient.original_get(myParams, (err, data) => {})
// or
documentClient.original_get(myParams).promise()
```

Automatic retries don't apply when calling the original methods directly.
**None of these questions seem to be questions.**

That's a statement, but I see your point.