@proak/dynamodb-stream-elasticsearch


             _                                   _  _     
          __| | _  _  _ _   __ _  _ __   ___  __| || |__  
         / _` || || || ' \ / _` || '  \ / _ \/ _` || '_ \
         \__,_| \_, ||_||_|\__,_||_|_|_|\___/\__,_||_.__/
                |__/  _                                                     
                  ___| |_  _ _  ___  __ _  _ __  
                 (_-<|  _|| '_|/ -_)/ _` || '  \ 
                 /__/ \__||_|  \___|\__,_||_|_|_|
           _            _    _                              _    
      ___ | | __ _  ___| |_ (_) __  ___ ___  __ _  _ _  __ | |_  
     / -_)| |/ _` |(_-<|  _|| |/ _|(_-</ -_)/ _` || '_|/ _|| ' \ 
     \___||_|\__,_|/__/ \__||_|\__|/__/\___|\__,_||_|  \__||_||_|
                                                             
                                                                            

DynamoDB --> Stream --> ElasticSearch

The missing blueprint for an AWS Lambda function that reads a stream from AWS DynamoDB and writes it to ElasticSearch. Compatible with Node 8.10. (If for some reason you need to use it with Node 6.10, use version 1.0.0 of this module.)

Whenever data is changed (modified, removed, or inserted) in DynamoDB, an AWS Lambda function can capture that change and update the ElasticSearch instance immediately. Further reading:

Indexing Amazon DynamoDB Content with Amazon Elasticsearch Service Using AWS Lambda

Getting Started

Install:

npm i @proak/dynamodb-stream-elasticsearch

Use it in your lambda:

const { pushStream } = require('@proak/dynamodb-stream-elasticsearch');

const { ES_ENDPOINT, INDEX, TYPE } = process.env;

function myHandler(event, context, callback) {
  console.log('Received event:', JSON.stringify(event, null, 2));
  pushStream({ event, endpoint: ES_ENDPOINT, index: INDEX, type: TYPE })
    .then(() => {
      callback(null, `Successfully processed ${event.Records.length} records.`);
    })
    .catch((e) => {
      callback(`Error ${e}`, null);
    });
}

exports.handler = myHandler;
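
The handler above reads its configuration from Lambda environment variables. A minimal sketch of the values to set (the endpoint, index, and type below are hypothetical placeholders, not defaults):

ES_ENDPOINT=https://search-my-domain.eu-west-1.es.amazonaws.com
INDEX=orders
TYPE=order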

Upload the Lambda to AWS and star this repository if it works as expected!

Parameters

Param      Description                                                                          Required
event      Event object generated by the stream (pass it as-is and do not modify it)            required
endpoint   Exact URL of the ElasticSearch instance (string); works with AWS ES and standard ES  required
index      Name of the ElasticSearch index (string); defaults to the DynamoDB table name        optional
type       Type of the ElasticSearch document (string); defaults to the DynamoDB table name     optional
refresh    Force ElasticSearch to refresh its index immediately (boolean); default: true        optional
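
As a sketch of how these parameters fit together, the handler below passes an explicit index, type, and refresh flag; the endpoint and the index/type names are hypothetical placeholders:

const { pushStream } = require('@proak/dynamodb-stream-elasticsearch');

exports.handler = (event, context, callback) => {
  pushStream({
    event,                                                  // stream event, passed through unmodified
    endpoint: 'https://search-my-domain.es.amazonaws.com',  // hypothetical ElasticSearch endpoint
    index: 'orders',                                        // optional; defaults to the DynamoDB table name
    type: 'order',                                          // optional; defaults to the DynamoDB table name
    refresh: false                                          // optional; skip the immediate index refresh
  })
    .then(() => callback(null, `Processed ${event.Records.length} records.`))
    .catch((e) => callback(`Error ${e}`, null));
};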

Contributing and running the tests

To run the tests locally you need a running ElasticSearch Docker container. Simply type:

docker run -i -p 9200:9200 --name my_elastic -p 9300:9300 -e "discovery.type=single-node" elasticsearch
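
Once the container is up, run the test suite; assuming the package uses the standard npm test script (check package.json), that is:

npm test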

If you want to commit changes, make sure you follow these rules:

  1. All code changes should come with a proper integration test;
  2. Code should follow the JavaScript Standard Style guidelines;
  3. Commit messages should be written according to this article.

TODO

  • Introduce Continuous Integration;
  • Use the ElasticSearch bulk operation instead of single index calls for multiple records;

Authors & Contributors

License

This project is licensed under the MIT License - see the LICENSE.md file for details

Donate

If you find this project useful and would like to support the author in maintaining it, you can make a donation via this link:

Donate via Paypal
