elasticsearch-batch-stream

1.1.5 • Public • Published

A write stream that creates batches of elasticsearch bulk operations.

Example

The Elasticsearch client library has a bulk function to write multiple documents in one request, but since a stream emits a write for each document, we cannot group multiple operations together.

This package wraps the bulk function in a write stream that buffers the operations and passes them on as batches to the bulk function. For example, we can now create batches of 500 docs each and reduce the number of API calls to Elasticsearch from 100,000 to 200, which will improve speed.

  const through2 = require('through2')

  const docTransformStream = through2.obj(function (chunk, enc, callback) {
    // convert chunk => doc
    const doc = { index: 'myindex', type: 'mytype', id: '12345', action: 'index', doc: { name: 'test' } }
    callback(null, doc)
  })
 
  sourceReadStream().pipe(docTransformStream).pipe(bulkWriteStream({ client, size: 500 }))

Installation

$ npm install elasticsearch-batch-stream

API

bulkWriteStream(options = { client, size })

Creates the write stream to Elasticsearch.

options

The options object argument is required and must at least include the Elasticsearch client object.

client

An instance of the Elasticsearch client, e.g. new elasticsearch.Client()

size

Number of stream operations to group together in the bulk command (default = 100).
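Each batch of doc objects ultimately has to be translated into the flat body array that the legacy elasticsearch client's bulk method expects, where each operation is an action/metadata entry optionally followed by the source document. A hypothetical helper sketching that mapping (the field names follow the doc objects in the example above; the package's actual internal mapping may differ):

```javascript
// Illustrative only: convert a batch of { index, type, id, action, doc }
// objects into the body array for client.bulk({ body }).
function toBulkBody (batch) {
  const body = []
  for (const op of batch) {
    // action/metadata line, e.g. { index: { _index, _type, _id } }
    body.push({ [op.action]: { _index: op.index, _type: op.type, _id: op.id } })
    // delete operations carry no source document
    if (op.action !== 'delete') body.push(op.doc)
  }
  return body
}

toBulkBody([{ index: 'myindex', type: 'mytype', id: '12345', action: 'index', doc: { name: 'test' } }])
// => [{ index: { _index: 'myindex', _type: 'mytype', _id: '12345' } }, { name: 'test' }]
```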

Maintainers

Osmond van Hemert

Contributing

If you would like to help out with some code, check the details.

Not a coder, but still want to support? Have a look at the options available to donate.

License

Licensed under MIT.
