Support this package
Please support this package by starring it on GitHub.
Stream To Mongo DB
stream-to-mongo-db allows you to stream objects directly into a MongoDB database, using a read stream (an S3 file, a local file, a Web API, or even another MongoDB database). The best thing about this package is that it allows you to control the size of the batch before issuing a write to MongoDB - see CONFIG.
SUPPORTED NODE VERSIONS
This package supports Node.js versions 8+. If you require another version to be supported, please raise an issue.
USAGE
```
npm i stream-to-mongo-db
```
EXAMPLES
Example 1: Stream from another MongoDB database
Example 1.1: Using MongoDB Client

```javascript
const MongoClient = require('mongodb').MongoClient;
const streamToMongoDB = require('stream-to-mongo-db').streamToMongoDB;

// where the data will come from
const inputDBConfig  = { dbURL: 'mongodb://localhost:27017/yourInputDBHere', collection: 'yourCollectionHere' };

// where the data will end up
const outputDBConfig = { dbURL: 'mongodb://localhost:27017/streamToMongoDB', collection: 'devTestOutput' };

MongoClient.connect(inputDBConfig.dbURL, (error, db) => {
    if (error) { throw error; }

    // create the writable stream
    const writableStream = streamToMongoDB(outputDBConfig);

    // create the readable stream and consume it
    const stream = db.collection(inputDBConfig.collection).find().stream();

    stream.pipe(writableStream);

    stream.on('end', () => {
        console.log('done!');
        db.close();
    });
});
```
Example 1.2: Using Mongoose

```javascript
const streamToMongoDB = require('stream-to-mongo-db').streamToMongoDB;
const mongoose = require('mongoose');

// where the data will come from
const connection = mongoose.connect('mongodb://localhost:27017/yourInputDBHere');
const MyModel = mongoose.model('ModelName', mySchema); // your existing model/schema

// where the data will end up
const outputDBConfig = { dbURL: 'mongodb://localhost:27017/streamToMongoDB', collection: 'devTestOutput' };

// create the writable stream
const writableStream = streamToMongoDB(outputDBConfig);

// create the readable stream and consume it
const stream = MyModel.find().lean().stream();

stream.pipe(writableStream);

stream.on('end', () => {
    console.log('done!');
    connection.close();
});
```
This example gets even more powerful when you want to transform the input data before writing it to the writableStream:
```javascript
// ... same setup as Example 1.2 above

// create the readable stream and transform the data before writing it
const stream = MyModel.find().lean().stream({
    transform: (doc) => {
        // do whatever you like to the doc
        doc.addedField = 'transformed before the write';
        return doc;
    }
});

stream.pipe(writableStream);

stream.on('end', () => {
    console.log('done!');
    connection.close();
});
```
Example 2: Stream from an S3 file using AWS-SDK

```javascript
const streamToMongoDB = require('stream-to-mongo-db').streamToMongoDB;
const AWS = require('aws-sdk');
const JSONStream = require('JSONStream');

const s3 = new AWS.S3();
const params = { Bucket: 'myBucket', Key: 'myJsonData.json' };

// where the data will end up
const outputDBConfig = { dbURL: 'mongodb://localhost:27017/streamToMongoDB', collection: 'devTestOutput' };

// create the writable stream
const writableStream = streamToMongoDB(outputDBConfig);

// create the readable stream and consume it
s3.getObject(params)
  .createReadStream()
  .pipe(JSONStream.parse('*'))
  .pipe(writableStream);
```
Example 3: Stream from a Web API
```javascript
const streamToMongoDB = require('stream-to-mongo-db').streamToMongoDB;
const request = require('request');
const JSONStream = require('JSONStream');

// where the data will end up
const outputDBConfig = { dbURL: 'mongodb://localhost:27017/streamToMongoDB', collection: 'devTestOutput' };

// create the writable stream
const writableStream = streamToMongoDB(outputDBConfig);

// create the readable stream and consume it
request('https://www.pathToYourApi.com/endPoint')
  .pipe(JSONStream.parse('*'))
  .pipe(writableStream);
```
Example 4: Stream from a local file
```javascript
const streamToMongoDB = require('stream-to-mongo-db').streamToMongoDB;
const JSONStream = require('JSONStream');
const fs = require('fs');

// where the data will end up
const outputDBConfig = { dbURL: 'mongodb://localhost:27017/streamToMongoDB', collection: 'devTestOutput' };

// create the writable stream
const writableStream = streamToMongoDB(outputDBConfig);

// create the readable stream and consume it
fs.createReadStream('./myJsonData.json')
  .pipe(JSONStream.parse('*'))
  .pipe(writableStream);
```
CONFIG
- dbURL [ REQUIRED - String ]
  The URL to your db (including the db name), eg: mongodb://localhost:27017/streamToMongoDB
- collection [ REQUIRED - String ]
  The collection to stream to, eg: myCollection
- batchSize [ OPTIONAL - Integer - default: 1 ]
  The number of documents consumed from the read stream before writing to MongoDB.
  This option defaults to 1, i.e. every object is written to MongoDB individually as it is received. The default is ideal if you want to ensure every object is written as soon as possible, with no risk of losing objects should the MongoDB connection be interrupted. In most cases, however, this is unnecessary, since writing every object individually incurs an additional I/O cost. Setting this option to, say, 100 batches the writes in groups of 100, allowing you to consume the stream much faster, eg: 100
- insertOptions [ OPTIONAL - Object - default: { w : 1 } ]
  MongoDB insert options.
  This option defaults to { w : 1 }, i.e. each write requests acknowledgement that the operation has propagated to the standalone mongod or to the primary in a replica set.
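As a sketch, here is a config combining the options above; the w: 'majority' write concern is just an illustrative value (it waits for acknowledgement from a majority of replica-set members rather than only the primary), not a recommendation:

```javascript
// Config sketch: insertOptions is passed through to the MongoDB driver's
// insert call. w: 'majority' is an illustrative write concern, replacing
// the default { w: 1 }.
const outputDBConfig = {
    dbURL        : 'mongodb://localhost:27017/streamToMongoDB',
    collection   : 'devTestOutput',
    batchSize    : 100,
    insertOptions: { w: 'majority' }
};

console.log(outputDBConfig.insertOptions); // { w: 'majority' }
```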
CONTRIBUTION
Please feel free to fork, pull request, discuss, share your ideas and raise issues. Any feedback is welcome!
ACKNOWLEDGEMENTS
Inspired by stream-to-mongo