Stream data in and out of MySQL.
Linux pipes and Node.js streams are awesome for working with large amounts of data and especially for ETL (Extract, Transform, Load).
If you need to extract data from or insert data into MySQL, this should do the trick. It's super easy to write transformations in Node.js with transform streams.
Unstable. Not tested. We will stay on version 0.0.x till things are at least beta quality.
```
npm i -g mysqlio
```

```
mio-query SQL > some.csv
mio-insert TABLE [DROP_CREATE] [INDEXES] < some.csv
```
For extended help, type the command name with no arguments.
mio-query will run the specified SQL query and output the results to STDOUT. SQL may be a path to a file containing SQL, or an actual SQL string like 'select Id from Account'.
mio-insert will insert CSV data from STDIN into the specified table. The column names in the CSV must exactly match the table's field names.
DROP_CREATE is an optional path to a file of DDL statements to drop and recreate the table. If specified, this is run before the data is inserted.

INDEXES is an optional path to a file of DDL statements to create indexes. If specified, this is run after the data is inserted, since it is generally faster to insert the data first and then create the indexes.
mysqlio looks for a file named
mysqlio-config.js in the current directory.
There are lots more options than are shown below. See https://github.com/mysqljs/mysql#connection-options for details.
We have both source and dest because you often want to stream out of one DB and into another.
```js
module.exports = {
  source: {
    host: 'localhost',
    port: 3306,
    user: 'root',
    password: 'password',
    database: 'mysqlio',
    dateStrings: true,
    supportBigNumbers: true
  },
  dest: {
    host: 'localhost',
    port: 3306,
    user: 'root',
    password: 'password',
    database: 'mysqlio',
    dateStrings: true,
    supportBigNumbers: true
  }
};
```