# row-mapper

Efficiently move/map rows between PostgreSQL database tables.
## Install

```shell
$ npm i row-mapper
```
## Use case

Assume you have a table named `users` and you want to move all its rows to another table named `archived_users`, which could reside in another database. While you move the data, you might also need to convert each row's data to another format.

Here's how you would do it:
```javascript
'use strict'

const RowMapper = require('row-mapper')

const mapper = new RowMapper({
  // The source database.
  source: {
    // DB settings.
    client: 'pg',
    connection: {
      host: '127.0.0.1',
      user: 'foo',
      password: 'bar_pass',
      database: 'foo_db'
    },
    // The table to get data from.
    table: {
      name: 'users',
      primaryKey: 'id_user'
    }
  },

  /* The target database, where data would be moved to.
     This can be the same as the source DB. */
  target: {
    // DB settings.
    client: 'pg',
    connection: {
      host: '127.0.0.2',
      user: 'foo',
      password: 'foo_pass',
      database: 'bar_db'
    },
    // The table to insert data into.
    table: {
      name: 'archived_users',
      primaryKey: 'id_user',
      /* Name of the sequence associated with the primary key.
         Upon successful completion, this sequence will be set to the
         max value of the primary key column of the target table. */
      primaryKeySequence: 'user_id_user_seq'
    }
  },

  // How many rows to process, per chunk. Defaults to 2000.
  chunkSize: 3000,

  /* This mapper function will run for *each* row of the source table.
     You can convert each row here before you insert it into your
     target table. */
  mapper: (row) => {
    return Object.assign({}, row)
  }
})

// Then just run the mapper.
mapper.start()
```
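The mapper function is just a plain row transform. Here is a minimal, standalone sketch of one, runnable outside row-mapper entirely; the `toArchivedUser` name and the `archived_at` column are illustrative, not part of row-mapper's API:

```javascript
// Hypothetical row transform: copy the source row and stamp it with
// an archival timestamp before it is inserted into the target table.
// (Helper name and archived_at column are illustrative only.)
function toArchivedUser (row) {
  return Object.assign({}, row, {
    archived_at: new Date().toISOString()
  })
}

const original = { id_user: 1, name: 'Ada' }
const archived = toArchivedUser(original)

console.log(archived.id_user) // 1
console.log(typeof archived.archived_at) // 'string'
```

Returning a new object (rather than mutating `row`) keeps the transform side-effect free, which makes it safe to re-run on the same rows.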
## Features

- You can convert huge tables. This module processes rows in chunks, so it avoids a lot of network round trips while keeping memory usage low, since it doesn't load the whole table in memory.
- Successfully inserted/mapped rows are saved in a cache. When you start processing again, the module will skip processed rows and continue from the last known successful row.
## Tests

```shell
$ npm test
```

**Important:** Tests are run against an in-memory SQLite database and do not reset the table PK sequences after they are done.
## Authors

## License

MIT