# pg-ez - node-postgres made easy
`node-postgres` is not particularly difficult to use and has a well-documented API, but as that documentation states, "node-postgres strives to be low level and un-opinionated." `pg-ez`, on the other hand, strives to be high-level and (more) opinionated, allowing you to get up and querying within seconds rather than minutes. It uses best practices as recommended by `node-postgres`, so you don't have to concern yourself with things like releasing clients back to a pool.
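For comparison, here is a sketch of the checkout/release pattern that `node-postgres` recommends when querying through a pool, and which `pg-ez` performs for you. The helper name and query are illustrative, not part of either library:

```javascript
// Sketch of the pool checkout/release dance that pg-ez handles internally.
// `pool` is assumed to be a pg.Pool instance; queryAndRelease is a
// hypothetical helper, not part of pg or pg-ez.
async function queryAndRelease(pool, text, values) {
  const client = await pool.connect(); // check a client out of the pool
  try {
    return await client.query(text, values);
  } finally {
    client.release(); // always hand the client back, even if the query threw
  }
}
```

Forgetting that `release()` call leaks a client from the pool; hiding this bookkeeping is the point of a higher-level wrapper.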
## Installation

```sh
npm install pg-ez
```
## Testing

```sh
npm test
```

The tests exercise `pg-ez`'s integration with `node-postgres` but run quickly. They require you to have defined environment variables for `PGUSER`, `PGPASSWORD`, `PGHOST`, `PGPORT`, and `PGDATABASE`.
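For example, you might define them like this before running the tests (the variable names come from the list above; the values are placeholders):

```shell
# Placeholder values; substitute your own credentials
export PGUSER=admin
export PGPASSWORD=sekrit
export PGHOST=localhost
export PGPORT=5432
export PGDATABASE=mydb
npm test
```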
## Documentation

### Establishing a database connection

Requiring `pg-ez` and establishing a database connection is done in a single line. Like `pg`, you can pass to it a connection string or a connection object; if you pass neither, `pg-ez` will, like `pg`, try to establish a connection using environment variables.
Example 1: passing in a connection string

```js
const db = require('pg-ez')('postgresql://admin:sekrit@localhost:5432/mydb');
```
Example 2: passing in a connection object

```js
const db = require('pg-ez')({
  user: 'admin',
  password: 'sekrit',
  host: 'localhost',
  port: 5432,
  database: 'mydb'
});
```
Example 3: passing in nothing

```js
// NOTE: requires that there are defined environment variables for
// PGUSER, PGPASSWORD, PGHOST, PGPORT, and PGDATABASE
const db = require('pg-ez')();
```
### Queries

Querying in `pg-ez` is nearly the same as querying in `pg`: simply call the `exec` method and pass to it a query string and parameters, or pass to it a query configuration object. Like `pg`, `pg-ez` supports 3 flavors of asynchronous querying: `async`/`await`, promises, and callbacks.
Example 1: using `async`/`await`

```js
(async () => {
  try {
    // illustrative query
    const result = await db.exec('SELECT * FROM users WHERE id = $1', [1]);
    console.log(result.rows);
  } catch (err) {
    console.error(err);
  }
})();
```
Example 2: using promises

```js
// illustrative query
db.exec('SELECT * FROM users WHERE id = $1', [1])
  .then(result => console.log(result.rows))
  .catch(err => console.error(err));
```
Example 3: using callbacks

```js
// illustrative query
db.exec('SELECT * FROM users WHERE id = $1', [1], (err, result) => {
  if (err) {
    console.error(err);
  } else {
    console.log(result.rows);
  }
});
```
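The query configuration object mentioned above follows the `node-postgres` shape of `text` plus `values`; a minimal sketch (the table name and the pass-through to `exec` are assumptions):

```javascript
// A query config object in the node-postgres style: the SQL text plus
// its parameter values. The table name is hypothetical.
const query = {
  text: 'SELECT * FROM users WHERE id = $1',
  values: [42]
};

// Assumed pg-ez usage: the config object is passed straight to exec.
// db.exec(query).then(result => console.log(result.rows));
```

Bundling the text and values in one object is handy when the same parameterized query is built in one place and executed in another.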
### Streams

Big data can bring big problems. If you have a query yielding millions of rows, you probably don't want to pull the entire result set into memory and thereby spike your memory usage. Streams to the rescue! The `stream` method returns a native promise, not a stream; however, this particular promise supports a `pipe` method, allowing you to pass data through and chain together pipes just as though you were dealing with a stream. An error thrown at any point in the pipeline will propagate and can be caught (as any promise error can be) with a `catch` method (if using promises) or a `try`/`catch` block (if using `async`/`await`).
Example 1: stream JSON transform of results to `http` response

```js
// assumes the JSONStream package is installed
const JSONStream = require('JSONStream');
const http = require('http');

http.createServer((req, res) => {
  // illustrative query
  db.stream('SELECT * FROM users')
    .pipe(JSONStream.stringify())
    .pipe(res);
}).listen(3000);
```
Example 2: stream comma-delimited transform of results to CSV file

```js
// assumes a CSV transform stream such as the csv-write-stream package
const csvWriter = require('csv-write-stream');
const csvStream = csvWriter({ headers: ['x', 'y'] });
const fs = require('fs');
const fileStream = fs.createWriteStream('results.csv');

// illustrative query and file name
db.stream('SELECT x, y FROM points')
  .pipe(csvStream)
  .pipe(fileStream);
```
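As a sketch of the error propagation described above, a single `catch` at the end of the pipeline picks up a failure from any stage. The helper function and query are hypothetical, and the pipe-returns-a-chainable behavior is assumed from this section's description:

```javascript
// Hedged sketch: one catch handles an error thrown anywhere in the pipeline.
// `db` is assumed to be a pg-ez connection as in the examples above.
function streamWithErrorHandling(db, transform, dest) {
  return db.stream('SELECT * FROM big_table') // hypothetical query
    .pipe(transform)
    .pipe(dest)
    .catch(err => {
      // errors from the query, the transform, or the destination all land here
      console.error('pipeline failed:', err.message);
      return null;
    });
}
```

This mirrors ordinary promise chaining: no matter which pipe stage rejects, control jumps straight to the trailing `catch`.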
### Transactions

Transactions are implemented intuitively: simply wrap all your desired statements within a `transaction` "block." The `transaction` method returns a native promise, so you can do follow-up processing with `then()`, or you can use `await` if your `transaction` invocation is inside an `async` function. An error in any query within the transaction block will automatically trigger a rollback, but because `transaction` returns a promise, you can `catch` the error to perform additional error handling.
Example 1: using `async`/`await`

```js
(async () => {
  try {
    // NOTE: the statements and their format here are illustrative
    await db.transaction([
      "INSERT INTO accounts (balance) VALUES (100)",
      "UPDATE accounts SET balance = balance - 50 WHERE id = 1"
    ]);
    console.log('transaction committed');
  } catch (err) {
    // any failed statement triggers an automatic rollback
    console.error(err);
  }
})();
```
Example 2: using promises

```js
// NOTE: the statements and their format here are illustrative
db.transaction([
  "INSERT INTO accounts (balance) VALUES (100)",
  "UPDATE accounts SET balance = balance - 50 WHERE id = 1"
])
  .then(() => console.log('transaction committed'))
  .catch(err => console.error(err));
```