Nihilistic Pool Man


    3.2.35 • Public • Published


    This is our library that wraps database calls for MSSQL. We support both task-based and promise-based calls. Task calls still work, but should be considered deprecated.

    It relies on four npm packages.

    1. mssql - used to make promise-based calls to SQL
    2. data.task - used to create a task-oriented workflow
    3. ffErrorHandler - used for error handling and logging
    4. md5 - used to compare versions of the db config file (if you are using one for the user and password)


    npm install @firstfleet/ffsql --save

    Configuration Variables

    Process Environment Variables (process.env)

    | Key | Description | Default Value |
    | --- | --- | --- |
    | PAPERTRAIL_PROGRAM | App name, used in logging | 'UNKNOWN' |
    | PAPERTRAIL_HOST | Host for syslog transport (using Papertrail) | undefined |
    | PAPERTRAIL_PORT | Port for syslog transport (using Papertrail) | undefined |
    | NODE_ENV | Node environment setting, used to control where we log; console logs are visible when not in production | undefined |
    | DB_CONFIG_FILE | File path to your db configuration file. This file MAY hold the DB_USER and DB_PASSWORD. If you do not include a file path, the library will look for DB_USER and DB_PASSWORD in the environment variables rather than the file. | undefined |
    | DB_SERVER | Name of the SQL server you wish to connect to | undefined |
    | DB_USER | Username of the SQL account used to access the DB. Not needed if using DB_CONFIG_FILE. | undefined |
    | DB_PASSWORD | Password of the SQL account used to access the DB. Not needed if using DB_CONFIG_FILE. | undefined |
    | DB_DATABASE | Default DB to use when executing procs or queries | undefined |
    | DB_ENABLE_READ_ONLY | Tells the library to try to use a read-only connection when possible. If you wish to use this, see the Using Read Only section below. | false |
    | DB_REQ_TIMEOUT | Request max timeout in ms | 180000 |
    | DB_POOL_MAX | Max number of pools | 50 |
    | DB_READ_ONLY_REFRESH_INTERVAL | How often to refresh the read-only cache in ms | 1000 * 60 * 5 |
    | APP_NAME | Used as a fallback if PAPERTRAIL_PROGRAM is not present | undefined |
    | NODE_HEARTBEAT_INTERVAL_MINUTES | How often to log a heartbeat to the DB, in minutes | undefined |
    | SlackPosterURL | Slack webhook URL; errors will be sent here | undefined |


    There are two ways you can feed the sql account username and password to this library. The first is to simply set DB_USER and DB_PASSWORD in your process.env variables.

    The second is to instead leave DB_USER and DB_PASSWORD undefined and set DB_CONFIG_FILE. DB_CONFIG_FILE should be a file path to a JSON file containing a JSON object with two keys and values:

      "DB_USER": "SQLUsername",
      "DB_PASSWORD": "SQLPassword"

    But why would you want to store them in a text file rather than just using process.env? In some use cases, you may want a process to rotate the password automatically and have the app swap it out without a restart or any other intervention.

    If you choose to use the DB_CONFIG_FILE, the library will automatically monitor the file for changes, and update the password for the sql connection if a change is detected.
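    Since the md5 dependency exists to compare versions of the config file, the change-detection mechanism can be sketched roughly like this (a hypothetical illustration only, not the library's actual internals; `md5Of` and `watchDbConfig` are made-up names):

    ```javascript
    const fs = require('fs');
    const crypto = require('crypto');

    // Hash the file contents so we can cheaply tell whether it changed
    function md5Of(filePath) {
      return crypto.createHash('md5').update(fs.readFileSync(filePath)).digest('hex');
    }

    // Poll the config file; call onChange with the new credentials when the hash moves
    function watchDbConfig(filePath, onChange, intervalMs = 5000) {
      let lastHash = md5Of(filePath);
      return setInterval(() => {
        const hash = md5Of(filePath);
        if (hash !== lastHash) {
          lastHash = hash;
          onChange(JSON.parse(fs.readFileSync(filePath, 'utf8')));
        }
      }, intervalMs);
    }
    ```

    The key point is that the app never needs a restart: the next connection simply picks up the freshly read DB_USER and DB_PASSWORD.
    
    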

    Using Read Only

    To use read-only intents, and have the library automatically call with a read-only intent, you need to create a procedure named RO_GetProcs in your default database (DB_DATABASE). This procedure must return rows with only one column. That column must be named name, and it should contain your read-only procedures.

    Example of RO_GetProcs:

    CREATE PROCEDURE [dbo].[RO_GetProcs] AS
    SELECT LOWER(dbname + '.dbo.' + ProcName) name
    FROM ReadOnlyProcs
    WHERE Active = 1

    This gives you a list of procs you wish to execute in a read-only context; the library will attempt any proc in that list as read only, and will fall back to the regular intent if that fails.
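    The lookup this implies can be sketched as follows (an illustration only; `buildReadOnlySet` and `shouldUseReadOnly` are hypothetical names, not part of the library's API):

    ```javascript
    // Cache the rows returned by RO_GetProcs as a Set of lowercased full names
    function buildReadOnlySet(rows) {
      // rows look like [{ name: 'mydb.dbo.getthings' }, ...]
      return new Set(rows.map(r => r.name.toLowerCase()));
    }

    // Before each proc call, check the cache to choose the connection intent
    function shouldUseReadOnly(readOnlySet, dbName, procName) {
      return readOnlySet.has(`${dbName}.dbo.${procName}`.toLowerCase());
    }
    ```

    Because the cache refreshes on DB_READ_ONLY_REFRESH_INTERVAL, flipping a row's Active flag in ReadOnlyProcs takes effect without a deploy.
    
    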

    Notes on UTC and Dates

    By default, the mssql library assumes any passed-in JavaScript Date objects are in UTC. Because of this, you may find that dates inserted into the database are automatically shifted ahead to UTC. The easiest way to fix this is to pass the option {useUTC: false} in your query.

    This issue can be further complicated when calling the ExecMsBulkInsert method, because it takes the extra step of querying the destination table to learn the column types. When it detects a destination column as datetime, the library aggressively attempts to convert even strings to JavaScript Date objects, and the useUTC option is effectively ignored for strings (it works fine with Date objects). We have logic that alleviates this issue by detecting when the data is a string: the destination is then treated as if it were a simple varchar column and the date conversion is skipped. SQL will then handle converting the string to a date, and it does so without UTC conversion. This logic depends on the call to createBulkTableObject containing the first row of data so it can determine whether it needs to override a column's date type.
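    To see why the shift happens at all, recall that a JavaScript Date stores an absolute instant; the driver decides whether to serialize its UTC or local wall-clock components:

    ```javascript
    // 8:00 AM in the server's local timezone
    const local = new Date(2024, 0, 1, 8, 0, 0);

    // The UTC rendering: unless the local timezone is UTC, the wall-clock
    // time here differs from 8:00 by the timezone offset. This is the value
    // the driver sends by default, which is what {useUTC: false} suppresses.
    console.log(local.toISOString());
    ```
    
    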


    ExecMsProc (procedure, params, options)

    • procedure: name of SQL stored procedure, can include full db path if not in default DB_DATABASE.
    • params: object containing key/value options that must match the parameters of the procedure
    • options
      • firstOnly - if set to true, the method will only return the first record found (instead of an array)
    Special notes
    • If your query result is a single value, it will automatically be unpacked to a simple scalar value (instead of an array)
    const sql = require('@firstfleet/ffsql')
    sql.ExecMsProc('ESig_GetFieldsToSave', {DocID: docId, SignSequence: signSequence}, {firstOnly: true})
    sql.ExecMsProc('ESig_GetEmployeeDocs', {empId: empId})

    ExecMsQuery (queryText, params, options)

    • queryText: Any SQL text. Param placeholders should be in the string prefixed with @ followed by an incrementing integer (similar to .NET PetaPoco syntax)
    • params: an array of parameters that will slide into the queryText placeholders
    • options
      • firstOnly - if set to true, the method will only return the first record found (instead of an array)
    Special notes
    • If your query result is a single value, it will automatically be unpacked to a simple scalar value (instead of an array)
    const sql = require('@firstfleet/ffsql')
    sql.ExecMsQuery(`select count(*) from ESigEmpDocs where EmpId = @0`, [empId], {firstOnly: true});
    sql.ExecMsQuery('update ESigEmpDocs set SignDate = getdate(), SignStatus = @0 where Id = @1', [signStatus, empDocId])

    ExecMsBulkInsert (dbName, tableName, columns, data, options)


    • dbName: Name of the database where the table that you wish to insert into is located.
    • tableName: Name of the table you wish to insert into.
    • columns: array of column names (in order) that data should be inserted into.
    • data: Array of arrays, where each nested array is the values you want to insert, representing a row
    • options: {purgeTableFirst: true/false}. Flag to truncate the table before insert (true === truncate)


    • This method makes use of SQL transactions. If the bulk insert fails, it will rollback the transaction (undoing any potential table purging option).
    • In case of failure, the process will iterate over the data (again with individual transactions) to attempt to find the first data row causing the failure, which will be posted to Slack.
    • The mssql library tends to be very strict on data types and will throw errors if it thinks the dataset does not match the destination column type. We have custom logic to try and mitigate this by automatically converting between strings/numbers/dates ahead of time.
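    That mitigation logic might look roughly like the sketch below (a hypothetical illustration; `coerceForColumn` is not an exported function of this library):

    ```javascript
    // Nudge each value toward the destination column's type ahead of the
    // insert, so the strict mssql driver does not reject the whole row
    function coerceForColumn(value, columnType) {
      if (value === null || value === undefined) return value;
      switch (columnType) {
        case 'number':
          return typeof value === 'string' ? Number(value) : value;
        case 'string':
          return typeof value === 'number' ? String(value) : value;
        case 'date':
          return value instanceof Date ? value : new Date(value);
        default:
          return value;
      }
    }
    ```

    Running every cell through a function like this before handing the rows to the driver trades a little CPU for far fewer type-mismatch failures mid-transaction.
    
    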

    Example (reconstructed; the Express route registration was truncated in the original, so `app.post` is assumed):

    const sql = require('@firstfleet/ffsql')

    app.post('/raddecs', (req, res) => {
        try {
            let data = req.body;
            if (data && data.packets && data.packets.length) {
                const payload = advlib.process(data.packets, PROCESSORS)
                // One "row" of data, where each key is a column name
                const sqlRow = {
                    hexString: data.packets.join('|'),
                    macAddress: data.transmitterId,
                    beaconType: '',
                    txPower: _getTxPower(payload, data),
                    loggedAtUtc: moment.utc().format()
                }
                // Another "row" of data; again, each key is a column name
                const anotherSqlRow = {
                    hexString: data.packets.join('|'),
                    macAddress: data.transmitterId,
                    beaconType: '',
                    txPower: _getTxPower(payload, data),
                    loggedAtUtc: moment.utc().format()
                }
                sql.ExecMsBulkInsert('FFObjects', 'bleData', Object.keys(sqlRow), [sqlRow, anotherSqlRow])
                    .then(() => {
                        // End the request
                        res.sendStatus(200)
                    })
            } else {
                res.sendStatus(400)
            }
        } catch (error) {
            res.sendStatus(500)
        }
    })
    createBulkTableObject (dbName, tableName, columns, firstDataRow)


    • dbName: Name of the database that holds the table that will be used for bulk insertion
    • tableName: Name of the table you are going to bulk insert into
    • columns: Array where each item in the array is a column name (in order) data should be inserted into
    • firstDataRow: (optional) First row of data, used for column type verification


    // This creates a bulk table object we can reuse, rather than having to create it each time
    const bulkTableConfig = await sql.createBulkTableObject('FFObjects', 'PS_hosLogStaging',
        ['PSMessageId', 'LogId', 'LogGUID', 'DriverId', 'TractorId', 'Activity', 'StartTime', 'Duration', 'Edit', 'State', 'Odometer',
            'Distance', 'Location', 'EditReason', 'EventType', 'OriginalLogId', 'TrailerId', 'CoDriver', 'EventTimeTerminal', 'TimezoneTerminal', 'UTCOffset'])
    // Now, you simply have to feed it the data
    // (`events` here is your source array of HOS event objects; the original example elided it)
    const logsToInsert = events
        .map(event => ({
                LogGUID: event.guid,
                DriverId: data.user.external_id,
                TractorId: event.power_unit_number,
                Activity: event.duty_status ? event.duty_status.status : '', // property name illustrative; truncated in the original
                StartTime: time.formatUtc(event.event_timestamp_utc),
                Duration: event.duty_status ? event.duty_status.duration_seconds : 0,
                Edit: event.edit_count === 0 ? 0 : 1,
                State: event.location_state_code,
                Odometer: event.odometer_total_decimal,
                Distance: event.odometer_distance,
                Location: event.location_description,
                EditReason: event.edit_reason,
                EventType: HOSEventLookup.getHosEventType(event.event_type, event.event_code),
                OriginalLogId: event.origin_guid,
                TrailerId: data.attributes.trailer_numbers.length > 0 ? data.attributes.trailer_numbers.join(',') : undefined,
                CoDriver: data.attributes.co_drivers.length > 0 ? parseCoDrivers(data.attributes.co_drivers) : undefined,
                EventTimeTerminal: time.formatLocalTimestamp(event.event_timestamp_terminal),
                TimezoneTerminal: event.event_timezone,
                UTCOffset: event.event_utc_offset
        }))
    await sql.ExecMsBulkInsert('', '', '', logsToInsert, {tableConfig: bulkTableConfig});


    Each of the exposed module methods returns a Task from data.task. You need to fork the task in order to get your Task.rejected(error) or your Task.of(data).

    Require the module
    const sql = require('@firstfleet/ffsql')
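    If you are unfamiliar with data.task, the fork pattern can be illustrated with a minimal stand-in (this is a sketch of the shape only, not data.task itself):

    ```javascript
    // A Task wraps a computation that is not run until fork is called
    const Task = computation => ({
      fork: (onRejected, onResolved) => computation(onRejected, onResolved)
    });
    const TaskOf = value => Task((_rej, res) => res(value));      // like Task.of(data)
    const TaskRejected = err => Task((rej, _res) => rej(err));    // like Task.rejected(error)

    // Forking is how you actually run the task and receive the outcome
    TaskOf(['row1', 'row2']).fork(
      err => console.error(err),
      data => console.log(data)
    );
    ```

    The library's methods are forked the same way: the first callback receives the rejection, the second receives the data.
    
    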


    sql.ReturnSqlResults(procedure, params)

    Takes in a procedure name {string} and params {Object}. Returns a Task.rejected(error) || Task.of(data)


    sql.ReturnSqlResults("Database.dbo.getSqlRecords", {user: userId, option: 1})


    sql.ExecSql(procedure, params)

    Takes in a procedure name {string} and a params {Object}. Returns either a Task.rejected(error) || Task.of('Success'). Used to execute a SQL procedure when you are not expecting any data to return.


    sql.ExecDynamicSql(procdedure || sql string, params)

    Takes in either a SQL procedure or a string of SQL text like "select * from table". Returns either a Task.rejected(error) || Task.of(data)



