Warehouse



    A storage and developer workflow engine for enforcing arbitrary checks on ontologies of npm packages.


    The goal of the Warehouse is to support modular UI development by:

    • Guaranteeing new modules from npm publish conform to a set of Checks configurable by tag values in the package.json.
    • Serving fully built assets (e.g. JavaScript, CSS, etc) for published modules through a CDN target.
    • Decoupling upgrade and rollback of individual dependencies across multiple environments through use of npm dist-tag.
    • Automatically updating built assets when new versions of dependent modules are published.

    In other words, the Warehouse is designed to give as many programmatic guarantees as possible that it is safe to "always be on latest," and to make rolling back as painless as possible when issues arise.

    Developer Experience

    The Warehouse was created with specific conventions around how developers release code:

    1. Front-end code is built to be modular by design.
    2. A module must be up-to-date with the latest versions of its dependencies.
    3. Each module is released using npm publish.
    4. Each module is released to production using npm dist-tag.

    Releasing code

    Stability: 2 – Stable

    The release process for any module using the Warehouse is:

    1. Add the following publishConfig to your package.json:

       "publishConfig": {
         "registry": ""
       }

    2. Publish the module@version, which releases it to your DEV environment:

       cd /path/to/my/front-end/module
       npm publish

    3. Perform any manual QA in your DEV environment.
    4. Promote the module@version to production using npm dist-tag add:

       npm dist-tag add module@version prod

    NOTE: In order to publish to the Warehouse you must add the following to your .npmrc. Authorization information is stubbed so that the npm client itself actually makes the publish request instead of throwing an error before it even tries.


    NOTE: You may also need to set strict-ssl to false if you do not configure SSL termination for your Warehouse instance:

    npm c set strict-ssl false

    Rolling back to previous versions

    Stability: 2 – Stable

    The act of rolling back to a previous version takes two forms in the Warehouse:

    1. Rolling back a top-level module: if a module has no dependents (i.e. nothing depends on a given module) then that module is considered "top-level". In this case a rollback in a specific environment will use the previous build of the version being rolled back to with no other side-effects.
    2. Rolling back a module depended on by other modules: if a module has dependents (i.e. other modules depend on a given module) then rolling back to a previous version in a specific environment will trigger builds for all dependent modules.

    Rollback is performed using npm dist-tag. For example if my-module has a production version of 1.0.5:

    npm view my-module dist-tags
    { latest: '1.0.5',
      devtest: '1.0.5',
      production: '1.0.5' }

    And we wish to roll back production to 1.0.4, then:

    npm dist-tag add my-module@1.0.4 production

    This will trigger a build of my-module@1.0.4 (since its dependencies may have changed since 1.0.4 was latest in production) and a build of all modules that depend on my-module.

    Auto-update of builds

    Stability: 1 – Unstable

    The first (and most important) point developers need to be aware of is that builds from the Warehouse are always against the latest version of private dependencies tagged for that particular environment.

    In other words, the version specified in the package.json may not match the version used in builds, by design. For example, if a module has the following dependencies:

      "dependencies": {
        "private-dep-1": "1.0.x",
        "private-dep-2": "1.2.x",
        "private-dep-3": "~1.7.5",
        "public-dep-1": "1.0.x",
        "public-dep-2": "1.2.x"
      }

    And the latest versions tagged with "production" and "devtest" are, respectively:



    Then the build the Warehouse returns for your module will include those dependencies.
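    The resolution rule described above can be sketched as follows. This is a minimal illustration, not the Warehouse's actual implementation; the tag data and version numbers are made up:

    ```javascript
    // Illustrative dist-tag state per environment (assumed, not real registry data).
    const envTags = {
      production: { 'private-dep-1': '1.0.9',  'private-dep-2': '1.2.3' },
      devtest:    { 'private-dep-1': '1.0.11', 'private-dep-2': '1.2.4' }
    };

    // For private dependencies the environment's dist-tag wins over the
    // package.json range; public dependencies keep their declared range.
    function resolve(dep, range, env, isPrivate) {
      return isPrivate && envTags[env] && envTags[env][dep]
        ? envTags[env][dep] // build against the env-tagged latest
        : range;            // public deps follow the semver range as usual
    }

    console.log(resolve('private-dep-1', '1.0.x', 'devtest', true));    // → 1.0.11
    console.log(resolve('public-dep-1', '1.0.x', 'production', false)); // → 1.0.x
    ```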


    Checks

    Stability: 2 – Stable

    Checks are always part of a suite of checks or "check suite". Checks are a way to ensure that a set of modules conform to a set of requirements programmatically. For example:

    • Must depend on react@0.13.x.
    • File specified for main in a package.json must exist in the package.
    • Jenkins build for master of this module must be passing and have the same git SHA.

    Writing a new Check

    Who are checks designed to be written by? You! Me! Developers! Checks are meant to be an accessible way to add more programmatic guarantees about modules being published.

    A Check is simply a function which is given a package buffer and a callback to execute when complete. For example, we could check to see if examples.js is defined on a given package.


    module.exports = function (buffer, next) {
      console.log(buffer.pkg);   // JSON parsed package.json contents.
      console.log(buffer.files); // Fully-read file contents for all files.
      if (!buffer.files['examples.js']) {
        return next(new Error('Missing required examples.js file'));
      }
      next();
    };
    Now that we've written our check we need to expose it in a check suite. We can do this in one of two ways:

    1. Add to an existing check suite: if a suitable suite already exists, simply add your check-*.js file to it and publish the suite to expose your check.

    Checks, suites and package.json tags

    The relationship between checks, suites, and package.json tags is straightforward, but important for understanding which checks will run when you npm publish a given module.

    1. Every Check is part of a check suite.
    2. Check suites are executed on npm publish.
    3. Tags on a module's package.json determine which check suite(s) will run.
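    The three rules above can be sketched as a simple lookup. Note that the suite names and the use of package.json "keywords" as the tags are assumptions for illustration; only the tag-to-suite relationship comes from this README:

    ```javascript
    // Hypothetical mapping from package.json tags to check suites.
    const suitesByTag = {
      'ui-component': ['check-main-exists', 'check-react-version'],
      'frontend':     ['check-main-exists']
    };

    // Collect the unique suites for every tag the package declares.
    function suitesFor(pkg) {
      const tags = pkg.keywords || [];
      return [...new Set(tags.flatMap(tag => suitesByTag[tag] || []))];
    }

    console.log(suitesFor({ name: 'my-module', keywords: ['ui-component'] }));
    // → [ 'check-main-exists', 'check-react-version' ]
    ```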

    API documentation

    The Warehouse implements four distinct APIs over HTTP:

    • npm wire protocol: This is the HTTP API that the npm CLI client speaks. This allows the Warehouse to be a publish and install target for the npm CLI client itself. The wire protocol is implemented in two ways:

      • Overridden routes: These are routes that the warehouse itself has reimplemented to ensure that builds are fresh and that modules are installed from the correct environment.
      • npm proxying: before any 404 is served, the request is first proxied over HTTP(S) to the npm URL specified in the configuration.
    • Assets & Builds: Creating ad-hoc builds, fetching builds and assets (based on fingerprint), and when necessary finding builds for a particular version or environment or both.

    • Checks: Lists checks that would run for a package, runs checks against packages ad-hoc, and gets stats about checks run against packages.

    • All routes can return debugging information via the ?debug=* query parameter. This overrides the output of your request and returns, as JSON, all logged output for that request, the response headers that were intended to be sent back, and the content that was sent back.

    npm wire protocol

    The following routes from the npm wire protocol are implemented:

    PUT    /:pkg                          # Publish a package
    GET    /:pkg                          # Install a package
    DELETE /:pkg/-rev/:rev                # Unpublish a package
    GET    /-/package/:pkg/dist-tags/     # Get all dist-tags
    PUT    /-/package/:pkg/dist-tags/     # Update a dist-tag
    POST   /-/package/:pkg/dist-tags/     # Set all dist-tags
    GET    /-/package/:pkg/dist-tags/:tag # Get a dist-tag
    PUT    /-/package/:pkg/dist-tags/:tag # Update a dist-tag
    POST   /-/package/:pkg/dist-tags/:tag # Set a dist-tag
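    The dist-tag routes above can be addressed with a small URL builder. This is a sketch against the listed paths only; the registry host is a placeholder assumption:

    ```javascript
    // Build a URL for the dist-tag wire-protocol routes listed above.
    // Omitting `tag` targets the collection route (/dist-tags/).
    function distTagUrl(registry, pkg, tag) {
      const base = registry + '/-/package/' + encodeURIComponent(pkg) + '/dist-tags/';
      return tag ? base + encodeURIComponent(tag) : base;
    }

    // Placeholder host; substitute your Warehouse registry URL.
    console.log(distTagUrl('https://warehouse.example.com', 'my-module'));
    // → https://warehouse.example.com/-/package/my-module/dist-tags/
    console.log(distTagUrl('https://warehouse.example.com', 'my-module', 'prod'));
    // → https://warehouse.example.com/-/package/my-module/dist-tags/prod
    ```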

    The rest of the requests related to the npm wire protocol will be sent to the npm read or write URL specified in the configuration.

    Assets & Builds API

    GET  /assets/:hash                    # Fetch an actual build file, will detect Accept-Encoding: gzip
    GET  /builds/:pkg                     # Get build information
    GET  /builds/:pkg/:env/:version       # Get build information
    GET  /builds/:pkg/:env/:version/meta  # Get build information
    POST /builds/:pkg                     # Ad-hoc build
    POST /builds/compose                  # Trigger multiple builds

    Checks API

    GET  /checks/:pkg                     # Get all checks that would run
    POST /checks/:pkg/run                 # Run checks ad-hoc for a package
    GET  /checks/:pkg/stats               # Get stats for checks run

    Environment-specific installation

    Warehouse allows for installation against a specific dist-tag via the REGISTRY-ENVIRONMENT header. Although npm does not allow headers to be set directly, carpenterd sets this header internally during its install process.

    This is how multiple versions live and are built side-by-side in the same registry namespace. Without this nuance, the latest npm dist-tag would be installed by default everywhere, including carpenterd.

    Future extensions to this header-only API are planned:

    GET /env/:pkg  # Install a package against a specified "environment" (i.e. `dist-tag`)
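    Since npm itself cannot set the header, a tool installing programmatically would attach it to the request, roughly as sketched below. The host is a placeholder; only the header name and its meaning come from this README:

    ```javascript
    // Build request options that pin an installation to an environment
    // via the REGISTRY-ENVIRONMENT header (as carpenterd does internally).
    function installRequest(pkg, env) {
      return {
        host: 'warehouse.example.com', // placeholder registry host
        path: '/' + encodeURIComponent(pkg),
        headers: { 'registry-environment': env } // selects which dist-tag to install from
      };
    }

    console.log(installRequest('my-module', 'devtest').headers['registry-environment']);
    // → devtest
    ```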

    The purpose of this section is to document important internals, conventions and patterns used by the Warehouse.

    Data Models

    Currently the data models defined by the Warehouse are:

    • Build
    • BuildFile
    • BuildHead
    • Dependent
    • Version
    • Package

    They are documented individually in warehouse-models.

    Config options

      npm: {
        urls: {
          "auth-argument-factory": "/path/to/custom-auth.js",
          read: '',
          write: ''
        },
        checkScript: 'path/to/a/forkable/script',
        cluster: {
          // Gjallarhorn related options
          gid: 0,
          uid: 0
        },
        // NpmVerifyStream related options
        concurrency: 5,
        cleanup: true,
        read: { log: /* a debug logger */ }
      }

    Warehouse has the ability to use passport-npm to check authorization when connecting via npm. An example of this can be found in the tests for npm auth.

    Local development environment

    Running locally requires carpenterd to run locally or at the configured location, as builds will run in carpenterd. The Warehouse can then be started using:

    npm start

    Running tests

    Running the tests requires a running Cassandra instance on your local machine. All tests are written with mocha and istanbul. They can be run with npm:

    npm test
    LICENSE: MIT (C) 2015 Operating Company, LLC

