

    0.2.10 • Public • Published



    Install with npm:

    $ npm install sastre --save

    Or yarn:

    $ yarn add sastre

    Quick start

    Please see the example file for a reference integration which includes:

    • Support for module-alias
    • Container interface for models:
      • User — Sequelize model
      • Token — ES6 class definition
    • Container interface for controllers:
      • UserController — GRPC controller

    You can also execute npm run test:example to run its unit tests.

    How does it work?

    The Resolver class will turn directories containing index.js files into object definitions.

    $ tree example/src/api/models
    ├── Token
    │   └── index.js
    └── User
        ├── classMethods
        │   └── add
        │       ├── index.js
        │       ├── index.test.js
        │       └── provider.js
        └── index.js
    4 directories, 5 files

    As a result you'll get approximately the following module definition:

    const Token = require('./Token');
    const add = require('./User/classMethods/add');
    module.exports = {
      Token,
      User: {
        classMethods: {
          add,
        },
      },
    };

    • Index files are the entry points; even with tons of files, only index files are taken into account to conform the object shape.
    • Entry points can be skipped on intermediate modules like classMethods.

    Nesting modules has no depth limit; you can produce almost any kind of object using this technique.

    Since 0.1.1, module and method names are converted from dash-case to PascalCase and camelCase respectively. This helps alleviate issues with case-sensitivity on some environments, especially when doing cross-environment development, e.g. Linux vs. macOS.
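    The conversion rules can be pictured with a small standalone sketch (an illustration only, not sastre's internal code):

    ```javascript
    // Illustrative sketch of the dash-case conversion rules (not sastre's code):
    const toPascalCase = name =>
      name.replace(/(?:^|-)(\w)/g, (_, c) => c.toUpperCase());

    const toCamelCase = name => {
      const pascal = toPascalCase(name);
      return pascal.charAt(0).toLowerCase() + pascal.slice(1);
    };

    // A directory `user-controller` becomes the module name `UserController`,
    // while a method directory `class-methods` becomes `classMethods`:
    console.log(toPascalCase('user-controller')); // → 'UserController'
    console.log(toCamelCase('class-methods'));    // → 'classMethods'
    ```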


    An instantiated Resolver becomes a container instance, you name it:

    const { Resolver } = require('sastre');
    const optionalHooks = {
      before(name, definition) {},
      after(name, definition) {},
    };
    const sourcesDirectory = `${__dirname}/lib`;
    const container = new Resolver(null, sourcesDirectory, optionalHooks);

    You can access found modules through the get() method:

    const Test = container.get('Test');

    From this point you can start hacking happily.


    The Injector class is used to flag any entry point that demands extra dependencies.

    Any found provider.js files will be used to retrieve dependencies from the container:

    module.exports = {
      getAnything() { /* yes, it's empty; keep reading ;-) */ },
      getFromContainer() {
        // `this` is null here due to `new Resolver(null, ...)`
        return this;
      },
    };

    • Function names are used to retrieve dependencies from the container.
    • It's OK if they're empty, but if you return any truthy value it'll be used in place.
    • Provider functions can access the context given on new Resolver() instantiation.
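    Putting the three rules together, a provider.js could look like this sketch (the names getDatabase, getLogger, and getContext are assumptions for illustration, not from the example project):

    ```javascript
    // provider.js — hedged sketch of the three resolution rules above:
    module.exports = {
      // Empty body: `Database` is looked up in the container by name.
      getDatabase() {},

      // Truthy return value: used in place of a container lookup.
      getLogger() {
        return { info: msg => process.stdout.write(`${msg}\n`) };
      },

      // `this` is whatever context was passed to `new Resolver(context, ...)`.
      getContext() {
        return this;
      },
    };
    ```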

    Now, index.js MUST export a function factory which will only receive the resolved dependencies.

    module.exports = ({ Anything }) =>
      function something() {
        return new Anything();
      };
    Please use arrow functions to inject values, as they're detected for this purpose; regular functions will not work.

    Any returned value is placed into the final module definition; the factory's lexical scope is what makes this DI pattern work.

    This works for any kind of method, even those on the prototype.
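    For example, a factory for a single method could look like the sketch below (Mailer and sendWelcome are assumed names for illustration):

    ```javascript
    // Hedged sketch: a factory whose closure captures an injected dependency.
    module.exports = ({ Mailer }) =>
      function sendWelcome(user) {
        // `Mailer` was resolved by the container and is kept in lexical scope.
        return new Mailer().deliver(user.email);
      };
    ```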


    Resolved modules can also be modified or replaced by custom decorators.


    Once Resolver.scanFiles() is done, use this hook to modify or replace the original definition received.

    During this hook you can access the original class, if given, without instantiating it.


    Once a definition is unwrapped through container.get() and still unlocked, use this hook to modify or replace the final definition built.

    Here you can access the instantiated class, if given, which is partially injected. In this step you can perform property injection based on your needs.
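    A pair of hooks could look like this sketch (that a returned value replaces the definition is assumed here from the description above, not verified against sastre's source):

    ```javascript
    // Hedged sketch of decorator hooks passed to `new Resolver()`:
    const hooks = {
      before(name, definition) {
        // After scanFiles(): wrap or replace the raw definition.
        return { ...definition, decoratedBy: 'before-hook' };
      },
      after(name, instance) {
        // After container.get(): property injection on the built instance.
        instance.createdAt = Date.now();
        return instance;
      },
    };
    ```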


    Why module-alias/register?

    The main goal of this library is to get rid of require calls from a well-known architecture we use.

    We're also exploring the benefits and pitfalls of splitting everything into submodules.

    But loading parent stuff for common fixtures is still an issue, e.g. ../../../../

    It should be enough to require('@src/models/User/fixtures'), shouldn't it?

    Why get<WHATEVER>?

    E.g. you want a Session object within a method.

    Using Session() { ... } is not clear enough, because Session() says nothing.

    Instead, getSession() { ... } helps to understand the purpose of the method... effectively.

    The get prefix is always stripped out, so you can use Session or getSession and the identifier will remain the same. This is useful if you want to get simple values instead, e.g. serviceLocator() { ... }
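    The stripping rule can be pictured with a one-liner sketch (an illustration, not sastre's internal code):

    ```javascript
    // Illustrative sketch of the `get` prefix stripping (not sastre's code):
    const identifier = name =>
      /^get[A-Z]/.test(name) ? name.slice(3) : name;

    console.log(identifier('getSession'));     // → 'Session'
    console.log(identifier('Session'));        // → 'Session'
    console.log(identifier('serviceLocator')); // → 'serviceLocator'
    ```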

    How to compose stuff?

    A nice way to build complex dependencies through provider.js files is:

    module.exports = {
      userRepo({ User, Repository, SequelizeDatasource }) {
        const dbUser = new SequelizeDatasource(User);
        const repo = new Repository(dbUser);
        return repo;
      },
      getUser() {},
      getRepository() {},
      getSequelizeDatasource() {},
    };

    And then you can inject them, e.g.

    module.exports = ({ userRepo }) =>
      async function getUsers() {
        return userRepo.findAll();
      };

    How to inject classes?

    The included example implements an advanced container for higher IoC composition.

    ES6 classes are automatically decorated to receive all provided dependencies.

    You can access other containers through the root container; it is given as this on all providers.

    What are root-providers?

    Individual provider.js files are intended to decorate in-place methods or whole containers.

    So, placing provider.js files in the same directory where modules are scanned is enough to serve as defaults, e.g.

    $ tree example/src/api/controllers
    ├── UserController
    │   └── index.js
    └── provider.js
    1 directory, 2 files

    What are chainables?

    The Chainable class lets you do things like this:

    const run = new Chainable(null, {
      async test() {
        return Promise.resolve(42);
      },
      anythingElse() {},
    });

    await run($ => $.test.anythingElse());
    await run($ => $.anythingElse.test());
    await run(({ test }) => test.anythingElse());
    await run(({ anythingElse }) => anythingElse.test());

    This helps you chain functions in order; both sync and async functions are supported.
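    A rough sketch of how such ordered chaining could be implemented with a Proxy (an illustration of the idea, not sastre's actual Chainable class):

    ```javascript
    // Hedged sketch: a Proxy records property accesses, then the collected
    // methods run sequentially, awaiting any returned promises.
    const chainable = methods => async fn => {
      const order = [];
      const proxy = new Proxy(function () {}, {
        get(_, name) {
          order.push(name);
          return proxy; // keep collecting: $.a.b.c
        },
        apply() {
          return proxy; // allow the trailing call, e.g. $.a.b()
        },
      });
      fn(proxy);

      const results = [];
      for (const name of order) results.push(await methods[name]());
      return results;
    };

    const run = chainable({
      async test() { return 42; },
      anythingElse() { return 'ok'; },
    });

    run($ => $.test.anythingElse()).then(console.log); // → [ 42, 'ok' ]
    ```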



