node-neural-network

    Node-neural-network is a JavaScript neural network library for Node.js and the browser. Its generalized algorithm is architecture-free, so you can build and train basically any type of first-order or even second-order neural network architecture. It's based on Synaptic.

    This library includes a few built-in architectures like multilayer perceptrons, multilayer long short-term memory networks (LSTM), liquid state machines, and Hopfield networks, as well as a trainer capable of training any given network. It also includes built-in training tasks/tests like solving an XOR, completing a Distracted Sequence Recall task, or an Embedded Reber Grammar test, so you can easily test and compare the performance of different architectures.

    The algorithm implemented by this library has been taken from Derek D. Monner's paper:

    A generalized LSTM-like training algorithm for second-order recurrent neural networks

    References to the equations in that paper are included as comments throughout the source code.

    Introduction

    If you have no prior knowledge about Neural Networks, you should start by reading this guide.

    Demos

    The source code of these demos can be found in this branch.

    Getting started

    Overview

    Installation

    In node

    You can install lucasBertola/node-neural-network with npm:

    npm install node-neural-network --save

    In the browser

    Just include the file NodeNeuralNetwork.min.js from the /dist directory with a script tag in your HTML:

    <script src="NodeNeuralNetwork.min.js"></script>

    Usage

    var NodeNeuralNetwork = require('node-neural-network'); // this line is not needed in the browser
    var Neuron = NodeNeuralNetwork.Neuron,
        Layer = NodeNeuralNetwork.Layer,
        Network = NodeNeuralNetwork.Network,
        Trainer = NodeNeuralNetwork.Trainer,
        Architect = NodeNeuralNetwork.Architect;

    Now you can start to create networks, train them, or use built-in networks from the Architect.
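
    For example, here is a minimal sketch using one of the built-in Architect networks. It assumes that Architect.Perceptron takes the layer sizes as constructor arguments and that the Trainer works as in Synaptic, which this library is based on:

    // continuing from the snippet above (Architect and Trainer are already required)
    // assumption: Architect.Perceptron(input, hidden, output) takes the layer sizes, as in Synaptic
    var myNetwork = new Architect.Perceptron(2, 3, 1);
    var myTrainer = new Trainer(myNetwork);
     
    myTrainer.XOR();             // train on the built-in XOR task
    myNetwork.activate([1, 0]);  // should return a value close to 1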

    Gulp Tasks

    • gulp: runs all the tests and builds the minified and unminified bundles into /dist.
    • gulp build: builds the bundle: /dist/NodeNeuralNetwork.js.
    • gulp min: builds the minified bundle: /dist/NodeNeuralNetwork.min.js.
    • gulp debug: builds the bundle /dist/NodeNeuralNetwork.js with sourcemaps.
    • gulp dev: same as gulp debug, but watches the source files and rebuilds when any change is detected.
    • gulp test: runs all the tests.

    Examples

    Perceptron

    This is how you can create a simple perceptron:

    function Perceptron(input, hidden, output)
    {
        // create the layers
        var inputLayer = new Layer(input);
        var hiddenLayer = new Layer(hidden);
        var outputLayer = new Layer(output);
     
        // connect the layers
        inputLayer.project(hiddenLayer);
        hiddenLayer.project(outputLayer);
     
        // set the layers
        this.set({
            input: inputLayer,
            hidden: [hiddenLayer],
            output: outputLayer
        });
    }
     
    // extend the prototype chain
    Perceptron.prototype = new Network();
    Perceptron.prototype.constructor = Perceptron;

    Now you can test your new network by creating a trainer and teaching the perceptron to learn XOR:

    var myPerceptron = new Perceptron(2,3,1);
    var myTrainer = new Trainer(myPerceptron);
     
    myTrainer.XOR(); // { error: 0.004998819355993572, iterations: 21871, time: 356 }
     
    myPerceptron.activate([0,0]); // 0.0268581547421616
    myPerceptron.activate([1,0]); // 0.9829673642853368
    myPerceptron.activate([0,1]); // 0.9831714267395621
    myPerceptron.activate([1,1]); // 0.02128894618097928
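
    The trainer is not limited to the built-in tasks. As a rough sketch, assuming this library keeps Synaptic's Trainer train(trainingSet, options) signature and options, the same perceptron can also be trained on an explicit training set:

    // assumption: Trainer#train and its options (rate, iterations, error, shuffle, log)
    // behave as in Synaptic, which this library is based on
    var trainingSet = [
        { input: [0, 0], output: [0] },
        { input: [0, 1], output: [1] },
        { input: [1, 0], output: [1] },
        { input: [1, 1], output: [0] }
    ];
     
    myTrainer.train(trainingSet, {
        rate: 0.1,          // learning rate
        iterations: 20000,  // maximum number of iterations
        error: 0.005,       // stop once the error drops below this threshold
        shuffle: true,      // shuffle the training set on every iteration
        log: 1000           // log the error every 1000 iterations
    });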

    Long Short-Term Memory

    This is how you can create a simple long short-term memory network with input gate, forget gate, output gate, and peephole connections:

    function LSTM(input, blocks, output)
    {
        // create the layers
        var inputLayer = new Layer(input);
        var inputGate = new Layer(blocks);
        var forgetGate = new Layer(blocks);
        var memoryCell = new Layer(blocks);
        var outputGate = new Layer(blocks);
        var outputLayer = new Layer(output);
     
        // connections from the input layer (keep references to the connections that will be gated)
        var inputConnection = inputLayer.project(memoryCell);
        inputLayer.project(inputGate);
        inputLayer.project(forgetGate);
        inputLayer.project(outputGate);
     
        // connection from the memory cell to the output layer
        var outputConnection = memoryCell.project(outputLayer);
     
        // self-connection of the memory cell
        var selfConnection = memoryCell.project(memoryCell);
     
        // peepholes
        memoryCell.project(inputGate);
        memoryCell.project(forgetGate);
        memoryCell.project(outputGate);
     
        // gates
        inputGate.gate(inputConnection, Layer.gateType.INPUT);
        forgetGate.gate(selfConnection, Layer.gateType.ONE_TO_ONE);
        outputGate.gate(outputConnection, Layer.gateType.OUTPUT);
     
        // input to output direct connection
        inputLayer.project(outputLayer);
     
        // set the layers of the neural network
        this.set({
            input: inputLayer,
            hidden: [inputGate, forgetGate, memoryCell, outputGate],
            output: outputLayer
        });
    }
     
    // extend the prototype chain
    LSTM.prototype = new Network();
    LSTM.prototype.constructor = LSTM;

    These examples are for explanatory purposes; the Architect already includes Multilayer Perceptron and Multilayer LSTM network architectures.
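
    For instance, assuming the Architect constructors mirror Synaptic's (Architect.LSTM takes the input size, one or more memory block sizes, and the output size) and that the Distracted Sequence Recall task is exposed as trainer.DSR(), an equivalent built-in LSTM can be created and trained like this:

    // assumption: Architect.LSTM(input, ...memoryBlocks, output) and trainer.DSR(), as in Synaptic
    var myLSTM = new Architect.LSTM(2, 6, 1);
    var myTrainer = new Trainer(myLSTM);
     
    myTrainer.DSR(); // built-in Distracted Sequence Recall training task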

    Contribute

    node-neural-network is an open-source project. Anyone is welcome to contribute to its development.

    If you want to contribute, feel free to send PRs; just make sure to run the default gulp task before submitting. This way you'll run all the test specs and build the web distribution files.

    <3
