# adnn
adnn provides JavaScript-native neural networks on top of general scalar/tensor reverse-mode automatic differentiation. You can use just the AD code, or the NN layer built on top of it. This architecture makes it easy to define large, complex numerical computations and to compute derivatives with respect to their inputs/parameters. adnn also includes utilities for optimizing/training the parameters of such computations.
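As background, the core mechanism behind reverse-mode AD can be sketched in a few lines of plain JavaScript. This is a toy illustration of the idea — record how each value was computed, then walk backwards propagating gradients — not adnn's actual implementation (the `lift`, `add`, and `mul` helpers below are hypothetical, and a real implementation traverses the graph in topological order rather than recursively):

```javascript
// Toy reverse-mode AD: each node stores its value, an accumulated
// gradient, and a function that pushes its gradient to its inputs.
function lift(value) {
  return { value: value, grad: 0, backward: function () {} };
}

function add(a, b) {
  var out = lift(a.value + b.value);
  out.backward = function () {
    a.grad += out.grad;   // d(a+b)/da = 1
    b.grad += out.grad;   // d(a+b)/db = 1
    a.backward();
    b.backward();
  };
  return out;
}

function mul(a, b) {
  var out = lift(a.value * b.value);
  out.backward = function () {
    a.grad += b.value * out.grad;   // d(a*b)/da = b
    b.grad += a.value * out.grad;   // d(a*b)/db = a
    a.backward();
    b.backward();
  };
  return out;
}

// f(x, y) = x*y + x, so df/dx = y + 1 and df/dy = x
var x = lift(3);
var y = lift(4);
var out = add(mul(x, y), x);
out.grad = 1;     // seed the output gradient
out.backward();
console.log(out.value);  // 15
console.log(x.grad);     // 5  (y + 1)
console.log(y.grad);     // 3  (x)
```

adnn applies the same principle, but with a full scalar/tensor operator library and an efficient graph traversal.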
## Examples
### Scalar code
The simplest use case for adnn:
```js
var ad = require('adnn/ad');

function dist(x1, y1, x2, y2) {
  var xdiff = ad.scalar.sub(x1, x2);
  var ydiff = ad.scalar.sub(y1, y2);
  return ad.scalar.sqrt(ad.scalar.add(
    ad.scalar.mul(xdiff, xdiff),
    ad.scalar.mul(ydiff, ydiff)
  ));
}

// Can use normal scalar inputs
var out = dist(0, 1, 1, 4);
console.log(out);   // 3.162...

// Use 'lifted' inputs to track derivatives
var x1 = ad.lift(0);
var y1 = ad.lift(1);
var x2 = ad.lift(1);
var y2 = ad.lift(4);
var out = dist(x1, y1, x2, y2);
console.log(ad.value(out));   // still 3.162...

out.backprop();   // Compute derivatives of inputs
console.log(ad.derivative(x1));   // -0.316...
```
### Tensor code
adnn also supports computations involving tensors, or a mixture of scalars and tensors:
```js
var ad = require('adnn/ad');
var Tensor = require('adnn/tensor');

function dot(vec) {
  var sq = ad.tensor.mul(vec, vec);
  return ad.tensor.sumreduce(sq);
}

function dist(vec1, vec2) {
  return ad.scalar.sqrt(dot(ad.tensor.sub(vec1, vec2)));
}

var vec1 = ad.lift(new Tensor([3]).fromFlatArray([0, 1, 1]));
var vec2 = ad.lift(new Tensor([3]).fromFlatArray([2, 0, 3]));
var out = dist(vec1, vec2);
console.log(ad.value(out));   // 3

out.backprop();
console.log(ad.derivative(vec1).toFlatArray());   // [-0.66, 0.33, -0.66]
```
### Simple neural network
adnn makes it easy to define simple, feedforward neural networks. Here's a basic multilayer perceptron that takes a feature vector as input and outputs class probabilities:
```js
var Tensor = require('adnn/tensor');
var ad = require('adnn/ad');
var nn = require('adnn/nn');
var opt = require('adnn/opt');

var nInputs = 20;
var nHidden = 10;
var nClasses = 5;

// Definition using basic layers
var net = nn.sequence([
  nn.linear(nInputs, nHidden),
  nn.tanh,
  nn.linear(nHidden, nClasses),
  nn.softmax
]);

// Alternate definition using 'nn.mlp' utility
net = nn.mlp(nInputs, [
  {nOut: nHidden, activation: nn.tanh},
  {nOut: nClasses, activation: nn.softmax}
]);

// Train the parameters of the network from some dataset
// 'loadData' is a stand-in for a user-provided function that
// loads in an array of {input: , output: } objects
// Here, 'input' is a feature vector, and 'output' is a class label
var trainingData = loadData(...);
opt.nnTrain(net, trainingData, opt.classificationLoss,
  {batchSize: 10, iterations: 100, method: opt.adagrad()});

// Predict class probabilities for new, unseen features
var features = new Tensor([nInputs]).fillRandom();
var classProbs = net.eval(features);
```
### Convolutional neural network
adnn includes the building blocks necessary to create convolutional networks. Here is a simple example, adapted from a ConvNetJS example:
```js
var nn = require('adnn/nn');

var net = nn.sequence([
  // Assumes 32x32x3 input images
  nn.convolution({inDepth: 3, outDepth: 16, filterSize: 5}),
  nn.relu,
  nn.maxpool({filterSize: 2}),
  nn.convolution({inDepth: 16, outDepth: 20, filterSize: 5}),
  nn.relu,
  nn.maxpool({filterSize: 2}),
  nn.convolution({inDepth: 20, outDepth: 20, filterSize: 5}),
  nn.relu,
  nn.maxpool({filterSize: 2}),
  nn.linear(4*4*20, 10),
  nn.softmax
]);
```
### Recurrent neural network
adnn is also flexible enough to support recurrent neural networks. Here's an example of a rudimentary RNN:
```js
var ad = require('adnn/ad');
var nn = require('adnn/nn');

var inputSize = 10;
var outputSize = 5;
var stateSize = 20;

// Component neural networks used by the RNN
var inputNet = nn.linear(inputSize, stateSize);
var stateNet = nn.linear(stateSize, stateSize);
var outputNet = nn.linear(stateSize, outputSize);
var initialStateNet = nn.constantparams([stateSize]);

function processSequence(seq) {
  // Initialize hidden state
  var state = initialStateNet.eval();
  // Process input sequence in order
  var outputs = [];
  for (var i = 0; i < seq.length; i++) {
    // Update hidden state
    state = ad.tensor.tanh(ad.tensor.add(
      inputNet.eval(seq[i]),
      stateNet.eval(state)
    ));
    // Generate output
    outputs.push(outputNet.eval(state));
  }
  return outputs;
}
```
## The `ad` module

The `ad` module has its own documentation here.
## The `nn` module

The `nn` module has its own documentation here.
## The `opt` module

The `opt` module has its own documentation here.
## Tensors
adnn provides a `Tensor` type for representing multidimensional arrays of numbers, along with various operations on them. This is the core datatype underlying neural net computations.
```js
var Tensor = require('adnn/tensor');

// Create a rank-1 tensor (i.e. a vector)
var vec = new Tensor([3]);   // vec is a 3-D vector
// Fill vec with the contents of an array
vec.fromArray([1, 2, 3]);
// Return the contents of vec as an array
vec.toArray();   // returns [1, 2, 3]
// Fill vec with a given value
vec.fill(1);   // vec is now [1, 1, 1]
// Fill vec with random values
vec.fillRandom();
// Create a copy of vec
var dupvec = vec.clone();

// Create a rank-2 tensor (i.e. a matrix)
var mat = new Tensor([2, 2]);   // mat is a 2x2 matrix
// Fill mat with the contents of an array
mat.fromArray([[1, 2], [3, 4]]);
// Can also use a flattened array
mat.fromFlatArray([1, 2, 3, 4]);
// Retrieve an individual element of mat
var elem = mat.get([0, 1]);   // elem = 2
// Set an individual element of mat
mat.set([0, 1], 5);   // mat is now [[1, 5], [3, 4]]
```
The `Tensor` type also provides a large number of mathematical functions: unary operators, binary operators, reductions, matrix operations, etc. See tensor.js for a complete listing.
## Projects using adnn
If you use adnn for anything, let us know and we'll list it here! Send email to daniel.c.ritchie@gmail.com
- Neurally-Guided Procedural Models: Learning to Guide Procedural Models with Deep Neural Networks
- WebPPL uses adnn as part of its variational inference implementation.