Run trained deep neural networks in the browser or node.js. Currently supports serialization from trained Keras models.
Training deep neural networks on any meaningful dataset requires massive computational resources and a great deal of time. The forward-pass prediction phase, however, is relatively cheap: there is typically no backpropagation, computational graph, loss function, or optimization algorithm to worry about.
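To see why prediction is cheap, consider that a forward pass is little more than matrix-vector products followed by elementwise activations. A minimal dense-layer sketch in plain JavaScript (weight and bias values are made up for illustration, not taken from any real model):

```javascript
// Forward pass through one fully-connected layer:
// out[i] = activation(sum_j W[i][j] * input[j] + b[i])
function denseForward(weights, bias, input, activation) {
  const out = weights.map((row, i) =>
    row.reduce((sum, w, j) => sum + w * input[j], bias[i])
  );
  return out.map(activation);
}

const relu = x => Math.max(0, x);

// Illustrative 2x2 layer
const W = [[1, -1], [0.5, 0.5]];
const b = [0, -0.25];

const y = denseForward(W, b, [1, 2], relu);
// y[0] = relu(1*1 + (-1)*2 + 0)      = relu(-1)   = 0
// y[1] = relu(0.5*1 + 0.5*2 - 0.25)  = relu(1.25) = 1.25
```

Stacking such layers (plus convolutions, recurrences, and so on) is all a prediction requires, which is why it fits comfortably in a browser.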
What do you do when you have a trained deep neural network and wish to use it to power part of your client-facing web application? Traditionally, you would deploy the model on a server and call it from your web application through an API. But what if you could deploy it in the browser, alongside the rest of your webapp? Computation would be offloaded entirely to your end user!
Perhaps most users will not be able to run billion-parameter networks in their browsers quite yet, but smaller networks are certainly within the realm of possibility.
You can also run the examples on your local machine:
$ npm run examples-server
See the source code of the examples above. In particular, the CIFAR-10 example demonstrates a multi-threaded implementation using Web Workers.
In the browser:
$ npm install neocortex-js
The core steps involve:
- Instantiate neural network class
```js
let nn = new NeuralNet({
  modelFilePath: 'model.json', // relative URL in browser/webworker, absolute path in node.js
  arrayType: 'float64' // float64 or float32
});
```
- Load the model JSON file, then once loaded, feed input data into neural network
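The two steps above can be sketched as follows. Note that the real NeuralNet class comes from neocortex-js; the minimal stand-in defined here exists only so the sketch is self-contained, and its internals are assumptions rather than the library's actual code:

```javascript
// Stand-in for the neocortex-js NeuralNet class (assumed API, for illustration)
class NeuralNet {
  constructor(config) { this.config = config; }
  loadModel() { return Promise.resolve(this); } // real version fetches/reads modelFilePath
  predict(input) { return input; }              // real version runs the forward pass
}

// 1. Instantiate the neural network class
const nn = new NeuralNet({ modelFilePath: 'model.json', arrayType: 'float64' });

// 2. Load the model JSON file, then feed input data into the network
nn.loadModel().then(() => {
  const predictions = nn.predict([0.1, 0.5, 0.9]);
  console.log(predictions);
});
```

Check the library's examples for the exact method names and the expected shape of the input data.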
To build the project yourself, for both the browser (outputs to
build/neocortex.min.js) and node.js:
$ npm run build
To build just for the browser:
$ npm run build-browser
A script to serialize a trained Keras model together with its HDF5-formatted weights is located in the utils/ folder. It currently supports only sequential models with the layers listed in the API section below. Support for graph models is planned.
Functions and layers currently implemented are listed below. More forthcoming.
Advanced activation layers
- rGRULayer (gated recurrent unit or GRU)
- rLSTMLayer (long short-term memory or LSTM)
- rJZS3Layer (mutated GRUs - JZS1, JZS2, JZS3 - from Jozefowicz et al. 2015)
- embeddingLayer - maps indices to corresponding embedding vectors
- batchNormalizationLayer - see Ioffe and Szegedy 2015
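As a concrete illustration of what the embedding layer above does, here is the index-to-vector lookup in plain JavaScript (the vocabulary and vector values are invented for illustration; this is not the library's implementation):

```javascript
// Hypothetical learned embedding table: 4-word vocabulary, 3-dimensional vectors
const embeddings = [
  [0.1, 0.2, 0.3],
  [0.4, 0.5, 0.6],
  [0.7, 0.8, 0.9],
  [1.0, 1.1, 1.2]
];

// An embedding layer maps each input index to its row of the table
const indices = [2, 0, 3];
const vectors = indices.map(i => embeddings[i]);
// vectors: [[0.7, 0.8, 0.9], [0.1, 0.2, 0.3], [1.0, 1.1, 1.2]]
```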
- implement merge and graph structures from Keras
- implement additional Keras layers such as TimeDistributedDense, etc.
$ npm test
Browser testing is planned.
Thanks to @halmos for the logo.