Pass your audio context and your DSP function to the constructor. It returns a Web Audio ScriptProcessorNode, which can be connected to any other Web Audio node in the graph.
Simple use, generating a tone:

```javascript
var jsynth = require('jsynth');

var master = new (window.AudioContext || window.webkitAudioContext)();
var tau = Math.PI * 2;
var frequency = 555;

var sineGenerator = function (time, index, input) {
  return Math.sin(time * tau * frequency);
};

var synth = jsynth(master, sineGenerator); // returns a web audio node
synth.connect(master.destination);
```
Your function will be called with the following arguments:
- time, in seconds (float)
- sample index (integer)
- input sample (float), mono; this will be zero if there is no input. Use this if you are connecting other Web Audio API nodes to this one. In the near future this will be an array, for multiple input channels.
Your function should return a float in the range [-1, 1]. See the examples below.
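Because the DSP function receives the input sample, it can process audio from an upstream node instead of only generating it. A minimal sketch of such an effect; the `ringMod` name and the 30 Hz rate are illustrative choices, not part of the jsynth API:

```javascript
// A DSP function that processes its input instead of generating a tone:
// ring-modulate the incoming (mono) sample with a 30 Hz sine.
var tau = Math.PI * 2;

function ringMod(time, index, input) {
  // input is 0 unless another node is connected into this one
  return input * Math.sin(time * tau * 30);
}

// Used the same way as a generator:
// var effect = jsynth(master, ringMod);
// someSource.connect(effect);
// effect.connect(master.destination);
```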
```
npm install jsynth
```
On Mobile Safari (iOS), you can only initiate Web Audio API sounds from within a user event context, such as a click.
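A common way to satisfy that restriction is to create the audio context lazily, inside the click handler itself. A sketch of that pattern; `makeLazyContext` and `createContext` are illustrative names, not jsynth API:

```javascript
// Defer audio-context creation until the first user gesture.
// createContext is a factory (e.g. function () { return new AudioContext(); })
// passed in so the lazy-init logic stays testable outside a browser.
function makeLazyContext(createContext) {
  var context = null;
  return function getContext() {
    if (!context) context = createContext(); // first call creates it
    return context; // later calls reuse the same context
  };
}

// In the browser, call getContext() from inside the click:
// var getContext = makeLazyContext(function () { return new AudioContext(); });
// document.addEventListener('click', function () {
//   var synth = jsynth(getContext(), sineGenerator);
//   synth.connect(getContext().destination);
// });
```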
To run the example, use opa:

```
npm install -g watchify opa
```

(watchify is browserify plus a file watcher.) Then:
```
git clone firstname.lastname@example.org:NHQ/jsynth.git
cd jsynth
opa -n -e example.js
```
Open your browser to http://localhost:11001.