Core native node.js audio functionality, including sound card access and audio streaming
Right now, it's basically a node.js binding for PortAudio.
```
npm install node-core-audio
```
I am actively working on this, but if you want to see it happen faster, please send me an email!
Below is the most basic use of the audio engine. We create a new instance of node-core-audio, and then give it our processing function. The audio engine will call the audio callback whenever it needs an output buffer to send to the sound card.
```javascript
// Create a new instance of node-core-audio
var coreAudio = require("node-core-audio");

// Create a new audio engine
var engine = coreAudio.createNewAudioEngine();

// Add an audio processing callback
// This function accepts an input buffer coming from the sound card,
// and returns an output buffer to be sent to your speakers.
//
// Note: This function must return an output buffer
function processAudio( inputBuffer ) {
    console.log( inputBuffer.length + " channels" );
    console.log( "Channel 0 has " + inputBuffer[0].length + " samples" );
    return inputBuffer;
}

engine.addAudioCallback( processAudio );
```
```javascript
// Alternatively, you can read/write samples to the sound card manually
var engine = coreAudio.createNewAudioEngine();

// Grab a buffer
var buffer = engine.read();

// Silence the 0th channel
for( var iSample = 0; iSample < buffer[0].length; ++iSample )
    buffer[0][iSample] = 0.0;

// Send the buffer back to the sound card
engine.write( buffer );
```
When you are writing code inside your audio callback, you are operating on the processing thread of the application. In this high-priority environment, performance matters at every step. Allocations and other complex operations are possible, but dangerous.
IF YOU TAKE TOO LONG TO RETURN A BUFFER TO THE SOUND CARD, YOU WILL HAVE AUDIO DROPOUTS
The basic principle is that you should have everything ready to go before you enter the processing function. Buffers, objects, and functions should be created in a constructor or static function outside of the audio callback whenever possible. The examples in this readme are not necessarily good practice as far as performance is concerned.
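Concretely, the preallocation principle might look like the sketch below. The buffer sizes and the `applyGain()` helper are illustrative, not part of the library; the point is that all allocation happens once, up front, and the callback itself only does arithmetic on existing buffers.

```javascript
// Assumed sizes for this sketch; in practice, match your engine's options
var NUM_CHANNELS = 2;
var FRAMES_PER_BUFFER = 1024;

// Allocate scratch space once, outside the audio callback
var scratch = [];
for (var ch = 0; ch < NUM_CHANNELS; ++ch) {
    scratch.push(new Float32Array(FRAMES_PER_BUFFER));
}

// Apply a gain per channel, reusing the preallocated scratch buffers
function applyGain(inputBuffer, gain) {
    for (var ch = 0; ch < inputBuffer.length; ++ch) {
        for (var i = 0; i < inputBuffer[ch].length; ++i) {
            scratch[ch][i] = inputBuffer[ch][i] * gain;
        }
    }
    return scratch;
}

function processAudio(inputBuffer) {
    // No allocations here: just arithmetic on buffers that already exist
    return applyGain(inputBuffer, 0.5);
}
```

Because `scratch` is created before the engine starts, the callback never triggers garbage collection on the processing thread.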
The callback is only called once all buffers have been processed by the sound card.
First things first
```javascript
var coreAudio = require("node-core-audio");
```
Create an audio processing function
```javascript
// Just print the value of the first sample on the left channel
function processAudio( inputBuffer ) {
    console.log( inputBuffer[0][0] );
    return inputBuffer;
}
```
Initialize the audio engine and set up the processing loop
```javascript
var engine = coreAudio.createNewAudioEngine();
engine.addAudioCallback( processAudio );
```
```javascript
// Returns whether the audio engine is active
bool engine.isActive();

// Updates the parameters and restarts the engine.
// All keys from getOptions() are available.
engine.setOptions( { inputChannels: 2 } );

// Returns all parameters
array engine.getOptions();

// Reads a buffer from the sound card input and returns it as an array.
// Note: this is a blocking call, don't take too long!
array engine.read();

// Writes the buffer to the sound card output. Returns false on underflow.
// Note: blocking I/O
bool engine.write( array input );

// Returns the name of a given device
string engine.getDeviceName( int inputDeviceIndex );

// Returns the total number of audio devices
int engine.getNumDevices();
```
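As a usage sketch for the API above, the snippet below lists the available audio devices and switches the engine to two input channels. The live engine calls are only attempted when node-core-audio is actually installed, and `formatDeviceList()` is a plain helper added here for illustration.

```javascript
var engine = null;
try {
    var coreAudio = require("node-core-audio");
    engine = coreAudio.createNewAudioEngine();
} catch (err) {
    // Module not installed (or no sound card); skip the live calls below
}

// Turn a list of device names into numbered lines, e.g. "0: Built-in Output"
function formatDeviceList(names) {
    return names.map(function (name, index) {
        return index + ": " + name;
    }).join("\n");
}

if (engine) {
    // Enumerate every device the sound card API reports
    var names = [];
    for (var i = 0; i < engine.getNumDevices(); ++i) {
        names.push(engine.getDeviceName(i));
    }
    console.log(formatDeviceList(names));

    // Restart the engine with two input channels
    engine.setOptions({ inputChannels: 2 });
}
```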
MIT - See LICENSE file.
Copyright Mike Vegeto, 2013