# webgpu-waveform

Render waveforms to `<canvas />` using WebGPU.

Visit https://aykev.dev/webgpu-waveform/ for examples.
This package is distributed for use with both ESM and UMD, and includes TypeScript definition files. Install it from the npm registry:

```bash
npm i webgpu-waveform
```
For usage with React, check out the `webgpu-waveform-react` package.
The class `GPUWaveformRenderer` is initialized using the static method `.create(...)`, which has the following definition:

```typescript
static async create(
  canvas: HTMLCanvasElement,
  channelData: Float32Array
): Promise<GPUWaveformRenderer>
```
It takes the following arguments:

- `canvas: HTMLCanvasElement` — the canvas element to render to
- `channelData: Float32Array` — the array of PCM samples to render
Example:

```javascript
async function example(canvas, audioBuffer) {
  // Use the PCM samples from the buffer's first channel
  const channelData = audioBuffer.getChannelData(0);
  const renderer = await GPUWaveformRenderer.create(canvas, channelData);
  // Draw the waveform across the full canvas
  renderer?.render(800, 0, canvas.width, canvas.height);
}
```
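The first argument passed to `render(...)` in the example above (`800`) controls the horizontal zoom. If it represents samples per pixel — an assumption; consult the package docs for the exact meaning — a small helper like the following sketch can pick a value so the entire buffer spans the canvas:

```javascript
// Hypothetical helper (not part of webgpu-waveform): choose a
// samples-per-pixel value so the whole buffer fits the canvas width.
// Assumes render()'s first argument means "samples per pixel".
function samplesPerPixel(channelData, canvasWidth) {
  return Math.max(1, Math.ceil(channelData.length / canvasWidth));
}

// e.g. one second of 44.1 kHz audio on an 800px-wide canvas
const spp = samplesPerPixel(new Float32Array(44100), 800);
console.log(spp); // → 56
```

You could then call `renderer?.render(spp, 0, canvas.width, canvas.height)` to show the full waveform, and pass smaller values to zoom in.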