I'm particularly interested in the onaudioprocess of the ScriptProcessorNode (until recently called JavaScriptNode). It is an event listener which is called periodically for audio processing. Does it run in a separate thread?
I'd want to feed the data to a circular buffer and process it outside of this callback so I don't hog the CPU. I can use web workers for the async processing, but AFAIK I'd need a different implementation of the ring buffer in the case of different threads.
Is there a way to test this?
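For illustration, here is the kind of single-threaded circular buffer I have in mind (a rough sketch; the names are my own, and a cross-thread version would need a different design, e.g. one built on SharedArrayBuffer with atomic indices):

```javascript
// Minimal single-threaded ring buffer sketch (illustrative only).
class RingBuffer {
  constructor(capacity) {
    this.buffer = new Float32Array(capacity);
    this.readIndex = 0;
    this.writeIndex = 0;
    this.size = 0; // number of samples currently stored
  }

  // Write as many samples as fit; returns the number actually written.
  write(samples) {
    let written = 0;
    for (const s of samples) {
      if (this.size === this.buffer.length) break; // full, drop the rest
      this.buffer[this.writeIndex] = s;
      this.writeIndex = (this.writeIndex + 1) % this.buffer.length;
      this.size++;
      written++;
    }
    return written;
  }

  // Read up to n samples into a new Float32Array.
  read(n) {
    const count = Math.min(n, this.size);
    const out = new Float32Array(count);
    for (let i = 0; i < count; i++) {
      out[i] = this.buffer[this.readIndex];
      this.readIndex = (this.readIndex + 1) % this.buffer.length;
      this.size--;
    }
    return out;
  }
}
```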
edited Nov 9, 2012 at 9:04; asked Nov 8, 2012 at 11:40 by Davorin. 1 comment: are you also streaming the audio using web sockets? How did you get on? On a side note, it would be nice to launch Web Audio from inside a Web Worker thread! – Scott Stensland, Aug 26, 2014 at 22:52
4 Answers
All JavaScript is single-threaded and synchronously executed. Any asynchronous work is done via events, which add their handlers to the task queue, to be executed when the current task is finished.
To use separate threads, you would need an environment like Web Workers: every thread has its own execution context (global scope) and task queue, and communication between them is done via events.
As the onaudioprocess handler seems to live in the same scope as the DOM, it is quite unlikely that it runs in its own thread. If you really have a computationally intensive task that makes your page unresponsive, you should use a Web Worker into which you feed the audio events:
myScriptProcessorNode.onaudioprocess = myWebWorker.postMessage;
With Bergi's solution, you're going to run into issues with the structured clone algorithm, which cannot copy the read-only properties of the AudioProcessingEvent. What you need to do is break out the parts you need from the event that are cloneable, and pass them to your worker in a different data structure, like so:
_onAudioProcess(audioProcessingEvent) {
  const {inputBuffer, outputBuffer} = audioProcessingEvent;
  // The output buffer contains the samples that will be modified and
  // eventually played, so we need to keep a reference to it.
  this._outputBuffer = outputBuffer;
  const numChannels = inputBuffer.numberOfChannels;
  // Note: Array.from passes (element, index) to the map callback,
  // so the channel index is the second argument.
  const inputChannels = Array.from({length: numChannels}, (_, i) => {
    return inputBuffer.getChannelData(i);
  });
  this._worker.postMessage({
    command: 'DO_STUFF',
    inputChannels: inputChannels,
  });
}
You're also going to need the reference to your outputBuffer in setMessageHandler, in order to copy the processed data back into it so it can eventually be played to the user.
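The worker side might look something like this (a sketch under my own naming assumptions: 'DO_STUFF' matches the message command above, and applyGain stands in for whatever processing you actually do):

```javascript
// Hypothetical worker-side counterpart (sketch; names are assumptions).
// Pure processing function, kept separate so it is easy to test.
function applyGain(channels, gain) {
  return channels.map((samples) => {
    const out = new Float32Array(samples.length);
    for (let i = 0; i < samples.length; i++) {
      out[i] = samples[i] * gain;
    }
    return out;
  });
}

// Wire it up only when running inside an actual Worker scope.
if (typeof self !== 'undefined' && typeof importScripts === 'function') {
  self.onmessage = (event) => {
    const {command, inputChannels} = event.data;
    if (command === 'DO_STUFF') {
      // Post the processed channels back; the main thread copies them
      // into the outputBuffer it kept a reference to.
      self.postMessage({
        command: 'STUFF_DONE',
        outputChannels: applyGain(inputChannels, 0.5),
      });
    }
  };
}
```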
Unfortunately, it seems like not even the Web Audio API's built-in nodes run in a separate (non-UI) thread, as there is heavy crackling etc. during heavy UI (DOM, CSS, etc.) updates.
JavaScript called from a ScriptProcessorNode's event listener runs on the main thread. Use Audio Worklets to run in a separate thread.
As of 2024, ScriptProcessorNode has been deprecated, and developers are encouraged to use Audio Worklets instead.
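A minimal AudioWorkletProcessor along these lines (a sketch; the processor name is my own, and the per-block copy is pulled out into a plain function):

```javascript
// Pure per-block processing, separated out for testability:
// copy each input channel to the corresponding output channel.
function copyBlock(input, output) {
  for (let ch = 0; ch < output.length; ch++) {
    const inCh = input[ch];
    const outCh = output[ch];
    if (!inCh) continue; // input may have fewer channels
    for (let i = 0; i < outCh.length; i++) {
      outCh[i] = inCh[i];
    }
  }
}

// Register the processor only inside an AudioWorkletGlobalScope.
if (typeof AudioWorkletProcessor !== 'undefined') {
  class PassthroughProcessor extends AudioWorkletProcessor {
    // Called on the rendering thread, one 128-frame block at a time
    // by default.
    process(inputs, outputs) {
      copyBlock(inputs[0], outputs[0]);
      return true; // keep the processor alive
    }
  }
  registerProcessor('passthrough-processor', PassthroughProcessor);
}
```

On the main thread you would load this file with audioContext.audioWorklet.addModule(...) and then create a node via new AudioWorkletNode(audioContext, 'passthrough-processor').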
For completeness, the Web Audio API mandates that processing be split between 3 threads:
- the "control" thread, which is your JavaScript engine's main thread
- "decoding" threads, for demuxing, codec decompression, resampling, etc.
- the "rendering" thread, for processing AudioNodes/AudioWorklets and communicating with the operating system's audio backend