I'm trying to find out the best way to play synchronised audio tracks through the Web Audio API. What I'm trying to achieve is to play multiple .wav files at once with as little delay between the tracks as possible.
The only way I've found to play multiple audio tracks at the same time is to create multiple sources and loop through them in a for loop. The issue with this is that there's a tiny amount of delay between iterations. The delay is usually only a couple of milliseconds, depending on the user's machine, but when I have something like 30 audio tracks that need to start at the same time and my loop has to iterate over all 30 and call source.start() on each of them, there is a noticeable delay by the time the loop starts the 30th track.
As I need the tracks to play as closely in time as possible, I was wondering whether there is another solution. For example, could the Web Audio API load in multiple sources and then use a single native event to start all those tracks simultaneously?
Here is some code that shows the issue:
const audioBuffer1 = '...'; // Some decoded audio buffer
const audioBuffer2 = '...'; // some other decoded audio buffer
const audioBuffer3 = '...'; // and another audio buffer
const arrayOfAudioBuffers = [audioBuffer1, audioBuffer2, audioBuffer3];
const context = new AudioContext();
function play(audioBuffer) {
  const source = context.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(context.destination);
  source.start();
}

for (let i = 0; i < arrayOfAudioBuffers.length; i++) {
  // every time this loops, the play function is
  // called around 2 milliseconds after the previous
  // one, causing sounds to get slightly out of sync
  play(arrayOfAudioBuffers[i]);
}
An example of an app that uses multiple track sources and manages to keep them well synchronised is Splice Beatmaker. I've explored a few libraries such as Howler and Tone, but I believe they use the same loop approach.
I'd love to hear any suggestions on how to tackle this issue.
4 Answers
To protect against fingerprinting and timing attacks, modern browsers can reduce or round timing precision under the hood. This means source.start(offset) could never be 100% accurate or reliable in your case.
What I recommend is mixing the sources down sample by sample, then playing back the final mix.
Assuming all audio sources should start at the same time and the time taken to load is flexible, the following will work:
Example:
const audioBuffer1 = '...'; // Some decoded audio buffer
const audioBuffer2 = '...'; // some other decoded audio buffer
const audioBuffer3 = '...'; // and another audio buffer
const arrayOfAudioBuffers = [audioBuffer1, audioBuffer2, audioBuffer3];
We'll need to calculate the length of the entire song by obtaining the buffer with the maximum length.
let songLength = 0;

for (let track of arrayOfAudioBuffers) {
  if (track.length > songLength) {
    songLength = track.length;
  }
}
Next, I've created a method that takes in arrayOfAudioBuffers and outputs a final mixdown.
function mixDown(bufferList, totalLength, numberOfChannels = 2) {
  // create a buffer using the totalLength and sampleRate of the first buffer node
  let finalMix = context.createBuffer(numberOfChannels, totalLength, bufferList[0].sampleRate);

  // first loop is for the buffer list
  for (let i = 0; i < bufferList.length; i++) {

    // second loop is for each channel, i.e. left and right
    for (let channel = 0; channel < numberOfChannels; channel++) {

      // here we get a reference to the final mix buffer data
      let buffer = finalMix.getChannelData(channel);
      let trackData = bufferList[i].getChannelData(channel);

      // last loop updates/sums the track buffer into the final mix buffer
      for (let j = 0; j < bufferList[i].length; j++) {
        buffer[j] += trackData[j];
      }
    }
  }
  return finalMix;
}
FYI: you can always remove one loop by hard-coding the update for each channel.
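For instance, a minimal sketch of that hard-coded variant, assuming every buffer in bufferList is stereo (the name mixDownStereo is just for illustration):
function mixDownStereo(bufferList, totalLength) {
  const finalMix = context.createBuffer(2, totalLength, bufferList[0].sampleRate);
  const left = finalMix.getChannelData(0);
  const right = finalMix.getChannelData(1);

  for (const track of bufferList) {
    // grab each track's channel data once instead of per sample
    const trackLeft = track.getChannelData(0);
    const trackRight = track.getChannelData(1);

    for (let j = 0; j < track.length; j++) {
      left[j] += trackLeft[j];
      right[j] += trackRight[j];
    }
  }
  return finalMix;
}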
Now we can use our mixDown function like so:
const mix = context.createBufferSource();
// call our function here
mix.buffer = mixDown(arrayOfAudioBuffers, songLength, 2);
mix.connect(context.destination);
// will play back the entire mixdown
mix.start();
More about Web Audio precision timing here.
Note: we could use OfflineAudioContext to accomplish the same thing, but precision is not guaranteed and it still relies on looping and calling start() on each individual source.
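For reference, a rough sketch of that OfflineAudioContext approach might look like the following (assuming stereo buffers; the name renderMix is just for illustration):
function renderMix(bufferList, totalLength) {
  // render the graph ahead of time instead of live
  const offline = new OfflineAudioContext(2, totalLength, bufferList[0].sampleRate);

  for (const buffer of bufferList) {
    const source = offline.createBufferSource();
    source.buffer = buffer;
    source.connect(offline.destination);
    source.start(0); // every source is scheduled at time 0 inside the offline graph
  }

  // resolves with a single mixed AudioBuffer
  return offline.startRendering();
}
The resulting buffer can then be assigned to a single buffer source and started, just like mix above.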
Hope this helps.
You could try applying an offset:
function play(audioBuffer, startTime) {
  const source = context.createBufferSource();
  source.buffer = audioBuffer;
  source.connect(context.destination);
  source.start(startTime);
}

const startTime = context.currentTime + 1.0; // one second in the future

for (let i = 0; i < arrayOfAudioBuffers.length; i++) {
  play(arrayOfAudioBuffers[i], startTime);
}
This code will queue up all the sounds to play at the same time, one second in the future. If this works, you can tune the delay down so the sounds play more immediately, or even calculate the right delay based on the number of tracks (e.g. 2 ms per track * 30 tracks = 60 ms delay).
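As a sketch of that last idea, you could derive the lead time from the track count rather than hard-coding one second (the 2 ms per track figure is just the estimate from the question, not a measured value):
const perTrackDelay = 0.002; // ~2 ms per start() call, as estimated in the question
const startTime = context.currentTime + arrayOfAudioBuffers.length * perTrackDelay;

for (let i = 0; i < arrayOfAudioBuffers.length; i++) {
  play(arrayOfAudioBuffers[i], startTime);
}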
You should never just call source.start(); (which means "start right now"). Instead, call source.start(sometimeinthefuture); — Web Audio has its own internal clock and it is very precise. Make sure all the tracks in that loop get the same sometimeinthefuture.
Also, you don't need to mix down a final mix as the other answer suggests; you just need to create a scheduling system, where you schedule ahead into the future on the Web Audio clock via JavaScript's setTimeout or setInterval.
This is explained in the article A Tale of Two Clocks. There's also a simpler article that explains it. Also check out Looper.js, which uses this scheduling concept to sync an audio looper (where multiple tracks need to be synchronized).
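A minimal sketch of that lookahead-scheduling idea, assuming the context and arrayOfAudioBuffers from the question (the lookahead and interval values here are arbitrary, not taken from Looper.js):
const lookahead = 0.1;   // seconds of audio to schedule ahead of the current time
const pollInterval = 25; // ms between scheduler ticks

// every track shares the same start time on the audio clock
const queue = arrayOfAudioBuffers.map(buffer => ({
  buffer,
  when: context.currentTime + 1.0,
}));

const scheduler = setInterval(() => {
  // schedule anything that falls inside the lookahead window
  while (queue.length && queue[0].when < context.currentTime + lookahead) {
    const { buffer, when } = queue.shift();
    const source = context.createBufferSource();
    source.buffer = buffer;
    source.connect(context.destination);
    source.start(when); // precise start on the Web Audio clock
  }
  if (!queue.length) clearInterval(scheduler);
}, pollInterval);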
It seems that you want to run all these processes in parallel in order to have zero delay or latency.
The problem is that JavaScript, unlike Java, has an event loop that can only manage a single thread (although it has newer functions designed to run promises and async-like functions).
- I have done a little research and found this: Parallel.js, easy multi-core processing with JavaScript.
As they say:
Parallel.js solves that problem by giving you high level access to multi-core processing using web workers. It runs on node and in your browser.
Normally the web browser uses only one web worker per open tab, but there are some libraries that extend this so you can run multiple threads and parallel processes in your web app. I have never tried them, but here are some articles so you can start your research:
- Mozilla.org - The Path to Parallel JavaScript
- Itnext.io - Parallel programming in JavaScript using Web Workers
- JonResig.com - Computing with JavaScript Web Workers
I'm a newbie programmer and maybe this answer isn't very good, but I hope it's helpful. Sorry about my English :)