What I'm trying to achieve is to make Chrome load a video file as data (via the Fetch API, XHR, whatever) and to play it using <video> while it's still being downloaded, without issuing two separate requests for the same URL and without waiting until the file is completely downloaded.
It's easy to get a ReadableStream from the Fetch API (response.body), yet I can't find a way to feed it into the video element. I've figured out that I need a blob URL for this, which can be created using a MediaSource object. However, the SourceBuffer#appendStream method, which sounds like just what is needed, isn't implemented in Chrome, so I can't connect the stream directly to the MediaSource object.
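For reference, this is roughly the wiring I mean (a sketch; the MIME string is just a placeholder and would have to match the actual file):

const mediaSource = new MediaSource();
const video = document.querySelector('video');
// A blob URL backed by the MediaSource, usable as the element's src.
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', () => {
  // Placeholder MIME type; it has to match the real container/codecs.
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp9"');
  // This is where the fetched stream would need to be connected somehow.
});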
I can probably read the stream in chunks, create Uint8Arrays out of them, and use SourceBuffer#appendBuffer, but that means playback won't start immediately unless the chunk size is really small. It also feels like manually doing something that all these APIs should be able to do out of the box. If there is no other solution and I go this way, what caveats should I expect?
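For completeness, a sketch of how one could feature-detect appendStream and fall back to chunked appends when it's absent (current Chrome would take the fallback path):

const supportsAppendStream =
  'SourceBuffer' in window && 'appendStream' in SourceBuffer.prototype;
if (!supportsAppendStream) {
  // Read response.body with a reader and pass each Uint8Array chunk
  // to sourceBuffer.appendBuffer(), as described above.
}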
Are there perhaps other ways to create a blob URL for a ReadableStream? Or is there a way to make fetch and <video> share a request? There are so many new APIs that I could easily have missed something.
1 Answer
After hours of experimenting, I found a half-working solution:
const video = document.getElementById('audio');
const mediaSource = new MediaSource();
video.src = window.URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  const sourceBuffer = mediaSource.addSourceBuffer('audio/webm; codecs="opus"');
  // audioSRC is the URL of the media file, defined elsewhere.
  const response = await fetch(audioSRC);
  const reader = response.body.getReader();

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    // appendBuffer() is asynchronous: wait for 'updateend' before
    // appending the next chunk, or the next call will throw.
    await new Promise((resolve) => {
      sourceBuffer.addEventListener('updateend', resolve, { once: true });
      sourceBuffer.appendBuffer(value);
    });
  }

  // Tell the MediaSource that no more data is coming.
  mediaSource.endOfStream();
});
It works (see https://developer.mozilla.org/en-US/docs/Web/API/MediaSource).
Also, I tested this only with the webm/opus format, but I believe it should work with other formats as well, as long as you specify the correct MIME type and codecs.
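If you want to check a format before using it, MediaSource.isTypeSupported() accepts the same MIME string that addSourceBuffer() takes; a small sketch (the type strings below are just examples):

// Probe codec support before calling addSourceBuffer().
const candidates = [
  'audio/webm; codecs="opus"',
  'video/webm; codecs="vp9, opus"',
  'video/mp4; codecs="avc1.42E01E, mp4a.40.2"',
];
for (const type of candidates) {
  console.log(type, MediaSource.isTypeSupported(type));
}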