
javascript - Chrome: to play a video that is being downloaded via fetch/XHR - Stack Overflow


What I'm trying to achieve is to make Chrome load a video file as data (via the Fetch API, XHR, whatever) and to play it using <video> while it's still being downloaded, without issuing two separate requests for the same URL and without waiting until the file is completely downloaded.

It's easy to get a ReadableStream from the Fetch API (response.body), yet I can't find a way to feed it into the video element. I've figured out I need a blob URL for this, which can be created using a MediaSource object. However, the SourceBuffer#appendStream method, which sounds like just what is needed, isn't implemented in Chrome, so I can't connect the stream directly to the MediaSource object.

I can probably read the stream in chunks, create Uint8Arrays out of them, and use SourceBuffer#appendBuffer, but this means playback won't start immediately unless the chunk size is really small. Also, it feels like manually doing something that all these APIs should be able to do out of the box. If there are no other solutions and I go this way, what caveats should I expect?
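For reference, the chunked approach described above can be sketched as a small helper. This is a minimal sketch, not a definitive implementation: `pumpToSourceBuffer` is a hypothetical name, `reader` is assumed to come from `response.body.getReader()`, and `sourceBuffer` from `MediaSource#addSourceBuffer`.

```javascript
// Pump every chunk from a ReadableStream reader into a SourceBuffer-like
// sink, waiting for each append to complete before starting the next one.
async function pumpToSourceBuffer(reader, sourceBuffer) {
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    await new Promise((resolve) => {
      // 'updateend' fires once the SourceBuffer has finished the append
      sourceBuffer.addEventListener('updateend', resolve, { once: true });
      sourceBuffer.appendBuffer(value);
    });
  }
}
```

The serialization matters: calling `appendBuffer` while a previous append is still in progress (`sourceBuffer.updating === true`) throws an InvalidStateError, so each chunk waits for the previous `updateend`.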

Are there perhaps other ways to create a blob URL for a ReadableStream? Or is there a way to make fetch and <video> share a request? There are so many new APIs that I could easily miss something.


Asked Sep 1, 2016 at 15:24 by thorn0 (10.5k reputation); edited Sep 5, 2016 at 10:55
  • See HTML5 audio streaming: precisely measure latency?, Synchronize video and audio (preferably without JavaScript) – guest271314 Commented Dec 19, 2016 at 9:54

1 Answer


After hours of experimenting, I found a half-working solution:

// Create a MediaSource and point the media element at it via a blob URL
const video = document.getElementById('audio');
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  const sourceBuffer = mediaSource.addSourceBuffer('audio/webm; codecs="opus"');

  const response = await fetch(audioSRC);
  const reader = response.body.getReader();

  // Read the response stream chunk by chunk, waiting for each append
  // to finish ('updateend') before appending the next chunk
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;

    await new Promise((resolve) => {
      sourceBuffer.addEventListener('updateend', resolve, { once: true });
      sourceBuffer.appendBuffer(value);
    });
  }

  // Signal that the whole stream has been appended
  mediaSource.endOfStream();
});

It works with the MediaSource API (https://developer.mozilla.org/en-US/docs/Web/API/MediaSource).

Also, I tested this only with the webm/opus format, but I believe it should work with other formats as well, as long as you specify the right MIME type and codecs string.
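One caveat worth noting for the question's "what caveats should I expect?": appendBuffer throws a QuotaExceededError once the SourceBuffer's internal buffer fills up, so for long media you may need to evict already-played data with SourceBuffer#remove. A minimal sketch, assuming you track the playhead yourself; the helper name, the `currentTime` parameter, and the 10-second eviction window are illustrative assumptions, not part of the answer above:

```javascript
// Try to append a chunk; on QuotaExceededError, evict data that has
// already been played, then retry the append once the removal finishes.
async function appendWithEviction(sourceBuffer, chunk, currentTime) {
  try {
    sourceBuffer.appendBuffer(chunk);
  } catch (err) {
    if (err.name !== 'QuotaExceededError') throw err;
    // Remove everything more than 10 seconds behind the playhead
    // (the window size is an arbitrary choice for this sketch)
    await new Promise((resolve) => {
      sourceBuffer.addEventListener('updateend', resolve, { once: true });
      sourceBuffer.remove(0, Math.max(0, currentTime - 10));
    });
    sourceBuffer.appendBuffer(chunk); // retry the append
  }
}
```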
