
javascript - Playing MediaStream using AudioContext.createMediaStreamSource vs HTMLAudioElement.srcObject - Stack Overflow


I'm trying to play a MediaStream from a remote peer (WebRTC) using the Web Audio API. When I attach the stream to an audio element using audio.srcObject = stream it plays fine, but when I try to use an AudioContext it does not play any sound at all (I need to avoid the audio/video HTML tag).

This piece works:

<audio controls></audio>
<script>
   const audioEl = document.getElementsByTagName('audio')[0];
   audioEl.srcObject = MY_STREAM;
   audioEl.play();
</script>

This one does not:

const audioContext = new AudioContext();
const sourceNode = audioContext.createMediaStreamSource(MY_STREAM);
sourceNode.connect(audioContext.destination);
// Trying even 'audioContext.resume()' after user gesture with no luck

What is weird is that when MY_STREAM is my microphone, it plays nicely through the Web Audio API (I hear the feedback from my mic).

So this would suggest that there is something different between the microphone MediaStream and the one I get from the WebRTC connection. But then why does it play through a simple HTML audio tag?

asked Feb 19, 2019 by Adam Szmyd (edited Feb 19, 2019)
  • Most probably, you are not creating your AudioContext from a user-trusted event. You need to resume() it in Chrome (the rules are somewhat stricter for AudioContext than for MediaElements). Also, why can't you use a MediaElement? – Kaiido Commented Feb 19, 2019 at 8:31
  • I've tried calling .resume() after a click but it didn't help. FWIW, the AudioContext.state property is "running" the whole time (developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/state) – Adam Szmyd Commented Feb 19, 2019 at 10:40
  • 1 And you can reproduce on different browsers? – Kaiido Commented Feb 19, 2019 at 10:44
  • Firefox and Chrome behave the same – Adam Szmyd Commented Feb 19, 2019 at 11:23
  • 1 Looks like a Chrome bug to me. Wfm in Firefox. – jib Commented Feb 19, 2019 at 14:29

1 Answer

As demonstrated by @jib, this is a Chrome bug.
I have opened a new issue to let them know about it.

I thought I found a workaround by simply assigning this MediaStream to the srcObject of a dummy HTMLAudioElement,

new Audio().srcObject = mediaStream;

but somehow, while testing on my localhost, the audio didn't keep playing over time, while in this fiddle it does.

I also encountered a lot of other weird behavior while playing around, like different tabs affecting each other, and things like that.

Add to that other unfixed bugs in the same area that make me suspect false positives, and all in all, I fear there is no proper solution other than waiting for them to fix it...
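For readers who want to try the dummy-element workaround in a more durable form: one plausible explanation for it "not persisting" is that the unreferenced Audio element gets garbage-collected, so keeping a reference to it (and muting it, so only the Web Audio graph is audible) may help. This is a hedged sketch, not a confirmed fix; it assumes `mediaStream` is the remote WebRTC MediaStream and that the surrounding page provides a user gesture to satisfy the autoplay policy:

```javascript
// Workaround sketch (assumption: `mediaStream` is the remote WebRTC stream,
// e.g. obtained from an RTCPeerConnection 'track' event).
// Attach the stream to a muted dummy element and keep a reference to it so
// the browser keeps consuming the stream; audible output goes through the
// AudioContext graph instead.
const dummyAudio = new Audio();
dummyAudio.muted = true;              // silent; Web Audio handles playback
dummyAudio.srcObject = mediaStream;

const audioContext = new AudioContext();
const sourceNode = audioContext.createMediaStreamSource(mediaStream);
const gainNode = audioContext.createGain();   // e.g. for volume control
sourceNode.connect(gainNode);
gainNode.connect(audioContext.destination);

// Resume the context from a user gesture (autoplay policy).
document.addEventListener('click', () => audioContext.resume(), { once: true });

// Keep `dummyAudio` reachable (here via a hypothetical global) so it
// isn't garbage-collected while the stream is playing.
window._keepAliveDummyAudio = dummyAudio;
```

Whether this fully sidesteps the Chrome bug is untested here; it simply combines the answer's dummy-element trick with the question's AudioContext graph while addressing the garbage-collection and autoplay-policy pitfalls.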
