
javascript - UDP stream into webRTC - Stack Overflow


I am just starting to learn web/network programming (hence not sure what information is relevant), but my goal is to play a video stream acquired by computer X on a webpage hosted on computer Y, as close to real time as possible. I currently have an awkward solution that is just about OK for my needs, whereby the video acquired on computer X is sent to computer Y through a UDP socket. This data is then piped (via ffmpeg) into a 'fake webcam' created using v4l2loopback, which is then read using getUserMedia(). Doing this I get a somewhat choppy video that lags by less than 2 seconds.

My question, simply, is whether it is possible to read the UDP stream directly into the WebRTC framework (somehow, and if so how), rather than going awkwardly through the fake webcam.
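For reference, a minimal sketch of the browser side of that workaround, reading the v4l2loopback device as an ordinary webcam. The 'Dummy' label check is an assumption (v4l2loopback devices usually appear as "Dummy video device", and labels may be empty until camera permission has been granted, hence the fallback):

```javascript
// Current workaround: attach the v4l2loopback "fake webcam" to a <video> element.
async function playFakeWebcam() {
  const devices = await navigator.mediaDevices.enumerateDevices();
  const fakeCam = devices.find(d => d.kind === 'videoinput' && d.label.includes('Dummy'));
  const stream = await navigator.mediaDevices.getUserMedia({
    video: fakeCam ? { deviceId: { exact: fakeCam.deviceId } } : true, // fall back to the default camera
    audio: false
  });
  document.querySelector('video').srcObject = stream;
}
playFakeWebcam();
```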

asked May 16, 2018 at 14:22 by user3235916 (edited May 16, 2018 at 14:53)
  • Do you have control over computer Y and the software that produces the stream? If yes, why go the extra route over UDP and not send it from X to Y via WebRTC in the first place? – schroffl Commented May 16, 2018 at 14:57
  • Hi, thanks for your comment. Unfortunately, I don't really have control at that end. – user3235916 Commented May 16, 2018 at 15:11
  • You could build a UDP server which receives the data from computer X and then, via WebRTC, pipes it to your desired peer (computer Y in your example) – schroffl Commented May 16, 2018 at 15:35

3 Answers

You can't do that directly with WebRTC, since it doesn't expose a raw socket, but your options are:

Convert it to an HLS live stream

  • Convert the UDP stream (I assume RTMP? What is ffmpeg outputting?) to an HLS stream on your server via nginx-hls-rtmp.
  • Use hls.js to play said video (see the sketch below).
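A minimal playback sketch for this HLS option, assuming the server already exposes an HLS playlist (the URL below is a placeholder for whatever path your nginx setup serves):

```javascript
// Play the HLS stream produced on the server. hls.js uses Media Source Extensions;
// Safari can play HLS natively, hence the fallback.
import Hls from 'hls.js';

const video = document.querySelector('video');
const src = '/hls/stream.m3u8'; // placeholder -- use the playlist path your server exposes

if (Hls.isSupported()) {
  const hls = new Hls();
  hls.loadSource(src);
  hls.attachMedia(video);
  hls.on(Hls.Events.MANIFEST_PARSED, () => video.play());
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
  video.src = src; // native HLS playback (Safari)
  video.addEventListener('loadedmetadata', () => video.play());
}
```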

Convert it to a DataChannel and send it over SCTP

  • Set up a media-streaming WebRTC server, for example with wrtc (node-webrtc).
  • Connect to the web client via WebRTC (for example, simple-peer can help).
  • Set up a DataChannel, which runs over SCTP (implemented with usrsctp over UDP in Chrome, Safari and Firefox).
  • Send the data on the data channel and decode it there (a forwarding sketch follows this list).
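A rough sketch of the server-side forwarding for this option, assuming the wrtc and simple-peer npm packages; the UDP port is a placeholder and the signalling channel (how the 'signal' payloads reach the browser) is left to you:

```javascript
// Node.js: receive the UDP stream from computer X and forward each datagram
// over a WebRTC DataChannel (SCTP).
const dgram = require('dgram');
const wrtc = require('wrtc');        // node-webrtc
const Peer = require('simple-peer');

const UDP_PORT = 5000;               // placeholder: the port computer X streams to

const peer = new Peer({ initiator: true, wrtc });
peer.on('signal', data => {
  // send `data` to the browser over your own signalling channel (e.g. a WebSocket),
  // and call peer.signal(answer) with whatever the browser sends back
});

const socket = dgram.createSocket('udp4');
socket.on('message', chunk => {
  if (peer.connected) peer.send(chunk); // forward the raw datagram payload
});
socket.bind(UDP_PORT);
```

On the browser side you would create a simple-peer instance with initiator: false, complete the signalling, and then decode the received chunks yourself (for example by feeding a fragmented MP4 stream into Media Source Extensions).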

Connect directly from X to Y via WebRTC

  • You can also set up a WebRTC server and connect ffmpeg to it.
  • Open a media channel between the WebRTC 'server' peer and the client.
  • Stream the video (see the browser-side sketch after this list).
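For this third option the browser side is a standard RTCPeerConnection that receives a remote track; only the media part is sketched here, and the signalling with your 'server' peer is assumed to exist:

```javascript
// Browser: receive the video track sent by the WebRTC "server" peer.
const pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }] // assumption: a public STUN server suffices
});

pc.ontrack = event => {
  document.querySelector('video').srcObject = event.streams[0];
};

// Offer/answer and ICE candidate exchange with the server peer happen over
// your own signalling channel (e.g. a WebSocket) and are not shown here.
```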

Of these options, the first is by far the simplest. The disadvantage is that it isn't well suited to low-latency streaming and adds some overhead.

I recommend you start with it and work your way up.

On Windows you can use the AVObjects DirectShow filter; it supports UDP to WebRTC and also supports STUN servers.

It's possible if you use a WebRTC library. libdatachannel (I'm the original author) will let you open a WebRTC peer connection with a video track and send the RTP packets directly on that track, provided the codec is H.264 or VP8.
