Broadcasting a WebRTC video stream
with re-publishing as RTMP
Web Call Server converts a WebRTC audio + video stream to RTMP and sends it to the specified RTMP server. This way, a live broadcast can be started from a web page and delivered to Facebook, YouTube Live, or any other service that accepts live RTMP video.
WebRTC broadcasting with RTMP republishing
to live services and servers
A WebRTC-compatible browser captures video from the camera and audio from the microphone and sends them to the WCS server using the protocol stack defined by the WebRTC technology (ICE, DTLS, SRTP). The stream is sent with the H.264 video codec and the Opus audio codec. If the sender is a mobile device, the VP8 video codec is used instead of H.264.
The received WebRTC stream is converted to RTMP with the H.264 and AAC codecs and sent to the specified address of the server or service that supports RTMP broadcasting.
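The codec handling described above can be summarized in a short sketch. This is an illustration only, not part of any WCS API: the function and its names are invented here. It encodes the assumption, implied by the text, that RTMP output always carries H.264 + AAC, so VP8 video from mobile senders and Opus audio must be transcoded.

```javascript
// Illustration of the codec mapping described above (not a WCS API).
// H.264 video is passed through to RTMP; VP8 input (from mobile senders)
// is assumed to require transcoding; Opus audio is always transcoded to AAC.
function rtmpCodecMapping(webrtcVideoCodec) {
  return {
    video: "H264",                                // RTMP output is always H.264
    videoTranscoded: webrtcVideoCodec !== "H264", // VP8 input needs transcoding
    audio: "AAC"                                  // Opus -> AAC in all cases
  };
}

console.log(rtmpCodecMapping("H264")); // video passes through untouched
console.log(rtmpCodecMapping("VP8"));  // video is transcoded to H.264
```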
Step-by-step flowchart of WebRTC broadcasting
with RTMP republishing
- The WebRTC browser establishes a connection to the WCS server over the WebSocket protocol and sends the stream.publish() command to initiate publishing of the WebRTC stream.
- WCS establishes an RTMP connection to the broadcasting server or service, for example Facebook Live.
- The browser starts sending audio and video traffic encoded as H.264 + Opus.
- WCS repackages the received H.264 video into RTMP and transcodes the received Opus audio to AAC for compatibility with the majority of RTMP services. Then the RTMP stream is sent to the destination.
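On the browser side, the steps above reduce to a few Web SDK calls. Below is a minimal sketch assuming a Flashphoner-style Web SDK with an rtmpUrl stream option; the host names, the rtmpUrl field name, and the SDK wiring shown in comments are assumptions and may differ between SDK versions.

```javascript
// Sketch of the publishing side. The rtmpUrl option (telling WCS where to
// re-publish the stream as RTMP) is an assumption; check your SDK version
// for the exact field name.
const WCS_URL = "wss://wcs-host:8443";            // WebSocket endpoint of WCS (example)
const RTMP_TARGET = "rtmp://localhost:1935/live"; // re-publishing destination (example)

// Builds the options object passed to session.createStream().
function buildPublishOptions(streamName, rtmpUrl, displayElement) {
  return {
    name: streamName,        // stream name on the WCS server
    display: displayElement, // page element showing the local camera preview
    rtmpUrl: rtmpUrl         // where WCS should re-publish the stream as RTMP
  };
}

// In a real page the wiring would look roughly like this (hypothetical,
// shown as comments because it needs a browser and the Web SDK):
// Flashphoner.init({});
// Flashphoner.createSession({ urlServer: WCS_URL })
//   .on(SESSION_STATUS.ESTABLISHED, (session) => {
//     session.createStream(buildPublishOptions("stream1", RTMP_TARGET, previewDiv))
//            .publish();
//   });
```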
Example of republishing a WebRTC video stream
as RTMP in the Google Chrome browser
In this example, the WebRTC video is captured from the web camera (on the left), sent to the specified RTMP address (on the right), and then played in a simple RTMP player on the same page.
Note that in this example the RTMP video is forwarded to localhost, not to a third-party server. This allows you to quickly test the example and make sure RTMP republishing works properly.
The detailed description of the test is available in the Testing section.