Live streaming has come a long way, from minutes of latency down to ultra-low latency. It has always been hard to get right because of the many moving parts involved: video quality, codecs, encoding and decoding, network speeds, and so on. Most live streaming on the internet today is done with the WebRTC, HLS, or RTMP protocols. Other protocols are still being developed, and the live streaming space has no clear winner yet.
In this story, we will look at WebRTC streaming with Ant Media Server, a streaming server that provides adaptive, ultra-low latency streaming, and the React Native WebRTC library to publish and view the video. Before we start, let's look at the live streaming architecture we will be using;
We will build a React Native WebRTC client that sends a live stream to Ant Media Server, and another client to view it.
For this, we first need to understand how WebRTC works, that is, how the connection is established in a WebRTC stream.
For a client to establish a WebRTC connection with the server, we need to:
- Call getUserMedia to start capturing the media (audio/video) on the publisher device.
- Create a new `RTCPeerConnection()`, add the local stream to the peer connection, start gathering ICE candidates, and create the offer SDP to send to Ant Media Server: `const offer = await peerConnection.current.createOffer();`
- Send the offer to the server and receive the answer, which we set as the remote description. The connection with the server is then established, and once connected, the live stream can be seen in the list on the Ant Media Server dashboard.
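Put together, the publisher-side flow above can be sketched with `react-native-webrtc` roughly as follows. This is only a sketch: the `sendToServer` function is a placeholder for the signaling channel we build later, the message shape is an assumption, and the exact API surface (`addTrack` vs. the older `addStream`) depends on your library version.

```typescript
// A sketch of the three connection steps using react-native-webrtc.
import {mediaDevices, RTCPeerConnection} from 'react-native-webrtc';

const peerConnection = new RTCPeerConnection({
  iceServers: [{urls: 'stun:stun.l.google.com:19302'}],
});

async function connect(sendToServer: (message: object) => void) {
  // 1. Capture local audio/video on the publisher device.
  const stream = await mediaDevices.getUserMedia({audio: true, video: true});

  // 2. Add the local tracks to the peer connection; this also kicks off
  //    ICE candidate gathering once negotiation starts.
  stream.getTracks().forEach(track => peerConnection.addTrack(track, stream));

  // 3. Create the offer SDP and send it to Ant Media Server.
  const offer = await peerConnection.createOffer();
  await peerConnection.setLocalDescription(offer);
  sendToServer({command: 'takeConfiguration', type: 'offer', sdp: offer.sdp});
}
```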
Let’s create a new React Native project and add `react-native-webrtc` as a dependency. We will also use `react-navigation` to build the screens for publishing and viewing the stream.
In the main `App.tsx`, let’s set up three different screens as shown below in the code snippet;
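The original snippet did not survive this extract, so here is a minimal sketch of what such an `App.tsx` could look like, assuming `@react-navigation/native-stack`; your navigator setup and screen file paths may differ.

```typescript
// App.tsx — a sketch wiring the three screens with React Navigation.
import React from 'react';
import {NavigationContainer} from '@react-navigation/native';
import {createNativeStackNavigator} from '@react-navigation/native-stack';
import Menu from './Menu';
import Publisher from './Publisher';
import Viewer from './Viewer';

const Stack = createNativeStackNavigator();

const App = () => (
  <NavigationContainer>
    <Stack.Navigator initialRouteName="Menu">
      <Stack.Screen name="Menu" component={Menu} />
      <Stack.Screen name="Publisher" component={Publisher} />
      <Stack.Screen name="Viewer" component={Viewer} />
    </Stack.Navigator>
  </NavigationContainer>
);

export default App;
```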
`Menu` will contain the menu to choose between the publisher and viewer screens, `Publisher` will open the live stream publishing screen, and `Viewer` will open the live stream viewing screen.
`Menu.tsx` will look like this;
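The file itself is not reproduced in this extract; a sketch of such a menu screen, assuming just two buttons that navigate to the other screens, could be:

```typescript
// Menu.tsx — a sketch of the menu screen with navigation buttons.
import React from 'react';
import {Button, View} from 'react-native';

const Menu = ({navigation}: {navigation: any}) => (
  <View style={{flex: 1, justifyContent: 'center', padding: 16}}>
    <Button
      title="Publish stream"
      onPress={() => navigation.navigate('Publisher')}
    />
    <Button
      title="View stream"
      onPress={() => navigation.navigate('Viewer')}
    />
  </View>
);

export default Menu;
```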
`Publisher.tsx` will have the screen to publish the live stream to the server. Before publishing video, we need to understand how Ant Media signaling works. The full documentation on signaling with WebSockets can be found here. For signaling, there are basically five commands: `publish`, `start`, `takeConfiguration` (offer and answer), `takeCandidate`, and `stop`, as shown by the sequence diagram below;
Let’s write a signaling client for the connection so that we can use it cleanly in our publisher file. Create a file called `SignalingChannel.ts` and add the following contents;
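The original file contents are not included in this extract, so here is a minimal sketch of such a wrapper; the class shape and callback names are assumptions based on the commands discussed above.

```typescript
// SignalingChannel.ts — a minimal WebSocket wrapper for Ant Media signaling.
// Incoming messages are dispatched by their `command` field
// (start, takeConfiguration, takeCandidate, stop, ...).
type CommandCallback = (message: any) => void;

export class SignalingChannel {
  private ws: {send(data: string): void; close(): void} | null = null;

  constructor(
    private url: string,
    private callbacks: {
      onopen?: () => void;
      commands?: Record<string, CommandCallback>;
    },
  ) {}

  // Open the WebSocket lazily, so the channel can be created ahead of time.
  open() {
    const socket = new WebSocket(this.url);
    socket.onopen = () => this.callbacks.onopen?.();
    socket.onmessage = (event: {data: string}) =>
      this.handleMessage(event.data);
    this.ws = socket;
  }

  // Parse a raw server message and call the callback registered for it.
  handleMessage(raw: string) {
    const message = JSON.parse(raw);
    this.callbacks.commands?.[message.command]?.(message);
  }

  sendJSON(message: object) {
    this.ws?.send(JSON.stringify(message));
  }

  close() {
    this.ws?.close();
    this.ws = null;
  }
}
```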
- This signaling channel is a wrapper around the WebSocket connection, with callback functions to handle the different commands received from the server, as discussed in the sequence diagram above.
Now that our signaling channel wrapper is complete, let’s start building the publisher.
The code above adds the UI for live stream publishing. The `videoMuted` state saves the current publisher state and shows different buttons accordingly. `peerConnection` holds the reference to the peer connection object that sends the WebRTC stream to the server, while `localStream` stores the local camera and audio stream obtained from the `getUserMedia` function, as shown below;
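The hook itself is not shown in this extract; a sketch of it inside the `Publisher` component, assuming `react-native-webrtc`'s `mediaDevices` API and a hypothetical `localStream` ref, might look like this.

```typescript
// A sketch of the media-capture hook: pick the front camera and keep the
// resulting stream in the (hypothetical) localStream ref for later use.
import {useEffect} from 'react';
import {mediaDevices} from 'react-native-webrtc';

useEffect(() => {
  const startLocalStream = async () => {
    // Enumerate devices and look for the front-facing camera.
    const devices: any[] = await mediaDevices.enumerateDevices();
    const frontCamera = devices.find(
      device => device.kind === 'videoinput' && device.facing === 'front',
    );
    const stream = await mediaDevices.getUserMedia({
      audio: true,
      video: {facingMode: 'user', deviceId: frontCamera?.deviceId},
    });
    // Stored on a ref so it can be added to the peer connection later.
    localStream.current = stream;
  };
  startLocalStream();
}, []);
```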
After completing this step, you can finally see the camera stream in the UI. In the above hook, we select the device used for the live stream and set the local stream reference, which will later be added to our peer connection.
Let’s initialize our signaling channel for this connection.
- In line #2: when the WebSocket connection is open, we send the `publish` command to Ant Media Server with the stream id that we want to publish on.
- In line #8: when we receive the `start` command from the server, we can send our SDP offer so that the server can create an answer to it. Line #27 handles the answer received from the server and sets the received SDP as the remote description of our peer connection instance.
- In line #19: we add the ICE candidates received from the server to our `peerConnection`. (Server and client negotiate the proper channel and path to use for the WebRTC connection using the ICE protocol.)
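The snippet those line numbers refer to is not included in this extract; its behaviour can be sketched as a factory that builds the command callbacks. Every name here, from `createPublisherCallbacks` to the exact message fields, is an assumption for illustration; `pc` stands for the peer connection (only the methods used are required) and `send` writes JSON to the signaling socket.

```typescript
// A sketch of the publisher's signaling callbacks.
interface PeerLike {
  createOffer(): Promise<{type: string; sdp?: string}>;
  setLocalDescription(desc: {type: string; sdp?: string}): Promise<void>;
  setRemoteDescription(desc: {type: string; sdp?: string}): Promise<void>;
  addIceCandidate(candidate: object): Promise<void>;
}

export function createPublisherCallbacks(
  streamId: string,
  pc: PeerLike,
  send: (message: object) => void,
) {
  return {
    // Socket opened: ask the server to publish our stream id.
    onopen: () => send({command: 'publish', streamId}),
    commands: {
      // `start`: the server is ready, so create and send our SDP offer.
      start: async () => {
        const offer = await pc.createOffer();
        await pc.setLocalDescription(offer);
        send({command: 'takeConfiguration', streamId, type: 'offer', sdp: offer.sdp});
      },
      // `takeConfiguration`: the server's answer becomes our remote description.
      takeConfiguration: async (message: any) => {
        await pc.setRemoteDescription({type: message.type, sdp: message.sdp});
      },
      // `takeCandidate`: add an ICE candidate discovered by the server.
      takeCandidate: async (message: any) => {
        await pc.addIceCandidate({
          candidate: message.candidate,
          sdpMLineIndex: message.label,
        });
      },
    },
  };
}
```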
Now let’s define the `startStreaming` function that will initiate the connection with the server when the start button is pressed in the UI.
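The function body is not shown in this extract; a sketch of what it does, assuming `react-native-webrtc` and the refs introduced earlier (`peerConnection`, `localStream`, `signalingChannel`, and a `streamId` value are all assumptions), could be:

```typescript
// startStreaming — a sketch: create the peer connection if needed, forward
// local ICE candidates to Ant Media Server, then create and set the offer.
import {RTCPeerConnection} from 'react-native-webrtc';

const startStreaming = async () => {
  if (!peerConnection.current) {
    peerConnection.current = new RTCPeerConnection({
      iceServers: [{urls: 'stun:stun.l.google.com:19302'}],
    });
    // Send every locally gathered ICE candidate to the server.
    peerConnection.current.onicecandidate = (event: any) => {
      if (event.candidate) {
        signalingChannel.current?.sendJSON({
          command: 'takeCandidate',
          streamId,
          label: event.candidate.sdpMLineIndex,
          candidate: event.candidate.candidate,
        });
      }
    };
    // Attach the camera/microphone tracks captured earlier.
    localStream.current
      ?.getTracks()
      .forEach((track: any) =>
        peerConnection.current.addTrack(track, localStream.current),
      );
  }
  // Open signaling; the offer is created here and set as local description.
  signalingChannel.current?.open();
  const offer = await peerConnection.current.createOffer();
  await peerConnection.current.setLocalDescription(offer);
};
```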
This function creates a new peer connection instance if there is none, and whenever new ICE candidates are found, they are sent to Ant Media Server with the `takeCandidate` command. It also creates a new offer and sets it as the local description.
The full code for `Publisher.tsx` can be found here. If everything works correctly, you will see the live stream broadcasting in the Ant Media Server dashboard when the start button is pressed.
We use the same WebRTC protocol for viewing the video as well. If we used another protocol such as HLS, Ant Media Server would not provide ultra-low latency for the live streams.
For playing the stream, the signaling commands are `play`, `takeConfiguration`, `takeCandidate`, and `stop`. First, the `play` command tells the server that we want to start viewing over a WebRTC connection. The server then creates an offer and sends it with `takeConfiguration`; we set the received offer as the remote description, create our answer, and send it back to the server with the `takeConfiguration` command. After both server and client have the offer and answer, ICE candidates are exchanged several times before the connection is established. The `stop` command is used to stop viewing the live stream. [reference]
The main difference between the viewer and the publisher implementation is that we don’t set a local stream; instead, the remote stream is set when available and displayed with the `RTCView` component from `react-native-webrtc`. In the viewer’s `startStreaming` function, we create a new connection and attach an ICE candidate handler that sends candidates to Ant Media Server. Since we will already have the offer from Ant Media Server for viewing, we set it as the remote description, create the answer, and send the answer to the streaming server.
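The answer leg just described can be sketched as a small helper. The helper name and the peer interface are assumptions for illustration; `pc` again stands for the peer connection and `send` writes JSON to the signaling socket.

```typescript
// A sketch of the viewer's handling of the server's offer: set it as the
// remote description, create an answer, and send it back via takeConfiguration.
interface ViewerPeer {
  setRemoteDescription(desc: {type: string; sdp?: string}): Promise<void>;
  createAnswer(): Promise<{type: string; sdp?: string}>;
  setLocalDescription(desc: {type: string; sdp?: string}): Promise<void>;
}

export async function handleServerOffer(
  streamId: string,
  pc: ViewerPeer,
  offer: {type: string; sdp?: string},
  send: (message: object) => void,
) {
  await pc.setRemoteDescription(offer);
  const answer = await pc.createAnswer();
  await pc.setLocalDescription(answer);
  send({command: 'takeConfiguration', streamId, type: 'answer', sdp: answer.sdp});
}
```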
In the `signalingChannel` initialization, when the connection is opened with `signalingChannel.current.open()`, the client sends the `play` command, and in response the server returns the offer SDP, which is then set as the remote description; the answer is sent in line #51.
The full code for `Viewer.tsx` can be found here. If everything worked perfectly, you will be able to view the live stream, if one is available, from Ant Media Server.
To test the publisher and viewer simultaneously, you will need one device and one simulator, or two devices. If everything works, you can view ultra-low latency live streaming with Ant Media Server. The full code for this example can be found here.
This implementation of live streaming is not complete, but it shows an example of how you can implement WebRTC live streaming with React Native. There will be problems, such as sound not coming from the loudspeaker, or microphone permission being requested on iOS even when we are only viewing the live stream. Some of these problems can be solved by adding another library, while solving others will require you to tinker with the underlying libWebRTC library.