Build A Mobile and Web Video App for Broadcasting An Interactive Live Stream

Devin Dixon
BingeWave

--

We continue to grow into a virtual society where people can freely broadcast themselves to the world and create new interactive experiences. Apps now and in the future will be built to fit different niche needs for live experiences, such as:

  • Live Shopping
  • Musicians with Live Music
  • Influencers and Followers
  • Podcasting
  • Training Sessions

And many other new and unique experiences that have not been thought of yet. This article will cover how to make an app with which you can broadcast these new experiences to the world.

Code And Requirements

Please read the information below on how to download the code and the developer requirements.

https://github.com/BingeWave/Building-A-One-To-Many-Video-App-for-Broadcasting-Interactive-Live-Streaming

Developer Requirements

To create this app, we will be using the following tools and ask that the developer already know the basics of:

  • Command Line
  • React Native
  • Expo
  • Javascript

Download The Code

The code in this tutorial is available online at our GitHub repository and can be downloaded and run on your local computer. To get the code, please visit:

https://github.com/BingeWave/Building-A-Live-Broadcast-Web-and-Mobile-App

Organizer Account

Like all of our tutorials, our starting point will be an organizer account. This will allow you to access all our API features. Please read the tutorial on Setting up an Organizer Account if you have not already; it should only take a minute or two.

Understanding Broadcasting is One-To-Many

Broadcasting is when one person (or a few people) showcases a video feed to a large number of people.

For scalability, two different technologies are used to accomplish this:

  1. WebRTC for the actual video chat.
  2. An RTMP Stream or HLS Stream to a group of people (there is also DASH).

There are several reasons for using a combination of WebRTC and Streams (RTMP and HLS). WebRTC for video calls can be very resource-intensive because of the requirements around streaming video, which will increase your costs. Users who are just watching and not interacting on the video call do not need a WebRTC connection. Therefore, they can watch via a stream.

Streams are when the WebRTC video call gets converted to an RTMP stream, and that stream is sent to other sources. Eventually, the stream is converted to HLS, which the end user consumes.

Our technology makes this process seamless for development. We will handle the WebRTC call, streaming to endpoints, and user consumption in various formats such as HLS. The developer just has to implement our widgets.

We will create a React Native mobile application and web application to illustrate how this works on both ends.

Create Your Live Event

To begin this tutorial, we will need to create a live event. All video streams, audio streams, and AR/VR Sessions on BingeWave are live events. The live event will have all functionality needed for your stream. For creating a live event, the documentation is available here:

https://developers.bingewave.com/docs/events#create

There are several live event types, visible here:

https://developers.bingewave.com/types#event_types

We want an instant event because all the other event types require a date/time for the live event to take place. The event type we are going to choose is 7, for an Instant Event.

Looking again through the live events documentation, you will note a few important options here.

  • video_chat_layout_host
  • video_chat_layout_participant

Referring to the video layout types, there are several different layout options for the video section.

https://developers.bingewave.com/types#video_chat_layouts

We are going to use the ‘SINGLE_USER_AUDIENCE_CAROUSEL_RIGHT’. This will cause the host to be the main video and everyone else who might join to appear on the right side of the screen.

Next, we will use the Query Builder to create the event. The Query Builder is a specialized tool in our API that allows you to run queries against the API before writing code. In the Query Builder, insert the event type and select your organizer account id. For example, the Query Builder for creating a live event should look like the one below:

https://developers.bingewave.com/queries/events#create

When you click ‘Send’, if the request is successful, you will receive a JSON object of the live event that was just created.
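If you would rather make this request from code instead of the Query Builder, the call looks roughly like the sketch below. The route placeholder, the authentication, and the title field are assumptions; copy the exact URL, headers, and parameters that the Query Builder and the create documentation show you.

// A sketch of creating the live event with fetch. Replace the placeholder route
// and values with the ones shown in the Query Builder for your organizer account.
const CREATE_EVENT_URL = 'PASTE_THE_CREATE_EVENT_ROUTE_FROM_THE_QUERY_BUILDER_HERE';

async function createLiveEvent() {
  const response = await fetch(CREATE_EVENT_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      organizer_id: 'YOUR_ORGANIZER_ID',
      type: 7, // 7 = Instant Event
      title: 'My Live Broadcast', // assumed field; check the create documentation
      // Layout constants come from the video_chat_layouts type list linked above.
      video_chat_layout_host: 'SINGLE_USER_AUDIENCE_CAROUSEL_RIGHT',
      video_chat_layout_participant: 'SINGLE_USER_AUDIENCE_CAROUSEL_RIGHT',
    }),
  });

  const event = await response.json();
  // The fields we care about for the rest of this tutorial:
  console.log(event.id, event.embed_video_chat, event.embed_broadcast,
    event.webview_video_chat, event.webview_broadcast);
  return event;
}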

We want to pay attention to the following:

  • id: The id of the live event. It will be used several times in this article.
  • embed_video_chat: This field is a tag for the video chat that a user will join as the “host”. This tag goes into HTML web pages.
  • embed_broadcast: This field is a tag for the broadcast, where people can watch the host in the video chat. This tag goes into HTML web pages.
  • webview_video_chat: This field is a tag for the video chat that a user will join as the “host”, for use in mobile applications.
  • webview_broadcast: This field is a tag for the broadcast, where people can watch the host in the video chat, for use in mobile applications.

Add A Broadcast Widget To The Live Event

In order to have a user stream their video, we need to activate the broadcast feature. We can accomplish this in three different ways:

  1. We can start and stop a broadcast with the API endpoint here. This is for power users with their own advanced custom setup.
  2. You can use a pre-built widget for Broadcasting.
  3. You can create your own widget in our Widget builder.

For brevity, we will use the pre-built widget. Widgets are apps that can be overlayed on the screen to add interactivity to your experience.

The widget system is one of the most powerful features of BingeWave because of the degree of customization it allows in every live event. Everything from the layout to what appears on screen and where can be fully customized by the developer. This customization is accomplished through a grid layout and widget system.

You can read more about the Video Layout here, but imagine your video screen laid out as a grid of positions.

We can place one or multiple widgets in every location on the grid. We are going to get a list of widgets and find the Broadcast Widget here:

https://developers.bingewave.com/queries/widgets#list

Running the query will return a list of available widgets. The Broadcast Widget id is ‘08e71a92-a7e3-4519-9cb7-aa30b237ed52’.

Now comes adding the widget to the interface of the live event. The documentation for adding a widget is here: https://developers.bingewave.com/docs/eventwidgets#addwidget

We want to focus on three fields:

  1. accessible_for_host: Will enable the widget for users with the role of host.
  2. accessible_for_participants: Will enable the widget for the participants, which is the default role.
  3. position: We set where the widget goes on-screen. Remember to look at the position chart.

https://developers.bingewave.com/types#widget_positions

We are going to place the broadcast widget in the footer center position, which has a value of 18. In our Query Builder, implement the following:

https://developers.bingewave.com/queries/eventwidgets#addwidget
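The same call can also be made from code. The sketch below uses a placeholder route; the field names follow the addwidget documentation linked above, but verify the exact URL and value formats in the Query Builder.

// A sketch of attaching the Broadcast Widget to the live event with fetch.
const ADD_WIDGET_URL = 'PASTE_THE_ADD_WIDGET_ROUTE_FOR_YOUR_LIVE_EVENT_HERE';

async function addBroadcastWidget() {
  const response = await fetch(ADD_WIDGET_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      widget_id: '08e71a92-a7e3-4519-9cb7-aa30b237ed52', // the Broadcast Widget
      accessible_for_host: true, // booleans may need to be sent as 1/0; check the docs
      accessible_for_participants: true,
      position: 18, // the position we chose above
    }),
  });

  return response.json();
}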

Remember to set the widget_id to the Broadcast Widget id above, and replace the id in the route with your Live Event id. Now we have placed the widget so that it appears on-screen! We can move on to our React application to build the broadcasting and streaming features.

Application Structure

Before creating our web and mobile apps, let us add a little structure to this project. On your command line, create a folder for this project.

mkdir broadcast_streaming_app
cd broadcast_streaming_app

This folder will act as our root folder for the project. In this folder, we are going to create a directory for the web and mobile apps.

mkdir web
mkdir mobile

And that’s it! We will use these two folders to build our application in the next steps.

Building Your React Application

The first part of the application we will build is a React Native app with Expo. The mobile app will have two parts:

  1. A section where the host will “Broadcast” their stream to others.
  2. An area where users watch the stream.

First, let us start by installing Expo globally if you have not already:

npm install --global expo-cli

Now we are going to create an expo project inside the mobile folder.

expo init mobile
cd mobile

When prompted, choose the minimal template. Once Expo has created our application, let’s go ahead and install some dependencies.

npm install react-native-webview --save
npm install @react-navigation/native @react-navigation/native-stack --save
expo install react-native-screens react-native-safe-area-context

The above packages are:

  1. A Webview that will showcase the interface on-screen.
  2. Navigation to move between screens

Next, we are going to build our file directory. Go into the mobile folder and create a screens folder. We are going to make three different screens for this app.

mkdir screens
touch screens/Home.js
touch screens/Chat.js
touch screens/Stream.js

Home Screen

The first screen to build is the Home Screen. It will be a straightforward screen for navigating either to the watch screen or broadcast screen. The code will be as such:
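Here is a minimal sketch of screens/Home.js; the button titles are placeholders, and the route names 'Chat' and 'Stream' need to match the screen names registered in App.js later.

import React from 'react';
import { View, Button, StyleSheet } from 'react-native';

// Home screen: two buttons that navigate to the Chat (host) and Stream (watch) screens.
export default function Home({ navigation }) {
  return (
    <View style={styles.container}>
      <Button title="Broadcast (Host)" onPress={() => navigation.navigate('Chat')} />
      <Button title="Watch The Stream" onPress={() => navigation.navigate('Stream')} />
    </View>
  );
}

const styles = StyleSheet.create({
  container: { flex: 1, justifyContent: 'center', padding: 20 },
});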

Chat Screen

The Chat Screen will take in the user’s camera and output the contents to a stream. In Chat.js, implement the following code:
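A minimal sketch of screens/Chat.js, assuming you paste the webview_video_chat URL from your live event into the placeholder constant:

import React from 'react';
import { StyleSheet } from 'react-native';
import { WebView } from 'react-native-webview';

// Paste the URL from the webview_video_chat field of your live event here.
const VIDEO_CHAT_URL = 'PASTE_YOUR_webview_video_chat_URL_HERE';

// Chat screen: loads the video chat interface for the host inside a Webview.
export default function Chat() {
  return (
    <WebView
      source={{ uri: VIDEO_CHAT_URL }}
      style={styles.webview}
      javaScriptEnabled={true}
      allowsInlineMediaPlayback={true}
    />
  );
}

const styles = StyleSheet.create({
  webview: { flex: 1 },
});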

The only thing we need is a Webview. In the Webview, we have:

  • source: The URL is retrieved from the webview_video_chat discussed above. Replace the xxxx with the id of the live event.
  • style: Style for the webview.
  • javaScriptEnabled: Allows the content that the webview is pulling in to run Javascript.
  • allowsInlineMediaPlayback: By default, any media played will try to go full screen. This keeps the media playing in its current container.

There is one more page to implement, the Stream page.

Stream Page

The Stream page is where users will watch as the audience. As discussed above, the audience’s video will not be captured; they will receive the RTMP/HLS stream instead. The code is also straightforward:
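A minimal sketch of screens/Stream.js, assuming you paste the webview_broadcast URL from your live event into the placeholder constant:

import React from 'react';
import { StyleSheet } from 'react-native';
import { WebView } from 'react-native-webview';

// Paste the URL from the webview_broadcast field of your live event here.
const BROADCAST_URL = 'PASTE_YOUR_webview_broadcast_URL_HERE';

// Stream screen: loads the broadcast (RTMP/HLS) player inside a Webview for the audience.
export default function Stream() {
  return (
    <WebView
      source={{ uri: BROADCAST_URL }}
      style={styles.webview}
      javaScriptEnabled={true}
      allowsInlineMediaPlayback={true}
    />
  );
}

const styles = StyleSheet.create({
  webview: { flex: 1 },
});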

Again, it is just a Webview, and we are using the webview_broadcast retrieved from the live event in our API call. Replace the xxxx with the id of the live event. Now we need to replace the main App.js file.

App.js

App.js is the main entry point for our app. We want this file to load our navigation and screens. The code for doing so is below:
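A minimal sketch of App.js, using the native stack navigator we installed earlier:

import React from 'react';
import { NavigationContainer } from '@react-navigation/native';
import { createNativeStackNavigator } from '@react-navigation/native-stack';

import Home from './screens/Home';
import Chat from './screens/Chat';
import Stream from './screens/Stream';

// A single stack navigator that holds our three screens.
const Stack = createNativeStackNavigator();

export default function App() {
  return (
    <NavigationContainer>
      <Stack.Navigator initialRouteName="Home">
        <Stack.Screen name="Home" component={Home} />
        <Stack.Screen name="Chat" component={Chat} />
        <Stack.Screen name="Stream" component={Stream} />
      </Stack.Navigator>
    </NavigationContainer>
  );
}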

The breakdown of this code is as follows:

  1. We import all the screens (Home, Chat, Stream).
  2. We call our stack navigator to create the navigation.
  3. We set up a StackNavigator and implement the screens.

Now we are ready to test our app. Start Expo with the following:

expo start

When Expo starts, choose either the iOS or Android simulator. When the app loads, go to the Chat page and join the chat. You will see the Broadcast button at the bottom.

Important: On simulators, there will not be any real video, only the Bop/Blip screen.

Click it and you will start broadcasting. Congratulations! In a separate session, navigate to the Stream page, and you will see yourself. Now let’s put it into a web app.

Web App Implementation

The web application will be for browsers to access and stream the content. Like the mobile app, it will be very straightforward. We will have two files, a Chat page and a Stream page.

Start by going back to the project’s root directory and do the following on the command line:

touch web/Chat.html
touch web/Stream.html

Inside the HTML web pages, we will use BingeWave’s embed system. The embeds can be read about here:

https://developers.bingewave.com/javascript/bwconnector

It works by placing custom tags in the format of:

<bw:widget></bw:widget>

are placed around a webpage. When the connector is called with BingewaveConnector.init(), the tags are turned into BingeWave’s interface. We will use that to define the pages. In a previous step, we retrieved the embed_video_chat field from the live event’s JSON, which returned a tag like this:

<bw:widget env="prod" type="webrtc" id="xxxx"></bw:widget>

The Chat page’s HTML will look as such:
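A minimal sketch of web/Chat.html, assuming the connector script is loaded as described in the bwconnector documentation (the exact script tag is shown there):

<!DOCTYPE html>
<html>
<head>
  <title>Chat - Host Broadcast</title>
  <!-- Load BingeWave's connector script here; the exact script tag is in the bwconnector documentation linked above. -->
</head>
<body>
  <!-- The embed_video_chat tag from your live event; replace xxxx with your live event id. -->
  <bw:widget env="prod" type="webrtc" id="xxxx"></bw:widget>

  <script>
    // Turns the bw:widget tags on this page into BingeWave's interface.
    BingewaveConnector.init();
  </script>
</body>
</html>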

The watch page is very similar. It will use the same BingewaveConnector and the tag for broadcasting a stream, returned in the JSON as embed_broadcast:

<bw:widget env="prod" type="broadcast" id="xxxx"></bw:widget>

Replace xxx with the id of your live event. Your final code will resemble this:
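Under the same assumption about loading the connector script, a minimal web/Stream.html might look like this:

<!DOCTYPE html>
<html>
<head>
  <title>Stream - Watch the Broadcast</title>
  <!-- Load BingeWave's connector script here; the exact script tag is in the bwconnector documentation linked above. -->
</head>
<body>
  <!-- The embed_broadcast tag from your live event; replace xxxx with your live event id. -->
  <bw:widget env="prod" type="broadcast" id="xxxx"></bw:widget>

  <script>
    // Turns the bw:widget tags on this page into BingeWave's interface.
    BingewaveConnector.init();
  </script>
</body>
</html>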

Open up each page in your browser. On the Chat page, join the video chat and start a broadcast. On the Stream page, you will be able to watch the stream. Likewise, the stream will show on mobile devices as well.

Congratulations! You now have an app where users can broadcast themselves from their phone or website to the world!

getUserMedia Undefined Error

On localhost in your web browser, you might run into a getUserMedia undefined error.

What is happening is that your browser is blocking access to the camera because the connection is not secure. To solve this error, read this article on the WebRTC Fix For getUserMedia Undefined On Local Hosts.
