After building an intro on how to upload videos with React Native I would like to go a step further and build live streaming. For me this means that a user should be able to start a video stream that someone else might subscribe to and see a near real-time video. Our first challenge will be to get a stream of video data, the second will be to upload the video as a stream.
The current (1.1.5) version of react-native-camera does not let you consume a stream of the recording it makes, so we need to build this functionality ourselves. To do that we need to get notified once the recording starts (we will have to add this notification), get the path of the file being written, and watch that file for changes.
As I don’t want to go ahead and do a fork & PR to
react-native-camera right now (partly because I don’t want to also patch the iOS version, as I have no device to test it on) I decided to use
patch-package by David Sheldrick. It allows me to manipulate the package in my
node_modules/ folder directly and check the changes in as a patch file. I think the changes I needed to do are quite interesting as they go into detail over how to extend a native library relying on the React Native Event system.
Adding the event
First of all, we need to add an event to the ones
react-native-camera already provides. We need a name for our event, so we add this line to the Events enum:
EVENT_RECORDING_STARTED("onRecordingStarted"), with the first part being the name of the enum constant and the value being both the name of the dispatched event and the name of the React prop we are going to set later. Adding the event looks like this in our case:
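For illustration, here is a minimal, stand-alone sketch of the enum pattern react-native-camera uses for its events; the neighboring entry is only an example, and the real enum lives in the library's view manager:

```java
// Illustrative sketch of the Events enum pattern: each constant carries
// the event name that is dispatched and later matched to a React prop.
// EVENT_CAMERA_READY is just an example neighbor entry.
public class EventsDemo {
    enum Events {
        EVENT_CAMERA_READY("onCameraReady"),
        // Our new entry: the value is both the dispatched event name and
        // the React prop we are going to hook up later.
        EVENT_RECORDING_STARTED("onRecordingStarted");

        private final String mName;

        Events(String name) {
            mName = name;
        }

        @Override
        public String toString() {
            return mName;
        }
    }

    public static void main(String[] args) {
        System.out.println(Events.EVENT_RECORDING_STARTED); // onRecordingStarted
    }
}
```

Overriding toString to return the event name is what lets the same constant be used both when registering the exported prop and when dispatching the event.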
As Java is a very verbose language, let me show you the places that really matter:
- Line 13: We create a class inheriting from a generic React Native event
- Line 24: We take the path of the video from the obtain method and create a new event out of it that stores the path in a private field
- Line 51: We add the path to the serialization so that it is accessible later on
- Line 48: On dispatch we use
rctEventEmitter.receiveEvent to send an event with the name we declared in the Events enum and the data we serialized
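Stripped of the React Native dependencies, the class the bullet points describe can be sketched like this. The class name, the obtain factory, and the uri key are illustrative; in the real patch the class extends React Native's generic Event and serializes into a WritableMap:

```java
import java.util.HashMap;
import java.util.Map;

// Framework-free sketch of the event class described above. A plain Map
// stands in for React Native's WritableMap.
public class RecordingStartedEvent {
    private final String mPath; // path of the video file being written

    private RecordingStartedEvent(String path) {
        mPath = path;
    }

    // Mirrors the static obtain() factory style used by RN event classes:
    // takes the video path and stores it in a private field.
    public static RecordingStartedEvent obtain(String path) {
        return new RecordingStartedEvent(path);
    }

    public String getEventName() {
        return "onRecordingStarted";
    }

    // Serialization step: expose the path so the JS side can read it later.
    public Map<String, Object> serializeEventData() {
        Map<String, Object> data = new HashMap<>();
        data.put("uri", mPath);
        return data;
    }

    public static void main(String[] args) {
        RecordingStartedEvent event = RecordingStartedEvent.obtain("/data/video.mp4");
        System.out.println(event.getEventName() + " -> " + event.serializeEventData());
    }
}
```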
Adding a small helper to
RNCameraViewHelper to easily dispatch the event later on is pretty straightforward. We construct the event there and put it right into the event dispatcher of the
UIManagerModule of React Native. We can get that native module from the React context.
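The helper's job can be sketched with a stand-in interface for the event dispatcher; the names here are hypothetical, and in the actual patch the dispatcher is fetched from the UIManagerModule through the React context:

```java
// Sketch of the dispatch helper: build the event and hand it straight to
// the event dispatcher. EventDispatcher is a stand-in interface for
// React Native's dispatcher.
public class DispatchHelperDemo {
    interface EventDispatcher {
        void dispatchEvent(String eventName, String payload);
    }

    // Hypothetical helper mirroring the one added to RNCameraViewHelper.
    static void emitRecordingStartedEvent(EventDispatcher dispatcher, String path) {
        dispatcher.dispatchEvent("onRecordingStarted", path);
    }

    public static void main(String[] args) {
        emitRecordingStartedEvent(
            (name, payload) -> System.out.println(name + ": " + payload),
            "/data/video.mp4"); // onRecordingStarted: /data/video.mp4
    }
}
```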
Connecting the Event with the Component
Now that we are set up to dispatch the event, let’s dispatch it. Again, Java’s verbosity hinders us from seeing what happens here, so let me give you the TL;DR. First, the path the video is going to be written to is determined (Line 7). We give this path to the record method of the
CameraView so it starts recording to that path. If this method returns true, the recording started successfully, and that is why we emit the started event afterward (in Line 19).
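The control flow boils down to the following sketch, where Recorder and Emitter are stand-ins for the CameraView and the dispatch helper:

```java
// Control-flow sketch of the recording start: start the recording at the
// resolved path and only emit the event if recording actually began.
public class RecordStartDemo {
    interface Recorder { boolean record(String path); } // stand-in for CameraView.record
    interface Emitter  { void emitRecordingStarted(String path); }

    static boolean startRecording(Recorder recorder, Emitter emitter, String path) {
        // record() returns true only if the recording could be started.
        if (recorder.record(path)) {
            emitter.emitRecordingStarted(path); // dispatch our new event
            return true;
        }
        return false;
    }

    public static void main(String[] args) {
        startRecording(
            path -> true, // pretend recording starts successfully
            path -> System.out.println("started: " + path),
            "/data/video.mp4"); // started: /data/video.mp4
    }
}
```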
With this event being dispatched, let’s write the last bit of glue code to get these events right into React Native. As you can see, all we need to do is set a prop on
RNCamera with the name of our event, and if the prop is set on the component we dispatch the
nativeEvent property of our React Native event to the user of our library.
This is what we need to do in
react-native-camera to get the path before the video has finished.
Getting the changes on the video file
Now that we have the path at an early enough point in time, we can try to get changes on that file while the data is being written to it. For this, I extended the react-native-fetch-blob package. It has a function called
RNFetchBlob.fs.readStream which can be used to stream a file. This is exactly what we need, so let’s take a look at how it is used:
The API looks pretty simple, but if you execute it you will run into a problem: while the video is still being written, the underlying implementation reads the file only once at the beginning and sends it to the JS context as a stream. The actual data access is atomic instead of stream-based, and this is what we are going to change. For this, we need something like the bash command
tail -f, which reads the file and, until it is aborted, keeps giving us the new lines added to the file.
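To make the tail -f semantics concrete, here is a hand-rolled sketch in plain Java: remember the read offset and poll the file for newly appended bytes. The fixed poll interval is an assumption, and file rotation is deliberately not handled:

```java
import java.io.IOException;
import java.io.RandomAccessFile;

// Minimal tail -f sketch: poll the file and hand any newly appended
// bytes to a callback until stop() is called.
public class FileTail {
    interface ChunkListener { void onChunk(byte[] chunk); }

    private volatile boolean running = true;

    public void stop() { running = false; }

    public void follow(String path, long pollMillis, ChunkListener listener)
            throws IOException, InterruptedException {
        long offset = 0; // how far we have read so far
        try (RandomAccessFile file = new RandomAccessFile(path, "r")) {
            while (running) {
                long length = file.length();
                if (length > offset) {
                    // New data was appended since the last poll: read it.
                    file.seek(offset);
                    byte[] chunk = new byte[(int) (length - offset)];
                    file.readFully(chunk);
                    listener.onChunk(chunk);
                    offset = length;
                }
                Thread.sleep(pollMillis);
            }
        }
    }
}
```

A real implementation would also handle the file being truncated or rotated; for a video file that is only ever appended to, this simple offset check is enough to show the idea.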
The Java equivalent of this is the Tailer class from Apache Commons IO. We extend
RNFetchBlobFS to use Tailer:
We pass the existing file system implementation to the listener so that we can emit the stream event with the given helper. The handle method is invoked every time a new update is registered, very similar to the
tail -f command.
To start the Tailer we need to pass the listener we wrote into a Tailer instance and start a thread with it. It would be better to stop it on abort, but we can get away without that because the data listener unsubscribes once we stop it.
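The listener side can be sketched without the library as well. The chunk-based handle signature and all the names here are my assumptions for illustration; they mirror the idea of handing encoded chunks to the JS context (react-native-fetch-blob's readStream supports a base64 encoding for binary data):

```java
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Base64;
import java.util.List;

// Sketch of the listener: every time the tailer hands us newly appended
// bytes, encode them and emit a stream event toward the JS context.
public class StreamEventListener {
    interface EventEmitter { void emit(String streamId, String base64Chunk); }

    private final String streamId;
    private final EventEmitter emitter;

    StreamEventListener(String streamId, EventEmitter emitter) {
        this.streamId = streamId;
        this.emitter = emitter;
    }

    // Invoked for every update the tailer registers, like tail -f.
    public void handle(byte[] newBytes) {
        emitter.emit(streamId, Base64.getEncoder().encodeToString(newBytes));
    }

    public static void main(String[] args) {
        List<String> sent = new ArrayList<>();
        StreamEventListener listener =
            new StreamEventListener("video-1", (id, chunk) -> sent.add(chunk));
        listener.handle("hello".getBytes(StandardCharsets.UTF_8));
        System.out.println(sent); // [aGVsbG8=]
    }
}
```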
Pushing the Video
Now that we have an endless stream of data, we need to send it to our server. For that, I implemented a simple Web Socket handler on the server that just stores the video in memory until the connection is closed and writes it to our file system afterward. Please note that instead of Web Sockets we could also have used a long-running XHR connection, but for me Web Sockets were just a bit easier.
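The server-side buffering logic, independent of whichever WebSocket library is used, can be sketched like this; the names are mine, and onMessage/onClose stand for the corresponding socket callbacks:

```java
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;

// Framework-free sketch of the server-side handler: buffer incoming
// binary frames in memory and write the whole video to disk once the
// connection closes. The WebSocket plumbing itself is omitted.
public class VideoAssembler {
    private final ByteArrayOutputStream buffer = new ByteArrayOutputStream();

    // Called for every binary message received on the socket.
    public void onMessage(byte[] chunk) {
        buffer.write(chunk, 0, chunk.length);
    }

    // Called when the client closes the connection.
    public void onClose(String targetPath) throws IOException {
        try (FileOutputStream out = new FileOutputStream(targetPath)) {
            out.write(buffer.toByteArray());
        }
    }

    public int bytesBuffered() { return buffer.size(); }

    public static void main(String[] args) throws IOException {
        VideoAssembler assembler = new VideoAssembler();
        assembler.onMessage(new byte[]{1, 2, 3});
        assembler.onMessage(new byte[]{4, 5});
        File target = File.createTempFile("stream", ".mp4");
        assembler.onClose(target.getAbsolutePath());
        System.out.println(target.length()); // 5
    }
}
```

Keeping the whole video in memory is obviously only fine for a demo; a production server would append chunks to the file as they arrive instead.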
On the client side, we now need to add Web Sockets into the upload method.
That’s it, now we get data to the server, where it can be assembled into a video on the fly.
For me, this was a nice challenge. I am sure I did not solve it in the best way possible, but it’s a solution to iterate on. I found it interesting that there seems to be little precedent for solving problems like this in React Native; at least the existing libraries don’t cover this problem very well.
What I wanted to show is that even problems that might seem a bit too big for React Native, like streaming, are actually feasible once you take the time to dig into the native code. For a production app we would most likely need to do the very same thing on iOS too, so be aware that there might be some work involved.