Real-Time Video Streaming with Kafka
I always thought it would be cool to spend the time coding a project around a full video stream. Yet when I searched for all the cool applications out there, I rarely found a well-explained tutorial offering insight into how to build a pipeline that takes a video stream in and makes it available to the application for downstream processing, such as facial detection or object classification. I mean… I get it, it’s boring to build the foundation, but for those who want to take a peek at it, this is your chance.
Let’s get technical… I decided to build a complete producer-to-consumer demo application for video streaming. A Producer inside the Docker network picks up a sample video from an RTSP source via a URL stream. The producer is written in Go and sends frames to a Kafka topic. Remember that some image resizing is needed to keep the pipeline from overloading and, as a consequence, to prevent the Sarama library from throwing an exception.
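To make the resize-then-publish idea concrete, here is a minimal Python sketch of the serialization step a producer performs before handing a frame to Kafka. The actual producer in this repo is written in Go; the base64/JSON layout and the `pix` key are assumptions for illustration (they mirror the `msg.value.get("pix")` lookup used on the consumer side):

```python
import base64
import json

def encode_frame(frame_bytes, width, height):
    """Package raw (already resized) JPEG frame bytes into a JSON message body.

    Resizing the frame before this step keeps each message well under
    Kafka's default max message size, which is the limit that makes an
    unconfigured producer library throw on oversized frames.
    """
    payload = {
        "pix": base64.b64encode(frame_bytes).decode("ascii"),
        "width": width,
        "height": height,
    }
    return json.dumps(payload).encode("utf-8")

def decode_frame(message_value):
    """Reverse of encode_frame: JSON message bytes back to raw frame bytes."""
    payload = json.loads(message_value.decode("utf-8"))
    return base64.b64decode(payload["pix"])
```

Any producer/consumer pair that agrees on this layout can exchange frames, regardless of language, which is why a Go producer and Python consumers can share the same topic.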
In this app, we have two consumers. One consumer runs inside the container network, talking to the Kafka pipe on kafka:9093. The other is designed to run from the localhost, accessing the Kafka pipe via localhost:9092. In both cases, I built a Docker image and a local environment containing the same packages, mainly a popular Computer Vision framework (OpenCV) and popular ML/DL frameworks (TF, Keras, Theano, and Caffe). Obviously, you wouldn’t need to use all the DL frameworks, but it’s a nice example image to keep in your toolbox.
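The reason one address works inside the network and another works from the host is Kafka’s listener configuration. As a sketch, a broker supporting this split is typically configured along these lines in docker-compose (the image name and listener labels here are assumptions, not copied from the repo’s compose file):

```yaml
kafka:
  image: wurstmeister/kafka   # hypothetical; check the repo's docker-compose.yml
  ports:
    - "9092:9092"
  environment:
    # INSIDE is what containers on the Docker network use (kafka:9093);
    # OUTSIDE is what a process on the host machine uses (localhost:9092).
    KAFKA_LISTENERS: INSIDE://0.0.0.0:9093,OUTSIDE://0.0.0.0:9092
    KAFKA_ADVERTISED_LISTENERS: INSIDE://kafka:9093,OUTSIDE://localhost:9092
    KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT
    KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE
```

The advertised listeners matter most: a client connects to the bootstrap address, then follows whatever address the broker advertises back, so each side must be advertised an address it can actually reach.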
Replicating the demo:
Let’s get started by cloning the git repo:
$ git clone https://github.com/pborgesEdgeX/full_app.git
$ cd full_app/
$ docker-compose up
When you execute the docker-compose command, the producer, Kafka pipe, and consumer containers are created and begin to execute. You should see output similar to this:
At this point, you may want to run a consumer from your local host. In that case, use the bash script:
$ chmod u+x consumer-localhost.sh
$ ./consumer-localhost.sh
Accessing the endpoints:
Once the system is up and running, you can see the consumers running on a Flask-powered web server.
The consumer inside the Docker network can be accessed:
The consumer running from the localhost can be accessed:
Adding your cool ML Algorithms:
If you wish to modify the consumer, go ahead and cd into the consumer folder. In there, you can find the main.py file. Add your own custom ML/DL algorithms where you have direct access to the image pixels in the get_stream() function:
feed = msg.value.get("pix")
Or if you’d like to get in bytes:
b = bytes(feed, 'utf-8')
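One way to keep custom models separate from the Flask plumbing is to wrap them behind a small per-frame hook inside get_stream(). This is an illustrative sketch, not the repo’s code; the helper name and the detector-callback shape are assumptions:

```python
def run_detectors(feed, detectors):
    """Apply each detector callback to one frame payload.

    `feed` is the value returned by msg.value.get("pix"); it is normalized
    to bytes first, then each detector receives the frame bytes and returns
    whatever it likes (bounding boxes, class labels, embeddings, ...).
    """
    frame_bytes = bytes(feed, "utf-8") if isinstance(feed, str) else feed
    return [detect(frame_bytes) for detect in detectors]
```

With this shape, swapping in a face detector or classifier means appending one callable to the detectors list, with no changes to the consumer loop itself.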
List of Modules included:
This project serves as a lightweight demo of a video streaming pipeline. You shouldn’t use it in a production environment, but it’s a nice way to get started with Kafka, Flask, and the ML libs, all in Python. Happy coding!