Understanding Node Streams
This article focuses on understanding Streams in Node.js. Streams are one of the most misunderstood and underutilized concepts in Node. Node follows an asynchronous, event-driven model, and when we need to transfer a large amount of data through I/O operations, Streams are often the right tool for the job.
Why Exactly Do We Require Streams in Node.js?
Streams are a fundamental concept that brings a lot of performance and power to a Node application when transferring large amounts of data.
The unique capability of Streams is that instead of reading all the data into memory at once, we read and process a small chunk at a time. This way, we never have to store or process the entire file at once.
Streaming on YouTube and Netflix is a familiar example. These services do not download the entire video and audio to your system before playback starts. Instead, the application receives video and audio as a continuous flow of data chunks, making the content available to the viewer almost immediately. If these sites waited for the entire video and audio to download first, playback would take far too long to begin, and storing the whole stream of audio and video data would consume a lot of memory.
Let's consider the example of large files: some files may be several gigabytes in size (say 2 GB). With the traditional approach, we need to load the entire file into memory at once. Since the whole 2 GB file must be available before we can process it, it has to be held in memory, increasing the overall memory usage of the application.
Instead, using Streams in Node, we can load the file in chunks and process the data as it is received, eliminating the need to load the entire file into memory. Each chunk of data is received, processed, and then released, reducing memory utilization.
Advantages of Streams
The major advantages of working with Streams are:
- Memory Efficiency: we don't need to load a large amount of data into memory before we can process it, which reduces memory utilization.
- Time Efficiency: we don't have to wait for the entire data transfer to finish; the file arrives in chunks, and each chunk is processed individually.
Working with Readable Streams
Let's start with a Readable stream. We will walk through a simple scenario of reading a large file from the file system using streams. Given below is sample code that reads content from a large file and prints it to the console.
In the code above, we trigger a file read using a stream. Streams work on an event model: the file content is extracted in chunks (subsets of the file) that can be processed immediately.
The stream emits a "data" event whenever a chunk of the file becomes available, and we can operate on that chunk right away. We neither wait for the entire file to load nor hold the entire file in memory before working on it, saving both time and memory.
Working with Writable Streams
In the code above, we read data from a file in chunks. Next, we introduce Writable streams, where data can be added to a file in chunks. In the example below, we create a writable stream so that we can write data to the file in chunks rather than all at once.
In the code below, we create a writable stream to the file "sampleFile.txt"; this stream can then be used to write data to the file via its "write" function. Stream functions are easy to use and help achieve significant efficiency gains and a reduced memory footprint.
You can run this code yourself to evaluate how the stream executes: data can be produced periodically and written to the file chunk by chunk.
Copying File Content from One File to Another
The last piece of code in this article shows how easy it is to copy content from one file to another using file streams. Let's look at the code that helps us achieve this functionality using Streams.
In the code below, we create both a readable and a writable stream. The readable stream reads a file from the file system and copies its content to "copiedFile.html". It reads the file in chunks, and as soon as a chunk is read, that chunk is saved to the new file. Hence, at no moment do we need to hold the entire file in memory in order to copy it to another file.
Also, we have attached a handler for the "end" event on the read stream, which signifies that the stream has completely read the file. This event marks the completion of the read. You can modify the code and experiment with it yourself.
Node streams offer substantial benefits in terms of memory optimization and efficiency. Although the concept can seem a bit confusing at first, since we need to deal with events, it is well worth learning. Try using streams in your code next time.