Streams and Buffers in Node.js

Explained with examples

To handle and manipulate streaming data, such as a video or a large file, we need streams in Node. In Node.js, the built-in stream module provides the foundation that all streams are built on.

In this article, we will focus on the following concepts related to streams in Node.js.

Topics Covered in This Article

  1. Types of streams in Node.js.
  2. Buffers in streams.
  3. Streams and event emitters.
  4. A read stream.
  5. Flowing and non-flowing read streams.
  6. Buffer management by the read stream.

Types of Streams

In Node, there are four different types of streams:

  • Readable streams → To create a stream of data for reading (say, reading a large file in chunks).
  • Writable streams → To create a stream of data for writing (say, writing a large amount of data to a file).
  • Duplex streams → To create a stream that is both readable and writable at the same time. We can read and write to a duplex stream (say, a socket connection between a client and a server).
  • Transform streams → To create a stream that is both readable and writable, where the data can be modified as it is read from or written to the stream (say, compressing data before sending a request and decompressing it on the other end).
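As a quick illustration, here is a minimal sketch of where each type typically comes from. The file names, host, and port below are placeholders, not part of any particular project.

const fs = require('fs');
const net = require('net');
const zlib = require('zlib');

// Readable stream: read a large file in chunks
const readable = fs.createReadStream('./large-file.txt');

// Writable stream: write data to a file
const writable = fs.createWriteStream('./output.txt');

// Duplex stream: a TCP socket is both readable and writable
const socket = net.connect({ host: 'example.com', port: 80 });

// Transform stream: readable and writable, and the data is modified
// as it passes through (here, it gets gzip-compressed)
const gzip = zlib.createGzip();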

Buffers in Streams

Streams work on a concept called a buffer.

A buffer is temporary memory that a stream uses to hold some data until it is consumed.

In a stream, the buffer size is decided by the highWaterMark property of the stream instance, which is a number denoting the maximum size of the buffer in bytes.

By default, a stream's buffer works with strings and Buffer objects. We can also make the buffer work with arbitrary JavaScript objects. To do so, we set the objectMode option to true when creating the stream.
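As a rough sketch of both options, here is how a custom readable stream might be created with an explicit buffer size and with objectMode enabled (the values pushed below are made up for illustration):

const { Readable } = require('stream');

// A regular stream whose buffer holds up to 1024 bytes
const byteStream = new Readable({
  highWaterMark: 1024,
  read() {} // no-op; we push data into the stream manually
});

// With objectMode, the buffer holds arbitrary JavaScript objects and
// highWaterMark is counted in objects instead of bytes
const objectStream = new Readable({
  objectMode: true,
  highWaterMark: 4,
  read() {}
});

objectStream.push({ id: 1, name: 'first chunk' });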

If we try to push some data into the stream, the data is pushed into the stream buffer. The pushed data sits in the buffer until the data is consumed.

If the buffer is already full and we push more data into the stream, the push() call returns false, signalling that we should stop pushing until some of the buffered data has been consumed.
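To see this behaviour, here is a small sketch: we keep pushing 100-byte chunks into a readable stream whose buffer holds roughly 256 bytes and log what push() returns (the sizes are arbitrary):

const { Readable } = require('stream');

const stream = new Readable({
  highWaterMark: 256, // the buffer holds roughly 256 bytes
  read() {}
});

for (let i = 0; i < 5; i++) {
  const ok = stream.push(Buffer.alloc(100)); // push a 100-byte chunk
  console.log(`push #${i + 1} returned`, ok);
}
// Once the buffered data reaches the highWaterMark, push() starts
// returning false, signalling that we should stop pushing for now.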

Streams and EventEmitters

Streams extend EventEmitters

Node.js streams extend the EventEmitter class. We can listen to events like data and end in streams.

To listen to an event, we use the on() function available on the stream.
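For example, assuming a text file named sample.txt exists, listeners for the data and end events can be attached like this:

const fs = require('fs');

const stream = fs.createReadStream('./sample.txt');

// 'data' fires every time a chunk is available to be consumed
stream.on('data', (chunk) => {
  console.log('Received', chunk.length, 'bytes');
});

// 'end' fires when there is no more data to read
stream.on('end', () => {
  console.log('No more data');
});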

To read more about EventEmitters in Node.js, read the following article:

Read Streams in Node.js

A stream that is used to read streaming data is called a read stream. A read stream could be reading a file from a server or streaming an online video.

An analogy I found in many places is to think of a readable stream as a faucet, and it fits this stream perfectly. The faucet has water (the data) passing through it, and someone is consuming that water.

In this section, we will be creating a read stream from a large text file to see it in action.
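A minimal sketch of such a read stream, assuming a placeholder file named large-file.txt, could look like this:

const fs = require('fs');

// large-file.txt stands in for any sufficiently large text file
const stream = fs.createReadStream('./large-file.txt');

stream.on('open', () => {
  console.log('Stream opened...');
});

stream.on('data', (chunk) => {
  console.log('---------------------------------');
  console.log(chunk);
  console.log('---------------------------------');
});

stream.on('close', () => {
  console.log('Stream Closed...');
});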

When executed, this code gives output like the following:

Stream opened...
---------------------------------
<Buffer 4c 6f 72 65 6d 20 69 70 73 75 6d 20 64 6f 6c 6f 72 20 73 69 74 20 61 6d 65 74 2c 20 63 6f 6e 73 65 63 74 65 74 75 72 20 61 64 69 70 69 73 63 69 6e 67 ... >
---------------------------------
---------------------------------
<Buffer 74 20 6e 75 6e 63 20 76 69 74 61 65 20 66 65 72 6d 65 6e 74 75 6d 2e 20 49 6e 20 75 74 20 61 72 63 75 20 74 65 6d 70 6f 72 2c 20 66 61 75 63 69 62 75 ... >
---------------------------------
---------------------------------
<Buffer 20 76 69 74 61 65 2c 20 65 67 65 73 74 61 73 20 69 64 20 73 65 6d 2e 20 44 6f 6e 65 63 20 75 74 20 75 6c 74 72 69 63 69 65 73 20 6c 6f 72 65 6d 2c 20 ... >
---------------------------------
Stream Closed...

Each logged chunk is a Buffer, that is, the raw bytes of the file content that was sitting in the buffer memory of the stream.

Pause and resume a readable stream

You can also pause and resume a stream in Node.js by simply calling the pause() and resume() functions on the stream.

After pause() is called, the data event stops firing until we call resume() on the stream.
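A small sketch, again using the large-file.txt placeholder: the stream is paused after the first chunk and resumed one second later:

const fs = require('fs');

const stream = fs.createReadStream('./large-file.txt');

stream.on('data', (chunk) => {
  console.log('Got a chunk of', chunk.length, 'bytes');

  // Stop 'data' events for now
  stream.pause();
  console.log('Stream paused');

  // Resume after one second; 'data' events start firing again
  setTimeout(() => {
    console.log('Resuming stream');
    stream.resume();
  }, 1000);
});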

Flowing and Non-Flowing Streams

There are two types of readable streams:

  • Flowing stream — A stream that pushes data automatically; the chunks can be consumed directly by listening to the data event on the stream.
  • Non-flowing stream — A stream that does not push data automatically. Instead, the stream stores the data in its buffer and we need to explicitly call the read() method of the stream to consume it.

The earlier code was an example of a flowing stream: we were simply listening to the data event of the stream, and the listener got triggered automatically every time new data came out of the stream.

The following is a simple example of a non-flowing stream:
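A sketch of such a non-flowing read, again assuming the large-file.txt placeholder. (Whether data is already sitting in the buffer after 10 ms can depend on the Node version; waiting for the readable event before calling read() is the more reliable pattern.)

const fs = require('fs');

const stream = fs.createReadStream('./large-file.txt');

// No 'data' listener is attached, so the stream stays in non-flowing
// (paused) mode and keeps its data in the internal buffer.
setTimeout(() => {
  // Explicitly pull the first 10 bytes out of the stream's buffer
  const data = stream.read(10);
  console.log(data);
}, 10);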

When running the code, it will give the following output:

<Buffer 4c 6f 72 65 6d 20 69 70 73 75>

But how did it work?

It’s because we created a read stream using the createReadStream method from the fs module.

As soon as the stream is created, the file data starts streaming into the stream's internal buffer. Using the setTimeout method, we also allowed the stream some time to get data filled into its buffer.

After 10 milliseconds, the setTimeout callback gets executed and we read the first 10 bytes of the buffer using the read() method called with 10 (size in bytes) as an argument.

Buffer Management by the Read Stream

In the above code, if we call the read() function again after the console.log(data) and print the new data, we see that the data is different from the previous log:
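Continuing the same sketch, the timeout callback now calls read(10) twice in a row:

const fs = require('fs');

const stream = fs.createReadStream('./large-file.txt');

setTimeout(() => {
  const data = stream.read(10);
  console.log(data); // the first 10 bytes of the buffered data

  const newData = stream.read(10);
  console.log(newData); // the next 10 bytes; the previous ones are gone
}, 10);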

For the above code, we get the output as:

<Buffer 4c 6f 72 65 6d 20 69 70 73 75>
<Buffer 6d 20 64 6f 6c 6f 72 20 73 69>

The logged values are different because data is removed from the buffer once it has been read by the consumer.

Hence, the first call to read() returns the first 10 bytes of the buffered data, and the second call returns bytes 11 to 20 of the original data, which by then are the first 10 bytes remaining in the buffer.

Conclusion

That was all about read streams in Node.js.

To prevent this piece from getting really long and boring, I have written another blog explaining writable streams, stream piping, error handling for piped streams, and stream events and functions.

Loved the article? SUPPORT MY WRITING…

Patreon — https://www.patreon.com/kunaltandon
Paypal — https://www.paypal.com/paypalme2/kunaltandon94
