Stream and Buffer Concepts in Node.js

Anup Sarkar · Published in Tensult Blogs · Apr 16, 2018 · 2 min read

This Blog has moved from Medium to blogs.tensult.com. All the latest content will be available there. Subscribe to our newsletter to stay updated.

In the earlier blogs, we covered a few basic concepts of Node.js: what Node.js is, why we need it, what an NPM package is, and how to create servers and modules. In this part, we will explain Streams, an important part of Node.js.

What is a Stream?

Streams are objects that let you read data from a source or write data to a destination in a continuous fashion.

Problems with Large Data (without streams):

Speed: Too slow, because the entire payload has to be loaded into memory before it can be processed.

Buffer Limit: A single Buffer is capped in size (roughly 1 GB), so very large files cannot be held in memory at once.

Stream Benefits:

  1. Abstraction for continuous chunking of data.
  2. No need to wait for the entire resource to load.

Stream is Used In:

  1. HTTP requests & responses
  2. Standard input/output (stdin & stdout)
  3. File reads and writes

Types of Streams:

Readable Stream:

Readable Stream is used for read operations.

Standard input streams carry data into the application; the data is consumed through read operations. For process.stdin, the input typically comes from the keyboard of the terminal used to start the process.
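For example, process.stdin is itself a readable stream; a minimal sketch that echoes whatever is typed:

process.stdin.setEncoding('utf8');

process.stdin.on('data', function(chunk) {
  console.log('You typed: ' + chunk);
});

Reading from a file works the same way, using fs.createReadStream: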

const fs = require('fs');
let data = '';

// Create a readable stream
let readableStream = fs.createReadStream('input.txt');

// Set the encoding to be utf8.
readableStream.setEncoding('utf8');

// Handle stream events --> data, end,
readableStream.on('data', function(chunk) {
  data += chunk;
});

readableStream.on('end', function() {
  console.log(data);
});

Writable Stream:

Writable Stream is used for write operations.

Standard output streams carry data out of the application. To write to stdout, we use the write function.

process.stdout.write('A Simple Message \n');
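Writing to a file works the same way; a minimal sketch with fs.createWriteStream (output.txt is just a placeholder file name):

const fs = require('fs');

// Create a writable stream
let writableStream = fs.createWriteStream('output.txt');

// Write data chunk by chunk
writableStream.write('First line of data\n');
writableStream.write('Second line of data\n');

// Signal that no more data will be written
writableStream.end();

writableStream.on('finish', function() {
  console.log('All data has been written to output.txt');
});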

Duplex Stream:

This is a Stream that can be used for both read and write operations.
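For example, a TCP socket from the built-in net module is a duplex stream: we write to and read from the same object. A minimal sketch (example.com and port 80 are just placeholders):

const net = require('net');

// A TCP socket is both readable and writable
const socket = net.connect(80, 'example.com', function() {
  // Write to the stream
  socket.write('GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n');
});

// Read from the same stream
socket.on('data', function(chunk) {
  console.log(chunk.toString());
});

socket.on('end', function() {
  console.log('Connection closed');
});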

Transform Stream:

A type of duplex stream where the output is computed based on input.
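A minimal sketch of a custom transform stream that upper-cases whatever flows through it:

const { Transform } = require('stream');

// The output chunk is computed from the input chunk
const upperCaseStream = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

// Everything typed on stdin comes back upper-cased on stdout
process.stdin.pipe(upperCaseStream).pipe(process.stdout);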

What is Piping in a Stream?

Piping is a process in which we provide the output of one stream as the input to another stream, so data flows from the source stream through to the destination stream. Pipes can also be chained, so there is no limit on the number of piping operations.

const fs = require('fs');

// Create a readable stream
let readableStream = fs.createReadStream('input.txt');

// Create a writable stream
let writeableStream = fs.createWriteStream('output.txt');

// Pipe the read and write operations
// read input.txt and write data to output.txt
readableStream.pipe(writeableStream);

// Note: pipe() is asynchronous, so this line runs before the copy completes
console.log('End of the Process');
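Since pipe() returns the destination stream, pipes can be chained. A sketch that compresses input.txt using the built-in zlib module (the file names are just placeholders):

const fs = require('fs');
const zlib = require('zlib');

// Chain: read input.txt --> gzip --> write input.txt.gz
fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('input.txt.gz'));

console.log('File compression started');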

Buffer in Node.js:

Node.js provides the Buffer class, whose instances store raw binary data.

We can create a Buffer in the following way…

// create a zero-filled Buffer of 10 octets
// (the old new Buffer() constructor is deprecated; use Buffer.alloc/Buffer.from)
let bufferOne = Buffer.alloc(10);
// create a Buffer from a given array
let bufferTwo = Buffer.from([10, 20, 30, 40, 50]);
// create a Buffer from a given string
let bufferThree = Buffer.from('Simply Easy Learning');

Working with Buffers:

let buffer = Buffer.alloc(26);
for (let i = 0; i < 26; i++) {
  buffer[i] = i + 97; // 97 is the character code for 'a'
}
console.log(buffer.toString('utf8')); // abcdefghijklmnopqrstuvwxyz
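Buffers also support common operations such as concatenation, slicing, and writing into an existing buffer; a small sketch of a few of them:

// Concatenate two Buffers into one
let bufA = Buffer.from('Hello');
let bufB = Buffer.from(' World');
let joined = Buffer.concat([bufA, bufB]);
console.log(joined.toString()); // Hello World

// Length in bytes and a slice of the Buffer
console.log(joined.length); // 11
console.log(joined.slice(0, 5).toString()); // Hello

// Write a string into an existing Buffer
let target = Buffer.alloc(5);
target.write('abcde');
console.log(target.toString()); // abcde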

In the next blog I will cover the Express Framework and Advanced routing concepts.
