
What is a stream and its types in Node.js?

Last Updated : 04 Apr, 2023

A stream is a way of handling data that lets you read from a source or write to a destination sequentially (files, network communication, or any kind of end-to-end information exchange), processing the data piece by piece rather than all at once. Streams are not a concept unique to Node.js; they have been part of Unix for a long time, where the pipe operator lets programs interact by passing streams between them. In Node.js, the stream module serves as the basis for all streaming APIs.

Example: when you stream YouTube, Netflix, or Spotify, the whole content is not downloaded at once; it arrives in small chunks while you keep watching. Similarly, when chatting on Facebook or WhatsApp, data flows continuously between two people. Instead of reading all the data into memory at once, a stream processes it in smaller pieces, which makes large files easy to handle, even files larger than the free memory available on your device.

Advantages of streams:

  • Memory efficiency: streams are memory-efficient because they let you process a file in small chunks instead of loading the whole thing into memory first, saving space.
  • Time efficiency: streams are time-efficient because processing begins as soon as the first chunk arrives, instead of waiting for the entire data to download. Starting early saves a lot of time overall.
  • Composability: because streams can be piped into one another, the output of one stream becomes the input of the next, letting you build complex processing pipelines out of small, simple pieces.

Types of streams:

  • Readable stream: a stream you can receive and read data from, in order, but cannot write to. For example, fs.createReadStream() lets us read the contents of a file.
  • Writable stream: a stream you can send data to, in order, but cannot read back from. For example, fs.createWriteStream() lets us write data to a file.
  • Duplex stream: a stream that is both readable and writable, so you can send and receive data on it. For example, net.Socket is a TCP socket.
  • Transform stream: a duplex stream that modifies or transforms the data as it passes through. For example, zlib.createGzip() compresses data using gzip.

Different operations in a stream are:

  • Reading from a stream: create a file named input.txt with the following text: This is a code to learn about the reading from a stream. Filename: main.js 

javascript
const fs = require("fs");
let data = '';
 
// Create a readable stream
const readerStream = fs.createReadStream("input.txt");
// Set the encoding to be utf8.
readerStream.setEncoding("utf8");
 
// Handling data stream event
readerStream.on("data", function(chunk) {
    data += chunk;
});
 
// Handling end stream event
readerStream.on("end",function() {
    console.log(data);
});
 
// Handling error stream event
readerStream.on("error", function(err) {
    console.log(err.stack);
});


Run the main.js file with the following command:

$ node main.js

The output of the above command is shown below:

This is a code to learn about the reading from a stream.

  • Writing to a stream: Filename: main.js 

javascript
const fs = require('fs');
let data = 'This is a code to learn'
            + ' about writing in a stream.';
 
// Create a writable stream
let writerStream = fs.createWriteStream('output.txt');
 
// Write the data to stream with
// encoding to be utf8
writerStream.write(data, 'utf8');
 
// Mark the end of file
writerStream.end();
 
// Handling finish stream event
writerStream.on('finish', function () {
    // All data has been flushed to the file at this point
});
 
// Handling error stream event
writerStream.on('error', function (err) {
    console.log(err.stack);
});


Run the main.js file with the following command:

$ node main.js

After executing the above command, a file named output.txt will be created in the current directory with the following text:

This is a code to learn about writing in a stream.
  • Piping the stream: piping is a mechanism where the output of one stream of data (the readable source) is provided as the input to another stream (the writable destination), without you having to manage the flow yourself. It is the easiest way to consume streams, and there is no limit on the number of piping operations, so streamed data can be processed in multiple stages. For example, reading from one file and writing to another. Create a file named input.txt with the following text:
This is a code to learn about piping the stream.

Filename: main.js 

javascript
const fs = require('fs');
 
// Create a readable stream
const readerStream = fs.createReadStream('input.txt');
 
// Create a writable stream
const writerStream = fs.createWriteStream('output.txt');
 
// Pipe the read and write operations
// read input.txt and write data to output.txt
readerStream.pipe(writerStream);


Run the main.js file with the following command:

$ node main.js

After executing the above command, a file named output.txt will be created in the current directory with the following text:

This is a code to learn about piping the stream.
  • Chaining the stream: chaining creates a chain of multiple stream operations by connecting the output of one stream to the input of the next. It is normally used with piping. For example, we will use piping and chaining to first compress a file and then decompress it. 

Filename: main.js 

javascript
const fs = require('fs');
const zlib = require('zlib');
 
// Compress the file input.txt to input.txt.gz
fs.createReadStream('input.txt')
    .pipe(zlib.createGzip())
    .pipe(fs.createWriteStream('input.txt.gz'))
    .on('finish', function () {
        // Log only once the compressed file is fully written
        console.log('File Compressed.');
    });


Run the main.js file with the following command:

$ node main.js

The output of the above command is shown below:

File Compressed.

You will find that input.txt has been compressed into a new file, input.txt.gz, in the current directory. The code to decompress that file is shown below: Filename: main.js 

javascript
const fs = require('fs');
const zlib = require('zlib');
 
// Decompress the file input.txt.gz to input.txt
fs.createReadStream('input.txt.gz')
    .pipe(zlib.createGunzip())
    .pipe(fs.createWriteStream('input.txt'))
    .on('finish', function () {
        // Log only once the decompressed file is fully written
        console.log('File Decompressed.');
    });


Run the main.js file with the following command:

$ node main.js

The output of the above command is shown below:

File Decompressed.

You will find that input.txt.gz has been decompressed.


