
What is Piping in Node.js?

Last Updated : 03 Feb, 2023

Piping in NodeJS is the process by which byte data from one stream is sent to another stream.

What are streams?

A stream in NodeJS is a sequence of data made available over time. Traditionally, when a program processes some data, it reads all of the data into memory, processes it, and then writes the result back out to some location. This approach fails when there is a large amount of data to be processed. To overcome this problem, streaming algorithms are used: the data is not read all at once, instead it 'flows' into the program, is processed, and 'flows' out of the program. The data is handled in small chunks, and the whole data set is never held in memory at once. A data stream can be compared with water flowing in a river, where a large amount of water is transported over time by flowing through the river. Streaming algorithms are more memory efficient and often faster. You can read this article to get a detailed guide on streams.
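
As a quick illustration, here is a minimal sketch of reading a file as a stream (it assumes a file named input.txt exists in the same folder). The data event fires once per chunk, so the whole file is never loaded into memory at once:

const fs = require("fs");

// Create a readable stream; the file is delivered in chunks
// rather than being loaded into memory all at once.
const readStream = fs.createReadStream("./input.txt");

readStream.on("data", (chunk) => {
    console.log(`Received a chunk of ${chunk.length} bytes`);
});

readStream.on("end", () => {
    console.log("No more data to read");
});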

What is piping?

Piping in NodeJS is useful when we need to write data coming from a source stream to another stream. In such a case, instead of writing our own reading and writing logic, we can use the piping facility provided by NodeJS and simply pipe the read stream into the write stream. This can be compared with joining two water bodies with a pipe to transfer water.

Let's take an example of piping: suppose you want to write a server that serves some files. Without piping, you would need to write code to handle the reading operation on the files and code to write the data to a network socket. With the piping facility provided by NodeJS, you can simply connect the file stream to the network socket stream to send the data. NodeJS provides a pipe() function for this: it is called on a readable stream and accepts a writable stream as its argument.

The pipe() function can be used to connect an input stream to an output stream, or it can be chained across a combination of streams to create a more complex pipeline that performs a certain task.
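
As a rough sketch of the file-server idea mentioned above (the port number and file name are only placeholders for illustration), the file read stream can be piped directly into the HTTP response, which is itself a writable stream:

const http = require("http");
const fs = require("fs");

http.createServer((req, res) => {
    // Pipe the file stream into the response stream; chunks are
    // sent to the client as they are read from disk.
    const fileStream = fs.createReadStream("./input.txt");

    fileStream.on("error", () => {
        res.statusCode = 500;
        res.end("Could not read the file");
    });

    fileStream.pipe(res);
}).listen(8080);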

Make sure you have NodeJS installed on your computer. If not, download it from here and install it. All the modules used in this project are included by default in NodeJS.

Approach 1: Connecting a read stream to a write stream using pipe()

Project Structure: Our project structure would be:

  • index.js: the main file containing the code.
  • input.txt: file from which data is read
  • output.txt: file to which data is written

 

The following modules are required in this approach:

  • fs: the file system module, used for creating and interacting with file streams

Step 1: Create a file named input.txt and paste the following text into it. Create a file named output.txt and keep it empty.

GeeksForGeeks is awesome.

Step 2: Create a file named index.js and paste the following code into it.

Javascript




const fs = require("fs");
 
let rs = fs.createReadStream("./input.txt");
let ws = fs.createWriteStream("./output.txt");
 
// callback() logs the error if one occurred,
// otherwise reports that piping has finished
function callback(err) {
    console.log(err || "Piping finished successfully");
}
 
// pipeReadToWrite() accepts two streams, a
// readStream and a writeStream, and a callback function.
function pipeReadToWrite(readStream, writeStream, callback) {
    // handles any error occurred in the stream
    function handleError(err) {
        // close the streams and call callback
        readStream.close();
        writeStream.close();
        callback(err);
    }
 
    readStream
        .on("error", handleError)  
        .pipe(writeStream)
        .on("error", handleError)
        .on("finish", callback);
}
 
pipeReadToWrite(rs, ws, callback);


Code explanation:

  • Include the fs (file system) module, which is used to create and interact with file streams. Create a read stream connected to input.txt and a write stream connected to output.txt.
  • pipeReadToWrite() accepts a read stream, a write stream, and a callback function, and pipes the read stream into the write stream.
  • handleError() is a callback function that handles any error that occurs while reading or writing the stream.
  • Execute the pipeReadToWrite() function with the required arguments.

Step 3: Execute the index.js file by running the following command in the command line:

node index.js

Output: Open output.txt and you will see the following output displayed in it.

GeeksForGeeks is awesome.
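
As a side note, recent versions of Node.js (10 and later) also provide a stream.pipeline() helper that connects streams and forwards errors from every stream in the chain to a single callback, so the manual error handling above can be written more compactly. A minimal sketch of the same copy operation using it:

const fs = require("fs");
const { pipeline } = require("stream");

// pipeline() wires the streams together and reports
// an error from any of them through one callback.
pipeline(
    fs.createReadStream("./input.txt"),
    fs.createWriteStream("./output.txt"),
    (err) => {
        console.log(err || "Pipeline finished successfully");
    }
);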

Approach 2: Connecting more than two streams to create a complex workflow.

Project Structure: Our project structure would be:

  • index.js: the main file containing the code.
  • input.txt: the file that is to be compressed.

 

In this example, we connect multiple streams to compress a file.

The following modules are required in this approach.

  • fs: file stream module used for creating and interacting with file streams.
  • zlib: provides various methods to compress files.

Step 1: Create a file named input.txt and paste the following text into it.

GeeksForGeeks is awesome.

Step 2: Create a file named index.js and paste the following code into it.

Javascript




const fs = require("fs");
const zlib = require("zlib");
 
// gzip() function accepts a filename
// to be compressed and a callback function
function gzip(filename, callback) {
    // Create the streams
    let source = fs.createReadStream(filename);
    let destination = fs.createWriteStream(filename + ".gz");
    let gzipper = zlib.createGzip();
     
    // Set up the pipeline
    source
        .on("error", callback)
        .pipe(gzipper)
        .pipe(destination)
        .on("error", callback)
        .on("finish", callback);
}
 
gzip("./input.txt", (msg) => {
    console.log(msg);
});


Code explanation:

  • Include all the required modules.
  • The gzip() function accepts a path to a file and a callback function. It compresses the file specified in the path and executes the callback function after the compression finishes (or if an error occurs).
  • Execute the gzip() function with the path of the file to be compressed and a callback.

Step 3: Execute the index.js file. Go to the command line and execute the command given below:

node index.js

Output: You will see a new file named input.txt.gz is created. This is the compressed version of input.txt.

input.txt.gz file is created
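
To verify the result, you can pipe the compressed file back through zlib.createGunzip() and into a plain text file. The following is just an illustrative sketch (the restored.txt file name is made up for this example):

const fs = require("fs");
const zlib = require("zlib");

// Decompress input.txt.gz back into a readable text file.
fs.createReadStream("./input.txt.gz")
    .pipe(zlib.createGunzip())
    .pipe(fs.createWriteStream("./restored.txt"))
    .on("finish", () => console.log("Decompression complete"));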


