
How to do Piping/Streaming of JavaScript objects in Node.js?

Last Updated : 03 Apr, 2023

This article gives a brief overview of the piping/stream implementation in Node.js. It explains how buffers and streams work, and how to implement piping in Node.js.

Buffers: A buffer is a temporary storage spot for a chunk of data being transferred from one place to another. The buffer is filled with data and then passed along, so only a small chunk of data moves at a time. Buffers are useful because we don’t have to wait for the complete transfer: as soon as a buffer fills up, its contents can be passed on for processing.

Streams: A stream is a data-handling method used to read input or write output in a proper sequence while the data is being transferred. Streams improve performance because the data is processed before the complete transfer finishes: small chunks are collected in a buffer, and when the buffer is full, the data is passed down the stream to be processed and sent to the client. Buffers and streams work together; for example, while playing an online video you don’t wait for the complete video to be processed, it is transferred in small sections, bit by bit.

Readable Stream: A readable stream is used to read data from a source; that data can then be transferred to another destination or stored in our system. A readable stream emits 'data' events whose payload is a chunk (received from a buffer), a small part of the data to be processed.

Example: Create a read.txt file in the current project directory with some sample text, in this case, we have the following text.

THIS IS READ.TXT FILE

Filename: index.js

Javascript
// Requiring file system module
const fs = require('fs');
 
// Creating a read stream
const myReadStream = fs.createReadStream('read.txt', 'utf8');
 
myReadStream.on('data', function (chunk) {
    console.log("new chunk received");
 
    // Printing the chunk
    console.log(chunk);
});


Output:

new chunk received 
THIS IS READ.TXT FILE

Writable Streams: A writable stream lets you send data to a destination, which can be a file or another stream. Writable streams also emit events at regular intervals to ensure proper functioning.

Example: Create an index.js file with the following code.

Filename: index.js

Javascript
// Requiring file system module
const fs = require('fs');
 
// Creating a write stream
// It will create a txt file in provided location
const myWriteStream =
    fs.createWriteStream(__dirname + '/write.txt');
 
// Writing on stream
myWriteStream.write("Greetings from GeeksforGeeks");
 
// Closing the stream flushes any remaining buffered data
myWriteStream.end();


Output: A new file named write.txt will be generated in the current directory which will contain the following data.

Greetings from GeeksforGeeks

Piping using pipeline():

Piping is a method of transferring data from one stream directly to another with the help of buffers, so that the transfer can start as soon as a buffer fills up. In other words, piping is used to process streamed data in several steps. pipeline() is a module method that pipes between streams, forwards errors, properly cleans up, and provides a callback (or promise) when the pipeline is complete.

Example: Create a read.txt file in the current project directory with some sample text, in this case, we have the following text.

THIS IS READ.TXT FILE

Filename: index.js

Javascript
const fs = require('fs');
const zlib = require('zlib');
const { pipeline } = require('stream');
 
// Constructing promisify
const { promisify } = require('util');
const pipelineAsync = promisify(pipeline);
 
// Creating a read stream
const myReadStream = fs.createReadStream(
    __dirname + '/read.txt', 'utf8');
 
// Creating a write stream
const myWriteStream = fs.createWriteStream(
    __dirname + '/write.txt');
 
// Creating transform stream
const transform = zlib.createGzip();
 
(async function run() {
    try {
 
        // Pipelining three streams
        await pipelineAsync(
 
            // Reading from read stream
            myReadStream,
 
            // Transforming the file
            transform,
 
            // Writing on write stream
            myWriteStream
        );
        console.log("Pipeline accomplished");
    }
    catch (err) {
        console.error('pipeline failed with error:', err);
    }
})();


Output: We will see the following output and the write.txt file will be generated in the current directory.

Pipeline accomplished

Note: pipeline() should be preferred over pipe(), since pipe() does not forward errors or clean up the destination stream on failure, but an implementation using pipe() is given below:

Filename: index.js

Javascript
// Requiring file system module
const fs = require('fs');
 
// Creating a read stream
const myReadStream = fs.createReadStream(
    __dirname + '/read.txt', 'utf8');
 
// Creating a write stream
const myWriteStream = fs.createWriteStream(
    __dirname + '/write.txt');
 
// Piping using pipe
myReadStream.pipe(myWriteStream);
 
// pipe() returns immediately; 'finish' fires when writing is done
myWriteStream.on('finish', function () {
    console.log("Piping accomplished");
});


Output: We will see the following output and the write.txt file will be generated in the current directory.

Piping accomplished

