
How to handle streaming data in Node?

Last Updated: 05 Feb, 2024

Streaming data in NodeJS involves processing data in chunks as it becomes available, rather than waiting for the entire dataset to be loaded into memory. This approach is particularly useful for handling large files, network data, or real-time data sources. NodeJS provides a powerful set of streaming APIs that allow developers to efficiently handle streaming data. Here’s how you can handle streaming data in NodeJS:

Reading from Streams:

NodeJS provides various readable streams, such as fs.createReadStream() for reading data from files, http.IncomingMessage for reading data from HTTP requests, and process.stdin for reading data from standard input.

const fs = require('fs');
const readStream = fs.createReadStream('input.txt');

readStream.on('data', (chunk) => {
    console.log('Received data chunk:', chunk.toString());
});

readStream.on('end', () => {
    console.log('Finished reading data.');
});
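
Readable streams also emit an 'error' event (for example, when the file does not exist), so in practice you should attach an error handler alongside the 'data' and 'end' listeners. A minimal sketch, reusing the readStream from above:

readStream.on('error', (err) => {
    console.error('Error while reading:', err.message);
});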

Writing to Streams:

Writable streams in NodeJS, such as fs.createWriteStream() for writing data to files, http.ServerResponse for writing data to HTTP responses, and process.stdout for writing data to standard output, allow you to write data chunk by chunk.

const fs = require('fs');
const writeStream = fs.createWriteStream('output.txt');

writeStream.write('Hello, ');
writeStream.write('World!');
writeStream.end(); // Close the stream
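
write() also reports backpressure: it returns false once the stream's internal buffer is full, and the 'drain' event signals when it is safe to write again. A minimal sketch, assuming a hypothetical writeChunk() helper wrapping the same writeStream:

function writeChunk(chunk) {
    // write() returns false when the internal buffer is full
    if (!writeStream.write(chunk)) {
        // Wait for the buffer to drain before writing more data
        writeStream.once('drain', () => {
            console.log('Buffer drained, safe to write again.');
        });
    }
}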

Transforming Streams:

Transform streams enable you to modify or transform data as it passes through the stream. You can create custom transform streams by extending the Transform class from the stream module and implementing the _transform() method.

const { Transform } = require('stream');

class UpperCaseTransform extends Transform {
    _transform(chunk, encoding, callback) {
        this.push(chunk.toString().toUpperCase());
        callback();
    }
}

const upperCaseTransform = new UpperCaseTransform();
process.stdin.pipe(upperCaseTransform).pipe(process.stdout);
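
If a full class feels heavyweight, the Transform constructor also accepts a transform() function directly in its options object. A minimal sketch that performs the same upper-casing:

const { Transform } = require('stream');

const upperCase = new Transform({
    transform(chunk, encoding, callback) {
        // Pass the transformed chunk downstream via the callback
        callback(null, chunk.toString().toUpperCase());
    }
});

process.stdin.pipe(upperCase).pipe(process.stdout);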

Piping Streams:

Piping streams together is a common pattern in NodeJS, allowing you to connect the output of one stream to the input of another stream, creating a data pipeline.

const fs = require('fs');
const readStream = fs.createReadStream('input.txt');
const writeStream = fs.createWriteStream('output.txt');

readStream.pipe(writeStream);
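
Note that pipe() on its own does not forward errors between streams, so for anything beyond quick scripts the pipeline() helper from the stream module is usually preferred: it wires the streams together, propagates errors, and cleans up when the pipeline finishes. A minimal sketch of the same copy operation:

const { pipeline } = require('stream');
const fs = require('fs');

pipeline(
    fs.createReadStream('input.txt'),
    fs.createWriteStream('output.txt'),
    (err) => {
        if (err) {
            console.error('Pipeline failed:', err);
        } else {
            console.log('Pipeline succeeded.');
        }
    }
);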

Conclusion:

NodeJS provides a versatile set of streaming APIs that enable efficient handling of streaming data. By using readable, writable, transform, and piping streams, developers can process large datasets, handle network communication, and perform real-time data processing tasks with ease. Leveraging streaming data processing capabilities is essential for building scalable, performant, and memory-efficient applications in NodeJS.

