Node.js Streams: Processing Large Files Efficiently

Streams let you process large files chunk by chunk without loading everything into memory, which makes them essential for scalable Node.js applications.

Reading a File Stream

const fs = require('fs');

const readStream = fs.createReadStream('large-file.txt');

readStream.on('data', chunk => {
  // Each chunk is a Buffer (64 KiB by default for fs read streams)
  console.log(`Received ${chunk.length} bytes`);
});

readStream.on('end', () => console.log('Done'));
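A read stream also emits an 'error' event, for example when the file does not exist. Attaching a handler keeps the error from crashing the process; a minimal sketch:

readStream.on('error', err => {
  // Without this handler, a missing file would surface as an uncaught exception
  console.error(`Read failed: ${err.message}`);
});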

Piping Streams

const zlib = require('zlib');

fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('output.txt.gz'));
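One caveat: .pipe() does not propagate errors between streams, so a failure in the middle of a chain can go unnoticed. Node's built-in stream.pipeline wires up error handling and cleanup for you; here is a sketch of the same gzip step using it (file names are placeholders):

const { pipeline } = require('stream');

pipeline(
  fs.createReadStream('input.txt'),
  zlib.createGzip(),
  fs.createWriteStream('output.txt.gz'),
  err => {
    // The callback fires once, with the first error or null on success
    if (err) {
      console.error('Pipeline failed:', err);
    } else {
      console.log('Pipeline succeeded');
    }
  }
);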

Transform Stream

const { Transform } = require('stream');

const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // Convert the incoming chunk to a string, upper-case it, and push it downstream
    this.push(chunk.toString().toUpperCase());
    callback(); // signal that this chunk has been processed
  }
});

process.stdin.pipe(upperCase).pipe(process.stdout);
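The same transform works anywhere in a pipe chain, not just between stdin and stdout. For example, upper-casing a file on disk (file names are illustrative):

fs.createReadStream('input.txt')
  .pipe(upperCase)
  .pipe(fs.createWriteStream('input-upper.txt'));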
