Imagine you're building a file-upload API that must process multi-GB files on the fly without exhausting RAM. Rather than buffering the whole file, you'd pipe the incoming data stream through a processing stream and write the result to disk or a database, as sketched below.
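A minimal sketch of that pipeline, assuming a hypothetical handleUpload(req, res) handler, a placeholder processChunks() transform, and an illustrative '/tmp/upload.bin' destination; pipeline() is Node's built-in helper that wires streams together and forwards errors:

const fs = require('fs');
const { pipeline, Transform } = require('stream');

// Placeholder processing step; a real handler might hash, validate, or compress chunks.
const processChunks = () =>
  new Transform({
    transform(chunk, encoding, callback) {
      callback(null, chunk); // pass data through unchanged in this sketch
    },
  });

// Hypothetical HTTP handler: req is the incoming request stream, res the response.
function handleUpload(req, res) {
  pipeline(
    req,                                     // readable: the incoming multi-GB upload
    processChunks(),                         // transform: process data on the fly
    fs.createWriteStream('/tmp/upload.bin'), // writable: persist to disk (path is illustrative)
    (err) => {
      if (err) {
        res.statusCode = 500;
        res.end('Upload failed');
      } else {
        res.end('Upload complete');
      }
    }
  );
}

Unlike plain pipe(), pipeline() destroys every stream in the chain when one of them errors, so a failed upload doesn't leak file handles.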
So should you use pipe(), or manually handle the 'data' and 'end' events? Consider what happens here:
const fs = require('fs');
const stream = fs.createReadStream('hugeFile.txt');
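// Attaching a 'data' listener switches the stream into flowing mode:
// chunks are pushed as fast as the file can be read.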
stream.on('data', (chunk) => {
console.log('Chunk:', chunk.length);
});
stream.on('end', () => {
console.log('File read complete');
});
Follow-up: Now imagine console.log is slow (e.g., it logs to a remote server). What problem might occur?
Answer: Memory can build up, because backpressure isn't handled. Nothing tells the readable stream to slow down, so it keeps emitting chunks at disk speed while the slow console.log becomes the bottleneck and unprocessed data piles up in memory.
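One way to fix it, sketched with a hypothetical slowLogger Writable standing in for the remote logger: pipe() pauses the readable stream whenever the writable's internal buffer is full and resumes it on 'drain', so backpressure is respected.

const fs = require('fs');
const { Writable } = require('stream');

// Hypothetical slow sink simulating a remote logging call.
const slowLogger = new Writable({
  write(chunk, encoding, callback) {
    setTimeout(() => {
      console.log('Chunk:', chunk.length);
      callback(); // signalling completion lets the readable side resume
    }, 100);
  },
});

// pipe() handles pause/resume for us, so chunks no longer pile up in memory.
fs.createReadStream('hugeFile.txt').pipe(slowLogger);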
Write a custom transform stream that converts all input to uppercase.
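A minimal sketch, assuming UTF-8 text input (the name upperCase is just illustrative):

const { Transform } = require('stream');

// Transform stream that upper-cases every chunk flowing through it.
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    // chunk arrives as a Buffer by default; convert, transform, and push it on
    callback(null, chunk.toString().toUpperCase());
  },
});

// Example usage: pipe stdin through the transform to stdout.
process.stdin.pipe(upperCase).pipe(process.stdout);

Note that calling toString() on raw Buffers assumes chunks never split a multi-byte character; for arbitrary UTF-8 input you would decode with the string_decoder module first.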