javascript / intermediate
Snippet
Memory-Efficient File Streams with pipeline
The pipeline function in Node.js provides a clean way to chain multiple streams together and automatically handles cleanup and error propagation. Unlike .pipe(), it returns a promise that resolves when the pipeline is complete, making it ideal for async/await workflows.
snippet.js
const { pipeline } = require('node:stream/promises');
const { createReadStream, createWriteStream } = require('node:fs');
const { createGzip } = require('node:zlib');

async function compressFile(input, output) {
  try {
    await pipeline(
      createReadStream(input),
      createGzip(),
      createWriteStream(output)
    );
    console.log('Compression successful');
  } catch (err) {
    console.error('Pipeline failed', err);
  }
}
nodejs
Breakdown
1
const { pipeline } = require('node:stream/promises');
Imports the promise-based version of the pipeline utility.
2
await pipeline(...)
Combines a read stream, a transformation (Gzip), and a write stream into one process.