javascript / intermediate
Snippet
Safe Stream Processing with pipeline
In Node.js, manually chaining streams with .pipe() can leak memory and file descriptors, because .pipe() does not forward errors: a failure in one segment leaves the other streams open unless every segment has its own error handler. The pipeline utility from the stream module wires the chain together and automatically destroys every stream in it if any one fails or closes prematurely.
snippet.js
const { pipeline } = require('stream');
const fs = require('fs');
const zlib = require('zlib');

pipeline(
  fs.createReadStream('archive.tar'),
  zlib.createGzip(),
  fs.createWriteStream('archive.tar.gz'),
  (err) => {
    if (err) {
      console.error('Pipeline failed:', err);
    } else {
      console.log('Pipeline completed successfully');
    }
  }
);
nodejs
Breakdown
1
pipeline(
Starts a managed stream chain that handles cleanup and error propagation.
2
(err) => {
A node-style completion callback, invoked exactly once: with an Error if any stream in the chain failed, or with err set to null once the entire pipeline has finished.