Advanced Node.js: Understanding Streams and Buffers
Deep dive into Node.js streams and buffers, understanding how to handle data efficiently in your applications.
Obinna Aguwa

Streams and buffers are fundamental concepts in Node.js that let you read and write data efficiently, without loading everything into memory at once. Let's explore these concepts in detail.
## Understanding Buffers
Buffers represent fixed-size chunks of raw binary data, which Node.js uses whenever it deals with bytes rather than strings:
<pre><code class="language-javascript">
// Creating a buffer
const buf = Buffer.from('Hello World', 'utf8');
console.log(buf.toString()); // Hello World
// Allocating a zero-filled buffer and writing into it
const buf1 = Buffer.alloc(10);
buf1.write('Hello');
console.log(buf1.toString('utf8', 0, 5)); // Hello
</code></pre>
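Buffers also make it easy to view the same bytes in different text encodings. For example:

<pre><code class="language-javascript">
// The same bytes rendered in different encodings
const data = Buffer.from('Hello World', 'utf8');
console.log(data.toString('hex'));    // 48656c6c6f20576f726c64
console.log(data.toString('base64')); // SGVsbG8gV29ybGQ=

// Buffers are fixed-size, so joining them produces a new buffer
const joined = Buffer.concat([Buffer.from('Hello, '), Buffer.from('Node')]);
console.log(joined.toString()); // Hello, Node
</code></pre>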
## Working with Streams
Streams let you process data chunk by chunk instead of reading an entire file into memory at once:
<pre><code class="language-javascript">
const fs = require('fs');

const readStream = fs.createReadStream('input.txt');
const writeStream = fs.createWriteStream('output.txt');

// Copy input.txt to output.txt chunk by chunk
readStream.pipe(writeStream);

// Observe each chunk as it flows through
readStream.on('data', (chunk) => {
  console.log('Received chunk:', chunk.length, 'bytes');
});
readStream.on('error', (err) => console.error('Read failed:', err));
</code></pre>
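One caveat: `.pipe()` does not forward errors between streams. As a sketch (assuming the same `input.txt` and `output.txt` file names as above), `stream.pipeline` is the safer way to connect streams, since it destroys every stream in the chain and reports the first error:

<pre><code class="language-javascript">
const fs = require('fs');
const { pipeline } = require('stream');

// pipeline cleans up all streams and surfaces the first error,
// which a bare .pipe() chain does not do
pipeline(
  fs.createReadStream('input.txt'),   // assumes input.txt exists
  fs.createWriteStream('output.txt'),
  (err) => {
    if (err) {
      console.error('Copy failed:', err);
    } else {
      console.log('Copy finished');
    }
  }
);
</code></pre>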
## Transform Streams
Transform streams modify data as it's being transferred:
<pre><code class="language-javascript">
const { Transform } = require('stream');

const upperCaseTransform = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});
</code></pre>
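A quick way to see a transform stream in action is to write into it directly and listen for the transformed output (the example below redefines the same uppercase transform so it runs on its own):

<pre><code class="language-javascript">
const { Transform } = require('stream');

// Same uppercase transform as above
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

// Write into the stream and listen for the transformed output
upperCase.on('data', (chunk) => {
  console.log(chunk.toString()); // HELLO WORLD
});
upperCase.write('hello world');
upperCase.end();
</code></pre>

In a real application you would place the transform between a source and a destination, e.g. `readStream.pipe(upperCase).pipe(writeStream)`.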
#nodejs
#javascript
#backend
#programming