- What is Node.js?
- What is the difference between Node.js and JavaScript?
- Is Node.js single-threaded?
- What is libuv and why is it used in Node.js?
- What kind of API function is supported by Node.js?
- What is a module in Node.js?
- What is npm and its advantages?
- What is middleware?
- How does Node.js handle concurrency despite being single-threaded?
- What is the control flow in Node.js?
- What do you mean by the event loop in Node.js?
- What are the main disadvantages of Node.js?
- What is REPL in Node.js?
- How to import a module in Node.js?
- What is the difference between Node.js and AJAX?
- What is package.json in Node.js?
- What is the most popular Node.js framework used these days?
- What are promises in Node.js?
- What is event-driven programming in Node.js?
- What is buffer in Node.js?
- What are streams in Node.js?
- Explain crypto module in Node.js
- What is callback hell?
- Explain the use of the timers module in Node.js
- What is the difference between setImmediate() and process.nextTick() methods?
- What is the difference between setTimeout() and setImmediate() methods?
- What is the difference between spawn() and fork() methods?
- Explain the use of the passport module in Node.js
- What is a fork in Node.js?
- What are the three methods to avoid callback hell?
- What is body-parser in Node.js?
- What is CORS in Node.js?
- Explain the tls module in Node.js
- What is a cluster in Node.js?
- How to manage sessions in Node.js?
- Explain the types of streams in Node.js
- How can we implement authentication and authorization in Node.js?
- Explain the packages used for file uploading in Node.js
- How to handle database connections in Node.js?
- How to read command line arguments in Node.js?
- What are child processes in Node.js?
What is Node.js?

Node.js is an open-source, cross-platform JavaScript runtime environment built on Chrome's V8 JavaScript engine. It allows developers to execute JavaScript code outside of a web browser, primarily for building server-side applications. Node.js uses an event-driven, non-blocking I/O model that makes it lightweight and efficient for building scalable network applications.
// Simple Node.js server example
const http = require('http');
const server = http.createServer((req, res) => {
res.writeHead(200, { 'Content-Type': 'text/plain' });
res.end('Hello World from Node.js!');
});
server.listen(3000, () => {
console.log('Server running on port 3000');
});

What is the difference between Node.js and JavaScript?

JavaScript is a programming language that can run in browsers and other environments. Node.js is a runtime environment that allows JavaScript to run on the server side. The key differences include:
- JavaScript is the language itself, while Node.js is a platform/runtime for executing JavaScript
- Browser JavaScript has access to DOM and browser APIs, while Node.js has access to file system, networking, and OS-level APIs
- Node.js includes additional modules like fs, http, path, etc., which aren't available in browser JavaScript
- Browser JavaScript uses the window object as its global object, while Node.js uses the global object
// Browser JavaScript
console.log(window); // Available in browser
// Node.js
console.log(global); // Available in Node.js
const fs = require('fs'); // File system module only in Node.js

Is Node.js single-threaded?

Yes, Node.js is single-threaded in terms of its event loop and JavaScript execution. However, it uses multiple threads internally for handling asynchronous I/O operations through libuv (a C library). The JavaScript code runs on a single thread, but blocking operations like file I/O, network requests, and cryptographic operations are handled by a thread pool in the background.
// Single-threaded event loop handling multiple requests
const http = require('http');
const server = http.createServer((req, res) => {
// This runs on single thread
console.log('Request received on main thread');
// But async operations use thread pool
const fs = require('fs');
fs.readFile('file.txt', (err, data) => {
res.end(data);
});
});
server.listen(3000);

What is libuv and why is it used in Node.js?

libuv is a multi-platform C library that provides Node.js with asynchronous I/O capabilities. It's the core component that makes Node.js event-driven and non-blocking.
Why libuv is used:
- Cross-platform abstraction: Provides uniform interface for async operations across Windows, Linux, and macOS
- Event loop implementation: Powers Node.js's event-driven architecture
- Thread pool: Handles operations that can't be done asynchronously by the OS
- Non-blocking I/O: Enables handling multiple operations concurrently without blocking
How libuv works:
- Event Loop: Continuously checks for pending tasks and executes callbacks
- Thread Pool: Default size of 4 threads for CPU-intensive or blocking operations (file I/O, DNS lookup, crypto, compression)
- Handles different operation types:
- Network I/O (handled by OS kernel - epoll/kqueue/IOCP)
- File I/O (handled by thread pool)
- Timers, DNS operations, child processes
Architecture:
┌───────────────────────┐
│ JavaScript Code │
└───────────┬───────────┘
│
▼
┌───────────────────────┐
│ Node.js APIs │
└───────────┬───────────┘
│
▼
┌───────────────────────┐
│ libuv │
│ ┌─────────────────┐ │
│ │ Event Loop │ │
│ ├─────────────────┤ │
│ │ Thread Pool │ │
│ │ (4 threads) │ │
│ └─────────────────┘ │
└───────────┬───────────┘
│
▼
┌───────────────────────┐
│ Operating System │
│ (Kernel I/O APIs) │
└───────────────────────┘
Event Loop Phases (managed by libuv):
// Conceptual representation of event loop phases
┌───────────────────────────┐
┌─>│ timers │ // setTimeout, setInterval
│ └─────────────┬─────────────┘
│ ┌─────────────┴─────────────┐
│ │ pending callbacks │ // I/O callbacks deferred
│ └─────────────┬─────────────┘
│ ┌─────────────┴─────────────┐
│ │ idle, prepare │ // Internal use
│ └─────────────┬─────────────┘
│ ┌─────────────┴─────────────┐
│ │ poll │ // Retrieve new I/O events
│ └─────────────┬─────────────┘
│ ┌─────────────┴─────────────┐
│ │ check │ // setImmediate callbacks
│ └─────────────┬─────────────┘
│ ┌─────────────┴─────────────┐
│ │ close callbacks │ // socket.on('close', ...)
│ └─────────────┬─────────────┘
└──────────────┘
// Example showing libuv thread pool in action
const fs = require('fs');
const crypto = require('crypto');
// These operations use libuv's thread pool
console.time('file1');
fs.readFile('file1.txt', () => {
console.timeEnd('file1');
});
console.time('file2');
fs.readFile('file2.txt', () => {
console.timeEnd('file2');
});
console.time('file3');
fs.readFile('file3.txt', () => {
console.timeEnd('file3');
});
console.time('file4');
fs.readFile('file4.txt', () => {
console.timeEnd('file4');
});
console.time('file5');
fs.readFile('file5.txt', () => {
console.timeEnd('file5'); // May take longer (waits for thread)
});
// Crypto operations also use thread pool
console.time('crypto1');
crypto.pbkdf2('password', 'salt', 100000, 512, 'sha512', () => {
console.timeEnd('crypto1');
});
// Adjust thread pool size (must be set before the pool is first used;
// in practice it is often set in the shell instead: UV_THREADPOOL_SIZE=8 node app.js)
process.env.UV_THREADPOOL_SIZE = 8;
// Network operations DON'T use thread pool (kernel handles them)
const http = require('http');
http.get('http://example.com', (res) => {
// This is non-blocking and doesn't consume thread pool
console.log('HTTP request completed');
});

Key Operations Using Thread Pool:
- File system operations (fs.readFile, fs.writeFile)
- DNS lookups (dns.lookup)
- Crypto operations (crypto.pbkdf2, crypto.randomBytes)
- Compression (zlib)
- Some C++ add-ons
Key Operations NOT Using Thread Pool:
- Network I/O (TCP, UDP, HTTP)
- Timers
- Promises/microtasks
- process.nextTick() callbacks
Interview Key Points:
- libuv enables Node.js to be non-blocking and event-driven
- It provides a thread pool (default 4 threads) for blocking operations
- Network I/O is handled by the OS kernel, not the thread pool
- The event loop is implemented by libuv with multiple phases
- You can adjust the thread pool size with the UV_THREADPOOL_SIZE environment variable
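
As a quick, hedged illustration of those points (the file name and timings are assumptions, not from the original), the same script behaves differently depending on the pool size it is launched with:

// threadpool-demo.js
// Default pool:      node threadpool-demo.js
// Larger pool:       UV_THREADPOOL_SIZE=8 node threadpool-demo.js
const crypto = require('crypto');

const start = Date.now();
for (let i = 1; i <= 8; i++) {
  // Each pbkdf2 call occupies one libuv thread-pool thread while it runs
  crypto.pbkdf2('password', 'salt', 100000, 64, 'sha512', () => {
    console.log(`hash ${i} finished after ${Date.now() - start} ms`);
  });
}
// With the default 4 threads the eight hashes complete in two waves;
// with UV_THREADPOOL_SIZE=8 they finish in roughly one wave.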
What kind of API function is supported by Node.js?

Node.js supports two types of API functions:
Asynchronous (Non-blocking): These functions don't block the execution thread and use callbacks, promises, or async/await to handle results.
Synchronous (Blocking): These functions block the execution until the operation completes.
const fs = require('fs');
// Asynchronous (Non-blocking)
fs.readFile('file.txt', 'utf8', (err, data) => {
if (err) throw err;
console.log(data);
});
console.log('This runs before file is read');
// Synchronous (Blocking)
const data = fs.readFileSync('file.txt', 'utf8');
console.log(data);
console.log('This runs after file is read');

What is a module in Node.js?

A module is a reusable block of code that encapsulates related functionality and can be imported into other parts of an application. Node.js supports three types of modules:
- Core Modules: Built-in modules like fs, http, path, os
- Local Modules: Custom modules created by developers
- Third-party Modules: Modules installed via npm
// Creating a custom module (math.js)
function add(a, b) {
return a + b;
}
function subtract(a, b) {
return a - b;
}
module.exports = { add, subtract };
// Using the module (app.js)
const math = require('./math');
console.log(math.add(5, 3)); // 8
console.log(math.subtract(5, 3)); // 2

What is npm and its advantages?

npm (Node Package Manager) is the default package manager for Node.js. It helps developers install, manage, and share JavaScript packages.
Advantages:
- Provides access to thousands of reusable packages
- Simplifies dependency management through package.json
- Enables version control of packages
- Facilitates easy installation and updates of packages
- Supports both local and global package installation
- Offers scripts to automate tasks
# Install a package locally
npm install express
# Install a package globally
npm install -g nodemon
# Install all dependencies from package.json
npm install
# Run scripts defined in package.json
npm run start

What is middleware?

Middleware functions are functions that have access to the request object (req), the response object (res), and the next middleware function in the application's request-response cycle. Middleware can execute code, modify request/response objects, end the request-response cycle, or call the next middleware in the stack.
const express = require('express');
const app = express();
// Logger middleware
const logger = (req, res, next) => {
console.log(`${req.method} ${req.url}`);
next(); // Pass control to next middleware
};
// Authentication middleware
const authenticate = (req, res, next) => {
if (req.headers.authorization) {
next();
} else {
res.status(401).send('Unauthorized');
}
};
app.use(logger); // Apply to all routes
app.use(authenticate); // Apply to all routes
app.get('/data', (req, res) => {
res.json({ message: 'Protected data' });
});
app.listen(3000);

How does Node.js handle concurrency despite being single-threaded?

Node.js handles concurrency through its event-driven, non-blocking I/O model and the event loop. When an asynchronous operation is initiated, Node.js delegates it to the system kernel or thread pool (via libuv) and continues executing other code. When the operation completes, a callback is queued and executed by the event loop.
const fs = require('fs');
console.log('Start');
// Non-blocking I/O operation
fs.readFile('file1.txt', (err, data) => {
console.log('File 1 read complete');
});
fs.readFile('file2.txt', (err, data) => {
console.log('File 2 read complete');
});
console.log('End');
// Output:
// Start
// End
// File 1 read complete (or File 2, depending on which finishes first)
// File 2 read complete

What is the control flow in Node.js?

Control flow refers to the order in which statements and function calls are executed. In Node.js, due to asynchronous operations, control flow can be complex. It involves managing the sequence of asynchronous operations using callbacks, promises, or async/await.
// Control flow with callbacks
function step1(callback) {
setTimeout(() => {
console.log('Step 1 complete');
callback();
}, 1000);
}
function step2(callback) {
setTimeout(() => {
console.log('Step 2 complete');
callback();
}, 1000);
}
// Sequential execution
step1(() => {
step2(() => {
console.log('All steps complete');
});
});
// Control flow with async/await (a sketch assuming promisified versions of the steps above)
function step1Async() {
return new Promise((resolve) => {
setTimeout(() => {
console.log('Step 1 complete');
resolve();
}, 1000);
});
}
function step2Async() {
return new Promise((resolve) => {
setTimeout(() => {
console.log('Step 2 complete');
resolve();
}, 1000);
});
}
async function executeSteps() {
await step1Async();
await step2Async();
console.log('All steps complete');
}

What do you mean by the event loop in Node.js?

The event loop is the mechanism that handles asynchronous callbacks in Node.js. It continuously checks the call stack and callback queue. When the call stack is empty, it takes the first callback from the queue and pushes it to the call stack for execution. The event loop has multiple phases including timers, pending callbacks, poll, check, and close callbacks.
console.log('1');
setTimeout(() => {
console.log('2');
}, 0);
Promise.resolve().then(() => {
console.log('3');
});
console.log('4');
// Output: 1, 4, 3, 2
// Explanation:
// - Synchronous code (1, 4) executes first
// - Promises (microtasks) execute before timers (macrotasks)
// setTimeout callback executes last

What are the main disadvantages of Node.js?

The main disadvantages of Node.js include:
- Not suitable for CPU-intensive tasks: Single-threaded nature makes it poor for heavy computation
- Callback hell: Nested callbacks can make code difficult to read and maintain
- Immature tooling: Some tools and libraries are less mature compared to other platforms
- Unstable API: Frequent API changes in earlier versions (though more stable now)
- Asynchronous programming complexity: Requires understanding of async patterns
- Single-threaded limitations: Cannot utilize multiple CPU cores without clustering
// Example of CPU-intensive task blocking the event loop
const http = require('http');
const server = http.createServer((req, res) => {
// CPU-intensive calculation blocks other requests
let sum = 0;
for (let i = 0; i < 10000000000; i++) {
sum += i;
}
res.end(`Result: ${sum}`);
});
server.listen(3000);
// While processing one request, all others are blocked

What is REPL in Node.js?

REPL stands for Read-Eval-Print-Loop. It's an interactive shell that processes Node.js expressions. It reads user input, evaluates the expression, prints the result, and loops back to read the next input. You can access it by typing node in the terminal without any file argument.
# Start REPL
$ node
> 2 + 2
4
> const arr = [1, 2, 3]
undefined
> arr.map(x => x * 2)
[2, 4, 6]
> .exit # Exit REPL

REPL Commands:
- .help: Show all commands
- .break: Exit a multi-line expression
- .clear: Reset the REPL context
- .save filename: Save the session to a file
- .load filename: Load a file into the session
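
The repl core module also lets you embed a REPL in your own programs. Here is a minimal sketch (the sayHello helper and prompt are hypothetical examples, not part of the original answer):

// custom-repl.js
const repl = require('repl');

// Start an interactive REPL with a custom prompt
const replServer = repl.start({ prompt: 'my-app> ' });

// Expose values and helpers to the session via its context
replServer.context.appName = 'demo';
replServer.context.sayHello = (name) => `Hello, ${name}!`;

// In the session:
// my-app> sayHello(appName)
// 'Hello, demo!'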
How to import a module in Node.js?

Modules can be imported using require() (CommonJS) or import (ES6 modules). Node.js traditionally uses CommonJS, but ES6 modules are now supported.
// CommonJS (require)
const fs = require('fs'); // Core module
const express = require('express'); // Third-party module
const myModule = require('./myModule'); // Local module
const { add, subtract } = require('./math'); // Destructuring
// ES6 Modules (import)
// Need "type": "module" in package.json or .mjs extension
import fs from 'fs';
import express from 'express';
import myModule from './myModule.js';
import { add, subtract } from './math.js';
// Dynamic import
async function loadModule() {
const module = await import('./myModule.js');
module.someFunction();
}

What is the difference between Node.js and AJAX?

Node.js and AJAX serve different purposes:
Node.js:
- Server-side runtime environment
- Used for building backend applications
- Executes JavaScript on the server
- Handles server logic, databases, file systems
AJAX (Asynchronous JavaScript and XML):
- Client-side technology
- Used for making asynchronous HTTP requests from browsers
- Executes in the browser
- Updates web pages without reloading
// Node.js - Server-side code
const http = require('http');
http.createServer((req, res) => {
res.end('Server response');
}).listen(3000);
// AJAX - Client-side code (using Fetch API)
fetch('http://localhost:3000/api/data')
.then(response => response.json())
.then(data => console.log(data))
.catch(error => console.error(error));

What is package.json in Node.js?

package.json is a manifest file that contains metadata about a Node.js project. It includes the project name, version, description, dependencies, scripts, author information, and more. It's essential for managing project dependencies and configuration.
{
"name": "my-app",
"version": "1.0.0",
"description": "A Node.js application",
"main": "index.js",
"scripts": {
"start": "node index.js",
"dev": "nodemon index.js",
"test": "jest"
},
"dependencies": {
"express": "^4.18.2",
"mongoose": "^7.0.0"
},
"devDependencies": {
"nodemon": "^2.0.20",
"jest": "^29.5.0"
},
"keywords": ["nodejs", "express"],
"author": "Your Name",
"license": "MIT"
}

Key Fields:
- dependencies: Packages required for production
- devDependencies: Packages needed only for development
- scripts: Custom commands that can be run with npm run
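
For example, with the scripts block shown above, each entry can be invoked like this (start and test also have shorthand forms):

# Runs the "dev" script (nodemon index.js)
npm run dev
# Shorthands for the "start" and "test" scripts
npm start
npm test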
What is the most popular Node.js framework used these days?

Express.js is the most popular and widely used Node.js framework. It's a minimal and flexible web application framework that provides robust features for building web and mobile applications. Other popular frameworks include Koa, Nest.js, Fastify, and Hapi.
const express = require('express');
const app = express();
// Middleware
app.use(express.json());
// Routes
app.get('/', (req, res) => {
res.send('Hello World!');
});
app.post('/api/users', (req, res) => {
const user = req.body;
res.status(201).json({ message: 'User created', user });
});
// Error handling middleware
app.use((err, req, res, next) => {
console.error(err.stack);
res.status(500).send('Something broke!');
});
app.listen(3000, () => {
console.log('Server running on port 3000');
});

What are promises in Node.js?

Promises are objects representing the eventual completion or failure of an asynchronous operation. They provide a cleaner alternative to callbacks for handling asynchronous code. A promise can be in one of three states: pending, fulfilled, or rejected.
// Creating a promise
const myPromise = new Promise((resolve, reject) => {
setTimeout(() => {
const success = true;
if (success) {
resolve('Operation successful');
} else {
reject('Operation failed');
}
}, 1000);
});
// Using a promise
myPromise
.then(result => console.log(result))
.catch(error => console.error(error))
.finally(() => console.log('Promise completed'));
// Chaining promises
function fetchUser() {
return new Promise((resolve) => {
setTimeout(() => resolve({ id: 1, name: 'John' }), 1000);
});
}
function fetchOrders(userId) {
return new Promise((resolve) => {
setTimeout(() => resolve(['Order1', 'Order2']), 1000);
});
}
fetchUser()
.then(user => {
console.log('User:', user);
return fetchOrders(user.id);
})
.then(orders => console.log('Orders:', orders))
.catch(error => console.error(error));

What is event-driven programming in Node.js?

Event-driven programming is a paradigm where the flow of the program is determined by events such as user actions, sensor outputs, or messages from other programs. Node.js uses an event-driven architecture where certain objects (emitters) emit events, and listeners respond to those events.
const EventEmitter = require('events');
// Create an event emitter
class MyEmitter extends EventEmitter {}
const myEmitter = new MyEmitter();
// Register event listeners
myEmitter.on('event', (data) => {
console.log('Event occurred with data:', data);
});
myEmitter.on('error', (err) => {
console.error('Error occurred:', err);
});
// Emit events
myEmitter.emit('event', { message: 'Hello World' });
myEmitter.emit('error', new Error('Something went wrong'));
// Real-world example with HTTP server
const http = require('http');
const server = http.createServer();
server.on('request', (req, res) => {
console.log('Request received');
res.end('Response sent');
});
server.on('connection', (socket) => {
console.log('New connection established');
});
server.listen(3000);

What is buffer in Node.js?

Buffer is a temporary storage area for binary data. It's used to handle binary data directly, especially when dealing with streams, file operations, or network communications. Buffers are instances of the Buffer class and represent fixed-size chunks of memory allocated outside the V8 heap.
// Creating buffers
const buf1 = Buffer.from('Hello World', 'utf8');
const buf2 = Buffer.alloc(10); // Creates a 10-byte buffer filled with zeros
const buf3 = Buffer.allocUnsafe(10); // Faster but contains old data
// Reading from buffer
console.log(buf1.toString()); // 'Hello World'
console.log(buf1.toString('hex')); // Hexadecimal representation
// Writing to buffer
buf2.write('Node.js');
console.log(buf2.toString()); // 'Node.js'
// Buffer operations
const buf4 = Buffer.from([1, 2, 3, 4]);
console.log(buf4.length); // 4
console.log(buf4[0]); // 1
// Concatenating buffers
const buf5 = Buffer.concat([buf1, buf2]);
console.log(buf5.toString());
// Compare buffers
const result = buf1.compare(buf2);
console.log(result); // -1, 0, or 1

What are streams in Node.js?

Streams are objects that enable reading or writing data continuously in chunks rather than loading everything into memory at once. This makes them memory-efficient for handling large amounts of data. There are four types of streams: Readable, Writable, Duplex, and Transform.
const fs = require('fs');
// Readable stream
const readStream = fs.createReadStream('input.txt', 'utf8');
readStream.on('data', (chunk) => {
console.log('Received chunk:', chunk);
});
readStream.on('end', () => {
console.log('Reading completed');
});
readStream.on('error', (err) => {
console.error('Error:', err);
});
// Writable stream
const writeStream = fs.createWriteStream('output.txt');
writeStream.write('Hello ');
writeStream.write('World!');
writeStream.end();
// Piping streams (efficient copy)
const input = fs.createReadStream('source.txt');
const output = fs.createWriteStream('destination.txt');
input.pipe(output);
// Transform stream example
const { Transform } = require('stream');
const upperCaseTransform = new Transform({
transform(chunk, encoding, callback) {
this.push(chunk.toString().toUpperCase());
callback();
}
});
process.stdin.pipe(upperCaseTransform).pipe(process.stdout);

Explain crypto module in Node.js

The crypto module provides cryptographic functionality including encryption, decryption, hashing, and digital signatures. It supports various algorithms like SHA, MD5, AES, RSA, etc.
const crypto = require('crypto');
// Hashing
const hash = crypto.createHash('sha256');
hash.update('password123');
const hashedPassword = hash.digest('hex');
console.log('Hashed:', hashedPassword);
// HMAC (Hash-based Message Authentication Code)
const hmac = crypto.createHmac('sha256', 'secret-key');
hmac.update('data to sign');
const signature = hmac.digest('hex');
console.log('Signature:', signature);
// Encryption and Decryption
const algorithm = 'aes-256-cbc';
const key = crypto.randomBytes(32);
const iv = crypto.randomBytes(16);
// Encrypt
function encrypt(text) {
const cipher = crypto.createCipheriv(algorithm, key, iv);
let encrypted = cipher.update(text, 'utf8', 'hex');
encrypted += cipher.final('hex');
return encrypted;
}
// Decrypt
function decrypt(encrypted) {
const decipher = crypto.createDecipheriv(algorithm, key, iv);
let decrypted = decipher.update(encrypted, 'hex', 'utf8');
decrypted += decipher.final('utf8');
return decrypted;
}
const encrypted = encrypt('Secret message');
console.log('Encrypted:', encrypted);
console.log('Decrypted:', decrypt(encrypted));
// Generate random bytes
const randomBytes = crypto.randomBytes(16).toString('hex');
console.log('Random token:', randomBytes);

What is callback hell?

Callback hell, also known as the "pyramid of doom," occurs when multiple nested callbacks make code difficult to read and maintain. It happens when dealing with multiple asynchronous operations that depend on each other.
// Example of callback hell
fs.readFile('file1.txt', (err, data1) => {
if (err) throw err;
fs.readFile('file2.txt', (err, data2) => {
if (err) throw err;
fs.readFile('file3.txt', (err, data3) => {
if (err) throw err;
fs.writeFile('output.txt', data1 + data2 + data3, (err) => {
if (err) throw err;
console.log('Files combined successfully');
});
});
});
});
// Solution using Promises
const readFilePromise = (file) => {
return new Promise((resolve, reject) => {
fs.readFile(file, (err, data) => {
if (err) reject(err);
else resolve(data);
});
});
};
Promise.all([
readFilePromise('file1.txt'),
readFilePromise('file2.txt'),
readFilePromise('file3.txt')
])
.then(([data1, data2, data3]) => {
return fs.promises.writeFile('output.txt', data1 + data2 + data3);
})
.then(() => console.log('Files combined successfully'))
.catch(err => console.error(err));
// Solution using async/await
async function combineFiles() {
try {
const data1 = await fs.promises.readFile('file1.txt');
const data2 = await fs.promises.readFile('file2.txt');
const data3 = await fs.promises.readFile('file3.txt');
await fs.promises.writeFile('output.txt', data1 + data2 + data3);
console.log('Files combined successfully');
} catch (err) {
console.error(err);
}
}

Explain the use of the timers module in Node.js

The timers module provides functions to execute code after a certain period or at regular intervals. It includes setTimeout(), setInterval(), setImmediate(), and their clear counterparts.
// setTimeout - Execute once after delay
const timeoutId = setTimeout(() => {
console.log('Executed after 2 seconds');
}, 2000);
// Clear timeout
clearTimeout(timeoutId);
// setInterval - Execute repeatedly at intervals
const intervalId = setInterval(() => {
console.log('Executed every 1 second');
}, 1000);
// Clear interval after 5 seconds
setTimeout(() => {
clearInterval(intervalId);
console.log('Interval cleared');
}, 5000);
// setImmediate - Execute on next event loop iteration
setImmediate(() => {
console.log('Executed immediately after I/O events');
});
// Example: Countdown timer
let counter = 10;
const countdown = setInterval(() => {
console.log(counter);
counter--;
if (counter < 0) {
clearInterval(countdown);
console.log('Countdown finished!');
}
}, 1000);

What is the difference between setImmediate() and process.nextTick() methods?

Both execute callbacks asynchronously, but they have different priorities in the event loop.
process.nextTick():
- Executes callback before the event loop continues
- Has higher priority than I/O events
- Can cause I/O starvation if used recursively
setImmediate():
- Executes callback in the next iteration of the event loop
- Runs after I/O events
- Better for preventing event loop blocking
console.log('1');
setImmediate(() => {
console.log('2: setImmediate');
});
process.nextTick(() => {
console.log('3: nextTick');
});
console.log('4');
// Output: 1, 4, 3: nextTick, 2: setImmediate
// Example showing priority
process.nextTick(() => console.log('nextTick 1'));
process.nextTick(() => console.log('nextTick 2'));
setImmediate(() => console.log('setImmediate 1'));
setImmediate(() => console.log('setImmediate 2'));
setTimeout(() => console.log('setTimeout'), 0);
// Typical output (nextTick callbacks always run first; the relative order
// of setTimeout(0) and setImmediate can vary in the main module):
// nextTick 1
// nextTick 2
// setTimeout
// setImmediate 1
// setImmediate 2
// Recursive nextTick can block I/O
process.nextTick(function recursive() {
console.log('This blocks the event loop');
process.nextTick(recursive); // Dangerous!
});

What is the difference between setTimeout() and setImmediate() methods?

Both schedule callbacks for later execution, but their timing differs:
setTimeout():
- Schedules callback after minimum delay (in milliseconds)
- Processed in the timers phase of event loop
- setTimeout(fn, 0) has at least a 1 ms delay
setImmediate():
- Executes in the check phase of event loop
- Runs after I/O operations
- More predictable for I/O-bound operations
// Timing comparison
console.log('Start');
setTimeout(() => {
console.log('setTimeout');
}, 0);
setImmediate(() => {
console.log('setImmediate');
});
console.log('End');
// Output order can vary:
// Start, End, setTimeout, setImmediate
// OR
// Start, End, setImmediate, setTimeout
// Inside I/O cycle, setImmediate is always first
const fs = require('fs');
fs.readFile(__filename, () => {
setTimeout(() => {
console.log('setTimeout in I/O');
}, 0);
setImmediate(() => {
console.log('setImmediate in I/O');
});
});
// Output (consistent):
// setImmediate in I/O
// setTimeout in I/O

What is the difference between spawn() and fork() methods?

Both are methods of the child_process module for creating child processes, but they have different use cases:
spawn():
- Creates a new process to execute a command
- Returns a stream-based interface
- Suitable for large data operations
- Can execute any system command
- More memory efficient
fork():
- Special case of spawn() for Node.js processes
- Creates a new V8 instance
- Built-in IPC (Inter-Process Communication) channel
- Used specifically for running Node.js scripts
- Returns a child process object with messaging capabilities
const { spawn, fork } = require('child_process');
// spawn() - Execute system command
const ls = spawn('ls', ['-lh', '/usr']);
ls.stdout.on('data', (data) => {
console.log(`stdout: ${data}`);
});
ls.stderr.on('data', (data) => {
console.error(`stderr: ${data}`);
});
ls.on('close', (code) => {
console.log(`Process exited with code ${code}`);
});
// fork() - Run Node.js script with IPC
// child.js file:
// process.on('message', (msg) => {
// console.log('Message from parent:', msg);
// process.send({ result: msg * 2 });
// });
const child = fork('child.js');
child.on('message', (msg) => {
console.log('Message from child:', msg);
});
child.send(10);

Explain the use of the passport module in Node.js

Passport is an authentication middleware for Node.js. It's flexible, modular, and supports various authentication strategies including local authentication, OAuth, JWT, and social logins (Google, Facebook, Twitter, etc.).
const express = require('express');
const passport = require('passport');
const LocalStrategy = require('passport-local').Strategy;
const session = require('express-session');
const app = express();
// Middleware
app.use(express.json());
app.use(express.urlencoded({ extended: false }));
app.use(session({
secret: 'secret-key',
resave: false,
saveUninitialized: false
}));
app.use(passport.initialize());
app.use(passport.session());
// Configure Local Strategy
passport.use(new LocalStrategy(
(username, password, done) => {
// Validate user credentials (usually from database)
if (username === 'admin' && password === 'password') {
return done(null, { id: 1, username: 'admin' });
} else {
return done(null, false, { message: 'Invalid credentials' });
}
}
));
// Serialize user for session
passport.serializeUser((user, done) => {
done(null, user.id);
});
// Deserialize user from session
passport.deserializeUser((id, done) => {
// Fetch user from database by id
done(null, { id: 1, username: 'admin' });
});
// Login route
app.post('/login',
passport.authenticate('local', {
successRedirect: '/dashboard',
failureRedirect: '/login'
})
);
// Protected route
app.get('/dashboard', (req, res) => {
if (req.isAuthenticated()) {
res.send('Welcome to dashboard');
} else {
res.redirect('/login');
}
});
// Logout
app.get('/logout', (req, res, next) => {
req.logout((err) => {
if (err) return next(err);
res.redirect('/');
});
});
app.listen(3000);

What is a fork in Node.js?

A fork in Node.js refers to creating a new child process that runs a separate Node.js instance. It's created using the fork() method from the child_process module. Forking is useful for running CPU-intensive tasks without blocking the main event loop, enabling parallel processing, and implementing worker pools.
// parent.js
const { fork } = require('child_process');
console.log('Parent process started');
// Fork a child process
const child = fork('./worker.js');
// Send data to child
child.send({ numbers: [1, 2, 3, 4, 5] });
// Receive data from child
child.on('message', (msg) => {
console.log('Result from child:', msg);
});
child.on('exit', (code) => {
console.log(`Child process exited with code ${code}`);
});
// worker.js
process.on('message', (msg) => {
console.log('Received in child:', msg);
// Perform heavy computation
const sum = msg.numbers.reduce((a, b) => a + b, 0);
// Send result back to parent
process.send({ sum });
// Exit child process
process.exit(0);
});
// Example: CPU-intensive task
// parent.js
const child = fork('./heavy-task.js');
child.send({ start: 0, end: 1000000 });
child.on('message', (result) => {
console.log('Computation result:', result);
});

What are the three methods to avoid callback hell?

The three main methods to avoid callback hell are:
- Promises: Chain asynchronous operations using .then() and .catch()
- Async/Await: Write asynchronous code in a synchronous style
- Modularization: Break code into smaller, reusable functions
// 1. Using Promises
function readFilePromise(file) {
return new Promise((resolve, reject) => {
fs.readFile(file, 'utf8', (err, data) => {
if (err) reject(err);
else resolve(data);
});
});
}
readFilePromise('file1.txt')
.then(data1 => {
console.log(data1);
return readFilePromise('file2.txt');
})
.then(data2 => {
console.log(data2);
return readFilePromise('file3.txt');
})
.then(data3 => {
console.log(data3);
})
.catch(err => console.error(err));
// 2. Using Async/Await
async function readFiles() {
try {
const data1 = await readFilePromise('file1.txt');
console.log(data1);
const data2 = await readFilePromise('file2.txt');
console.log(data2);
const data3 = await readFilePromise('file3.txt');
console.log(data3);
} catch (err) {
console.error(err);
}
}
readFiles();
// 3. Modularization
function readFile1(callback) {
fs.readFile('file1.txt', callback);
}
function readFile2(callback) {
fs.readFile('file2.txt', callback);
}
function processFiles() {
readFile1((err, data1) => {
if (err) return console.error(err);
console.log(data1);
readFile2((err, data2) => {
if (err) return console.error(err);
console.log(data2);
});
});
}
// Better: Use named functions
function handleFile1(err, data) {
if (err) return console.error(err);
console.log(data);
readFile2(handleFile2);
}
function handleFile2(err, data) {
if (err) return console.error(err);
console.log(data);
}
readFile1(handleFile1);

What is body-parser in Node.js?

Body-parser is a middleware that parses incoming request bodies before your handlers run, making the data available under req.body. It supports JSON, URL-encoded, raw, and text data. Note that Express 4.16+ includes body-parser functionality built in.
const express = require('express');
const bodyParser = require('body-parser');
const app = express();
// Parse JSON bodies
app.use(bodyParser.json());
// Parse URL-encoded bodies
app.use(bodyParser.urlencoded({ extended: true }));
// Parse raw bodies
app.use(bodyParser.raw());
// Parse text bodies
app.use(bodyParser.text());
// Routes
app.post('/user', (req, res) => {
console.log(req.body); // Access parsed body
res.json({
message: 'User created',
data: req.body
});
});
// Modern Express (built-in body parsing), shown as a separate standalone app
const express = require('express');
const app = express();
// Built-in middleware (Express 4.16+)
app.use(express.json());
app.use(express.urlencoded({ extended: true }));
app.post('/api/data', (req, res) => {
const { name, email } = req.body;
res.json({ message: 'Data received', name, email });
});
// Handle different content types
app.post('/upload',
express.raw({ type: 'application/octet-stream', limit: '10mb' }),
(req, res) => {
console.log('Raw data:', req.body);
res.send('File received');
}
);
app.listen(3000);

What is CORS in Node.js?

CORS (Cross-Origin Resource Sharing) is a security mechanism that allows or restricts resources on a web server to be requested from another domain. By default, browsers block cross-origin requests for security reasons. The cors package in Node.js enables CORS with various configuration options.
const express = require('express');
const cors = require('cors');
const app = express();
// Enable CORS for all routes and origins
app.use(cors());
// Enable CORS with specific options
app.use(cors({
origin: 'https://example.com', // Allow specific origin
methods: ['GET', 'POST', 'PUT', 'DELETE'], // Allowed methods
allowedHeaders: ['Content-Type', 'Authorization'], // Allowed headers
credentials: true, // Allow cookies
maxAge: 3600 // Cache preflight request for 1 hour
}));
// Enable CORS for specific routes
app.get('/public', cors(), (req, res) => {
res.json({ message: 'CORS enabled for this route' });
});
// Dynamic origin
const corsOptions = {
origin: function (origin, callback) {
const whitelist = ['https://example1.com', 'https://example2.com'];
if (whitelist.indexOf(origin) !== -1 || !origin) {
callback(null, true);
} else {
callback(new Error('Not allowed by CORS'));
}
}
};
app.use(cors(corsOptions));
// Manual CORS implementation
app.use((req, res, next) => {
res.header('Access-Control-Allow-Origin', '*');
res.header('Access-Control-Allow-Methods', 'GET, POST, PUT, DELETE');
res.header('Access-Control-Allow-Headers', 'Content-Type, Authorization');
if (req.method === 'OPTIONS') {
return res.sendStatus(200);
}
next();
});
app.listen(3000);

Explain the tls module in Node.js

The TLS (Transport Layer Security) module provides an implementation of the TLS and SSL protocols for secure communication. It's used to create secure servers and clients with encrypted connections.
const tls = require('tls');
const fs = require('fs');
// TLS Server
const options = {
key: fs.readFileSync('server-key.pem'),
cert: fs.readFileSync('server-cert.pem'),
// Optional: Require client certificate
requestCert: true,
rejectUnauthorized: false,
ca: [fs.readFileSync('client-cert.pem')]
};
const server = tls.createServer(options, (socket) => {
console.log('Client connected');
console.log('Authorized:', socket.authorized);
socket.write('Welcome to secure server\n');
socket.on('data', (data) => {
console.log('Received:', data.toString());
socket.write('Echo: ' + data);
});
socket.on('end', () => {
console.log('Client disconnected');
});
});
server.listen(8000, () => {
console.log('TLS server listening on port 8000');
});
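
The .pem files referenced above must already exist on disk; for local testing, a self-signed key and certificate pair can be generated with OpenSSL (a hedged example, with file names matching the snippet):

# Generate a self-signed certificate valid for 365 days (local testing only)
openssl req -x509 -newkey rsa:2048 -nodes -keyout server-key.pem -out server-cert.pem -days 365 -subj "/CN=localhost"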
// TLS Client
const clientOptions = {
host: 'localhost',
port: 8000,
// Optional: Client certificate
key: fs.readFileSync('client-key.pem'),
cert: fs.readFileSync('client-cert.pem'),
// Server certificate validation
ca: [fs.readFileSync('server-cert.pem')],
rejectUnauthorized: true
};
const client = tls.connect(clientOptions, () => {
console.log('Connected to server');
console.log('Authorized:', client.authorized);
client.write('Hello from client');
});
client.on('data', (data) => {
console.log('Server response:', data.toString());
});
client.on('end', () => {
console.log('Disconnected from server');
});
// HTTPS Server using TLS
const https = require('https');
const httpsOptions = {
key: fs.readFileSync('server-key.pem'),
cert: fs.readFileSync('server-cert.pem')
};
https.createServer(httpsOptions, (req, res) => {
res.writeHead(200);
res.end('Secure HTTPS server');
}).listen(443);

What is a cluster in Node.js?

The cluster module allows you to create child processes (workers) that share the same server port. It enables Node.js applications to take advantage of multi-core systems by distributing the workload across multiple CPU cores, improving performance and reliability.
const cluster = require('cluster');
const http = require('http');
const os = require('os');
const numCPUs = os.cpus().length;
if (cluster.isMaster) { // isMaster is aliased as isPrimary in Node 16+
console.log(`Master process ${process.pid} is running`);
// Fork workers for each CPU core
for (let i = 0; i < numCPUs; i++) {
cluster.fork();
}
// Handle worker exit
cluster.on('exit', (worker, code, signal) => {
console.log(`Worker ${worker.process.pid} died`);
console.log('Starting a new worker');
cluster.fork(); // Restart worker
});
// Listen for messages from workers
cluster.on('message', (worker, message) => {
console.log(`Message from worker ${worker.id}:`, message);
});
} else {
// Workers can share any TCP connection
const server = http.createServer((req, res) => {
res.writeHead(200);
res.end(`Response from worker ${process.pid}\n`);
});
server.listen(3000, () => {
console.log(`Worker ${process.pid} started`);
});
// Send message to master
process.send({ workerId: cluster.worker.id, status: 'ready' });
}
// Advanced: Using PM2 for clustering
// pm2 start app.js -i max // max = number of CPUs
// Example with Express
const express = require('express');
if (cluster.isMaster) {
for (let i = 0; i < numCPUs; i++) {
cluster.fork();
}
} else {
const app = express();
app.get('/', (req, res) => {
res.send(`Worker ${process.pid} handling request`);
});
app.listen(3000);
}

How to manage sessions in Node.js?

Sessions are managed using middleware like express-session, which stores session data on the server and sends a session ID to the client via cookies. Session data can be stored in memory, databases (Redis, MongoDB), or files.
const express = require('express');
const session = require('express-session');
const RedisStore = require('connect-redis').default;
const { createClient } = require('redis');
const app = express();
// Basic session configuration (memory store)
app.use(session({
secret: 'your-secret-key',
resave: false,
saveUninitialized: false,
cookie: {
secure: false, // Set to true in production with HTTPS
httpOnly: true,
maxAge: 24 * 60 * 60 * 1000 // 24 hours
}
}));
// Using Redis for session storage (production)
const redisClient = createClient({
url: 'redis://localhost:6379' // node-redis v4 connection string
});
redisClient.connect().catch(console.error);
app.use(session({
store: new RedisStore({ client: redisClient }),
secret: 'your-secret-key',
resave: false,
saveUninitialized: false,
cookie: {
secure: true,
httpOnly: true,
maxAge: 24 * 60 * 60 * 1000
}
}));
// Session usage in routes
app.post('/login', (req, res) => {
const { username, password } = req.body;
// Validate credentials
if (username === 'admin' && password === 'password') {
// Store user data in session
req.session.userId = 1;
req.session.username = username;
req.session.isAuthenticated = true;
res.json({ message: 'Login successful' });
} else {
res.status(401).json({ message: 'Invalid credentials' });
}
});
// Access session data
app.get('/profile', (req, res) => {
if (req.session.isAuthenticated) {
res.json({
username: req.session.username,
userId: req.session.userId
});
} else {
res.status(401).json({ message: 'Not authenticated' });
}
});
// Destroy session
app.post('/logout', (req, res) => {
req.session.destroy((err) => {
if (err) {
return res.status(500).json({ message: 'Logout failed' });
}
res.clearCookie('connect.sid');
res.json({ message: 'Logged out successfully' });
});
});
// Session middleware for protected routes
function requireAuth(req, res, next) {
if (req.session && req.session.isAuthenticated) {
next();
} else {
res.status(401).json({ message: 'Unauthorized' });
}
}
app.get('/dashboard', requireAuth, (req, res) => {
res.json({ message: 'Welcome to dashboard' });
});
app.listen(3000);

Explain the types of streams in Node.js

Node.js has four fundamental types of streams:
- Readable Streams: Used for reading data (e.g., fs.createReadStream)
- Writable Streams: Used for writing data (e.g., fs.createWriteStream)
- Duplex Streams: Both readable and writable (e.g., TCP sockets)
- Transform Streams: Duplex streams that modify data as it's read or written (e.g., compression, encryption)
const fs = require('fs');
const { Readable, Writable, Duplex, Transform } = require('stream');
// 1. Readable Stream
const readableStream = fs.createReadStream('input.txt', 'utf8');
readableStream.on('data', (chunk) => {
console.log('Chunk:', chunk);
});
readableStream.on('end', () => {
console.log('Reading complete');
});
// Custom Readable Stream
const customReadable = new Readable({
read(size) {
this.push('Hello ');
this.push('World!');
this.push(null); // Signal end of stream
}
});
customReadable.pipe(process.stdout);
// 2. Writable Stream
const writableStream = fs.createWriteStream('output.txt');
writableStream.write('Line 1\n');
writableStream.write('Line 2\n');
writableStream.end('Final line\n');
// Custom Writable Stream
const customWritable = new Writable({
write(chunk, encoding, callback) {
console.log('Writing:', chunk.toString());
callback();
}
});
customReadable.pipe(customWritable);
// 3. Duplex Stream (TCP socket example)
const net = require('net');
const server = net.createServer((socket) => {
// Socket is a duplex stream
socket.on('data', (data) => {
console.log('Received:', data.toString());
socket.write('Echo: ' + data); // Write back
});
});
server.listen(8000);
// Custom Duplex Stream
const customDuplex = new Duplex({
read(size) {
this.push('Reading data');
this.push(null);
},
write(chunk, encoding, callback) {
console.log('Writing:', chunk.toString());
callback();
}
});
// 4. Transform Stream
const upperCaseTransform = new Transform({
transform(chunk, encoding, callback) {
this.push(chunk.toString().toUpperCase());
callback();
}
});
// Chain streams
fs.createReadStream('input.txt')
.pipe(upperCaseTransform)
.pipe(fs.createWriteStream('output.txt'));
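// A hedged aside (not in the original): stream.pipeline(), available since Node 10,
// chains streams like pipe() but forwards errors from every stage and cleans up.
// A fresh transform is created here because a stream should not be reused after piping.
const { pipeline } = require('stream');
const upperCase2 = new Transform({
transform(chunk, encoding, callback) {
callback(null, chunk.toString().toUpperCase());
}
});
pipeline(
fs.createReadStream('input.txt'),
upperCase2,
fs.createWriteStream('output-upper.txt'),
(err) => {
if (err) console.error('Pipeline failed:', err);
else console.log('Pipeline succeeded');
}
);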
// Compression example
const zlib = require('zlib');
fs.createReadStream('file.txt')
.pipe(zlib.createGzip())
.pipe(fs.createWriteStream('file.txt.gz'));
// Decompression
fs.createReadStream('file.txt.gz')
.pipe(zlib.createGunzip())
.pipe(fs.createWriteStream('file-decompressed.txt'));

How can we implement authentication and authorization in Node.js?

Authentication verifies user identity, while authorization determines user permissions. Common approaches include JWT (JSON Web Tokens), sessions, OAuth, and Passport.js.
const express = require('express');
const jwt = require('jsonwebtoken');
const bcrypt = require('bcrypt');
const app = express();
app.use(express.json());
const SECRET_KEY = 'your-secret-key';
const users = []; // In production, use a database
// User Registration
app.post('/register', async (req, res) => {
try {
const { username, password, role } = req.body;
// Hash password
const hashedPassword = await bcrypt.hash(password, 10);
const user = {
id: users.length + 1,
username,
password: hashedPassword,
role: role || 'user' // user, admin, moderator
};
users.push(user);
res.status(201).json({ message: 'User registered successfully' });
} catch (error) {
res.status(500).json({ error: error.message });
}
});
// User Login (Authentication)
app.post('/login', async (req, res) => {
try {
const { username, password } = req.body;
// Find user
const user = users.find(u => u.username === username);
if (!user) {
return res.status(401).json({ message: 'Invalid credentials' });
}
// Verify password
const validPassword = await bcrypt.compare(password, user.password);
if (!validPassword) {
return res.status(401).json({ message: 'Invalid credentials' });
}
// Generate JWT token
const token = jwt.sign(
{ id: user.id, username: user.username, role: user.role },
SECRET_KEY,
{ expiresIn: '24h' }
);
res.json({ token, message: 'Login successful' });
} catch (error) {
res.status(500).json({ error: error.message });
}
});
// Authentication Middleware
function authenticateToken(req, res, next) {
const authHeader = req.headers['authorization'];
const token = authHeader && authHeader.split(' ')[1]; // Bearer TOKEN
if (!token) {
return res.status(401).json({ message: 'Access token required' });
}
jwt.verify(token, SECRET_KEY, (err, user) => {
if (err) {
return res.status(403).json({ message: 'Invalid or expired token' });
}
req.user = user;
next();
});
}
// Authorization Middleware
function authorize(...roles) {
return (req, res, next) => {
if (!req.user) {
return res.status(401).json({ message: 'Not authenticated' });
}
if (!roles.includes(req.user.role)) {
return res.status(403).json({ message: 'Insufficient permissions' });
}
next();
};
}
// Protected Routes
app.get('/profile', authenticateToken, (req, res) => {
res.json({ user: req.user, message: 'Access granted' });
});
// Admin-only route
app.get('/admin', authenticateToken, authorize('admin'), (req, res) => {
res.json({ message: 'Admin area', users });
});
// Multiple roles
app.get('/moderator', authenticateToken, authorize('admin', 'moderator'), (req, res) => {
res.json({ message: 'Moderator area' });
});
// Refresh token implementation
const refreshTokens = []; // in a real app, issue and store refresh tokens here at login
app.post('/refresh-token', (req, res) => {
const { refreshToken } = req.body;
if (!refreshToken || !refreshTokens.includes(refreshToken)) {
return res.status(403).json({ message: 'Invalid refresh token' });
}
jwt.verify(refreshToken, SECRET_KEY, (err, user) => {
if (err) {
return res.status(403).json({ message: 'Invalid refresh token' });
}
const newToken = jwt.sign(
{ id: user.id, username: user.username, role: user.role },
SECRET_KEY,
{ expiresIn: '15m' }
);
res.json({ token: newToken });
});
});
app.listen(3000);

Explain the packages used for file uploading in Node.js

Common packages for file uploading include:
- Multer: Most popular, handles multipart/form-data
- Formidable: Robust parsing of form data
- Busboy: Low-level multipart form parser
- Express-fileupload: Simple middleware for Express
const express = require('express');
const multer = require('multer');
const path = require('path');
const app = express();
// 1. MULTER Configuration
// Storage configuration
const storage = multer.diskStorage({
destination: function (req, file, cb) {
cb(null, 'uploads/');
},
filename: function (req, file, cb) {
const uniqueSuffix = Date.now() + '-' + Math.round(Math.random() * 1E9);
cb(null, file.fieldname + '-' + uniqueSuffix + path.extname(file.originalname));
}
});
// File filter
const fileFilter = (req, file, cb) => {
const allowedTypes = ['image/jpeg', 'image/png', 'image/gif', 'application/pdf'];
if (allowedTypes.includes(file.mimetype)) {
cb(null, true);
} else {
cb(new Error('Invalid file type'), false);
}
};
// Multer instance
const upload = multer({
storage: storage,
limits: {
fileSize: 5 * 1024 * 1024 // 5MB limit
},
fileFilter: fileFilter
});
// Single file upload
app.post('/upload/single', upload.single('file'), (req, res) => {
if (!req.file) {
return res.status(400).json({ message: 'No file uploaded' });
}
res.json({
message: 'File uploaded successfully',
file: {
filename: req.file.filename,
originalname: req.file.originalname,
size: req.file.size,
path: req.file.path
}
});
});
// Multiple files upload
app.post('/upload/multiple', upload.array('files', 5), (req, res) => {
if (!req.files || req.files.length === 0) {
return res.status(400).json({ message: 'No files uploaded' });
}
const fileDetails = req.files.map(file => ({
filename: file.filename,
originalname: file.originalname,
size: file.size
}));
res.json({
message: 'Files uploaded successfully',
files: fileDetails
});
});
// Multiple fields
app.post('/upload/fields',
upload.fields([
{ name: 'avatar', maxCount: 1 },
{ name: 'gallery', maxCount: 5 }
]),
(req, res) => {
res.json({
message: 'Files uploaded',
avatar: req.files['avatar'],
gallery: req.files['gallery']
});
}
);
// Memory storage (for cloud uploads)
const memoryUpload = multer({ storage: multer.memoryStorage() });
app.post('/upload/memory', memoryUpload.single('file'), (req, res) => {
// req.file.buffer contains file data
console.log('File buffer size:', req.file.buffer.length);
// Upload to cloud storage (AWS S3, Google Cloud Storage, etc.)
res.json({ message: 'File received in memory' });
});
// 2. FORMIDABLE Example
const formidable = require('formidable');
app.post('/upload/formidable', (req, res) => {
const form = formidable({
uploadDir: './uploads',
keepExtensions: true,
maxFileSize: 5 * 1024 * 1024
});
form.parse(req, (err, fields, files) => {
if (err) {
return res.status(400).json({ error: err.message });
}
res.json({
fields,
files
});
});
});
// Error handling middleware
app.use((err, req, res, next) => {
if (err instanceof multer.MulterError) {
if (err.code === 'LIMIT_FILE_SIZE') {
return res.status(400).json({ message: 'File too large' });
}
}
res.status(500).json({ error: err.message });
});
app.listen(3000);

How to handle database connections in Node.js?

Database connections can be managed using connection pooling, ORM/ODM libraries, and proper error handling. Common databases include MongoDB, MySQL, PostgreSQL, and Redis.
// 1. MONGODB with Mongoose
const mongoose = require('mongoose');
// Single connection
mongoose.connect('mongodb://localhost:27017/mydb', {
useNewUrlParser: true, // these two flags are only needed on older
useUnifiedTopology: true // Mongoose versions (pre-6)
});
const db = mongoose.connection;
db.on('error', console.error.bind(console, 'MongoDB connection error:'));
db.once('open', () => {
console.log('Connected to MongoDB');
});
// Define schema and model
const userSchema = new mongoose.Schema({
name: String,
email: { type: String, unique: true },
age: Number,
createdAt: { type: Date, default: Date.now }
});
const User = mongoose.model('User', userSchema);
// CRUD operations
async function createUser() {
const user = new User({
name: 'John Doe',
email: 'john@example.com',
age: 30
});
await user.save();
}
async function findUsers() {
const users = await User.find({ age: { $gte: 18 } });
return users;
}
// 2. POSTGRESQL with pg
const { Pool } = require('pg');
const pool = new Pool({
user: 'postgres',
host: 'localhost',
database: 'mydb',
password: 'password',
port: 5432,
max: 20, // Maximum number of clients in pool
idleTimeoutMillis: 30000,
connectionTimeoutMillis: 2000
});
// Query with pool
async function queryDatabase() {
const client = await pool.connect();
try {
const result = await client.query('SELECT * FROM users WHERE age > $1', [18]);
return result.rows;
} finally {
client.release();
}
}
// Transaction example
async function transferMoney(fromId, toId, amount) {
const client = await pool.connect();
try {
await client.query('BEGIN');
await client.query(
'UPDATE accounts SET balance = balance - $1 WHERE id = $2',
[amount, fromId]
);
await client.query(
'UPDATE accounts SET balance = balance + $1 WHERE id = $2',
[amount, toId]
);
await client.query('COMMIT');
} catch (err) {
await client.query('ROLLBACK');
throw err;
} finally {
client.release();
}
}
// 3. MYSQL with mysql2
const mysql = require('mysql2/promise');
// Create pool
const mysqlPool = mysql.createPool({
host: 'localhost',
user: 'root',
password: 'password',
database: 'mydb',
waitForConnections: true,
connectionLimit: 10,
queueLimit: 0
});
// Execute queries
async function getUsers() {
const [rows, fields] = await mysqlPool.execute(
'SELECT * FROM users WHERE age > ?',
[18]
);
return rows;
}
// Prepared statements
async function insertUser(name, email, age) {
const [result] = await mysqlPool.execute(
'INSERT INTO users (name, email, age) VALUES (?, ?, ?)',
[name, email, age]
);
return result.insertId;
}
// 4. Connection management with Express
const express = require('express');
const app = express();
// Database middleware
app.use(async (req, res, next) => {
try {
req.db = await pool.connect();
res.on('finish', () => {
req.db.release();
});
next();
} catch (err) {
next(err);
}
});
app.get('/users', async (req, res) => {
try {
const result = await req.db.query('SELECT * FROM users');
res.json(result.rows);
} catch (err) {
res.status(500).json({ error: err.message });
}
});
// Graceful shutdown
process.on('SIGINT', async () => {
await pool.end();
await mysqlPool.end();
await mongoose.connection.close();
console.log('Database connections closed');
process.exit(0);
});
app.listen(3000);

How to read command line arguments in Node.js?

Command line arguments can be accessed through process.argv, or using libraries like yargs or commander for more complex argument parsing.
// 1. Using process.argv
// Run: node app.js arg1 arg2 --flag=value
console.log(process.argv);
// Output: ['node', '/path/to/app.js', 'arg1', 'arg2', '--flag=value']
// Access arguments
const args = process.argv.slice(2); // Remove 'node' and script path
console.log('Arguments:', args);
// Parse flags manually
function parseArgs(args) {
const parsed = {};
args.forEach(arg => {
if (arg.startsWith('--')) {
const [key, value] = arg.slice(2).split('=');
parsed[key] = value || true;
}
});
return parsed;
}
const options = parseArgs(process.argv.slice(2));
console.log('Options:', options);
// 2. Using yargs
const yargs = require('yargs/yargs');
const { hideBin } = require('yargs/helpers');
const argv = yargs(hideBin(process.argv))
.option('port', {
alias: 'p',
type: 'number',
description: 'Port to run server on',
default: 3000
})
.option('host', {
alias: 'h',
type: 'string',
description: 'Host address',
default: 'localhost'
})
.option('verbose', {
alias: 'v',
type: 'boolean',
description: 'Enable verbose logging'
})
.command('start', 'Start the server', {}, (argv) => {
console.log(`Starting server on ${argv.host}:${argv.port}`);
})
.command('stop', 'Stop the server', {}, () => {
console.log('Stopping server');
})
.help()
.argv;
// Run: node app.js start --port=8080 --verbose
// 3. Using commander
const { program } = require('commander');
program
.version('1.0.0')
.description('A sample CLI application')
.option('-p, --port <number>', 'Port number', '3000')
.option('-h, --host <string>', 'Host address', 'localhost')
.option('-v, --verbose', 'Enable verbose output')
.option('-d, --debug', 'Enable debug mode');
program
.command('start')
.description('Start the server')
.action(() => {
const options = program.opts();
console.log(`Starting server on ${options.host}:${options.port}`);
if (options.verbose) {
console.log('Verbose mode enabled');
}
});
program
.command('create <name>')
.description('Create a new resource')
.option('-t, --type <type>', 'Resource type')
.action((name, options) => {
console.log(`Creating ${options.type || 'default'}: ${name}`);
});
program.parse(process.argv);
// 4. Environment variables with dotenv
require('dotenv').config();
const config = {
port: process.env.PORT || 3000,
dbUrl: process.env.DB_URL || 'mongodb://localhost:27017/mydb',
jwtSecret: process.env.JWT_SECRET
};
console.log('Configuration:', config);
// 5. Combined approach
const express = require('express');
const app = express();
const PORT = process.argv[2] || process.env.PORT || 3000;
app.get('/', (req, res) => {
res.send('Hello World');
});
app.listen(PORT, () => {
console.log(`Server running on port ${PORT}`);
});
// Run: node app.js 8080

What are child processes in Node.js?

Child processes allow Node.js to execute other programs or scripts in separate processes. The child_process module provides four main methods: exec(), execFile(), spawn(), and fork().
const { exec, execFile, spawn, fork } = require('child_process');
// 1. exec() - Execute shell command with buffered output
exec('ls -la', (error, stdout, stderr) => {
if (error) {
console.error(`Error: ${error.message}`);
return;
}
if (stderr) {
console.error(`stderr: ${stderr}`);
return;
}
console.log(`stdout: ${stdout}`);
});
// With options
exec('cat *.js | wc -l', {
cwd: '/home/user/projects',
env: { ...process.env, NODE_ENV: 'production' },
maxBuffer: 1024 * 1024
}, (error, stdout, stderr) => {
console.log('Number of lines:', stdout.trim());
});
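// A hedged aside (not in the original): exec() can be promisified with util.promisify
// for async/await use; the returned promise resolves with { stdout, stderr }.
const util = require('util');
const execAsync = util.promisify(exec);
async function countLines() {
// 'app.js' is an illustrative file name
const { stdout } = await execAsync('wc -l app.js');
console.log('Lines in app.js:', stdout.trim());
}
countLines().catch(console.error);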
// 2. execFile() - Execute file without shell (more secure)
execFile('node', ['--version'], (error, stdout, stderr) => {
if (error) {
console.error(`Error: ${error.message}`);
return;
}
console.log('Node version:', stdout);
});
// 3. spawn() - Stream-based for large data
const ls = spawn('ls', ['-lh', '/usr']);
ls.stdout.on('data', (data) => {
console.log(`stdout: ${data}`);
});
ls.stderr.on('data', (data) => {
console.error(`stderr: ${data}`);
});
ls.on('close', (code) => {
console.log(`Process exited with code ${code}`);
});
// Piping with spawn
const find = spawn('find', ['.', '-type', 'f']);
const wc = spawn('wc', ['-l']);
find.stdout.pipe(wc.stdin);
wc.stdout.on('data', (data) => {
console.log(`Number of files: ${data}`);
});
// 4. fork() - For Node.js scripts with IPC
// worker.js
// process.on('message', (msg) => {
// const result = heavyComputation(msg.data);
// process.send({ result });
// });
const worker = fork('./worker.js');
worker.on('message', (msg) => {
console.log('Result from worker:', msg.result);
});
worker.send({ data: [1, 2, 3, 4, 5] });
// 5. Advanced: Process pool for CPU-intensive tasks
class ProcessPool {
constructor(script, poolSize = 4) {
this.script = script;
this.poolSize = poolSize;
this.workers = [];
this.queue = [];
for (let i = 0; i < poolSize; i++) {
this.workers.push({
child: fork(script),
busy: false
});
}
}
exec(data) {
return new Promise((resolve, reject) => {
const worker = this.workers.find(w => !w.busy);
if (worker) {
worker.busy = true;
worker.child.once('message', (result) => {
worker.busy = false;
resolve(result);
this.processQueue();
});
worker.child.send(data);
} else {
this.queue.push({ data, resolve, reject });
}
});
}
processQueue() {
if (this.queue.length === 0) return;
const worker = this.workers.find(w => !w.busy);
if (!worker) return;
const { data, resolve, reject } = this.queue.shift();
worker.busy = true;
worker.child.once('message', (result) => {
worker.busy = false;
resolve(result);
this.processQueue();
});
worker.child.send(data);
}
destroy() {
this.workers.forEach(w => w.child.kill());
}
}
// Usage
const pool = new ProcessPool('./worker.js', 4);
async function processTasks() {
const results = await Promise.all([
pool.exec({ task: 'A' }),
pool.exec({ task: 'B' }),
pool.exec({ task: 'C' }),
pool.exec({ task: 'D' })
]);
console.log('All results:', results);
pool.destroy();
}
// 6. Handling child process errors
const child = spawn('bad-command');
child.on('error', (error) => {
console.error('Failed to start child process:', error);
});
child.on('exit', (code, signal) => {
if (code !== 0) {
console.error(`Process exited with code ${code}`);
}
if (signal) {
console.error(`Process killed with signal ${signal}`);
}
});
// Timeout for child processes
const longRunning = spawn('some-long-task');
const timeout = setTimeout(() => {
longRunning.kill('SIGTERM');
console.log('Process terminated due to timeout');
}, 5000);
longRunning.on('exit', () => {
clearTimeout(timeout);
});

This comprehensive guide covers 41 essential Node.js interview questions ranging from basic concepts to advanced topics. Key areas include:
- Fundamentals: Node.js architecture, libuv, modules, npm, REPL
- Asynchronous Programming: Event loop, callbacks, promises, async/await
- Core Modules: fs, http, crypto, streams, timers, child processes
- Web Development: Express, middleware, sessions, CORS, authentication
- Advanced Topics: Clustering, TLS, database connections, file uploads
- Best Practices: Error handling, security, performance optimization
Understanding these concepts, especially the role of libuv in Node.js's architecture, will help you build scalable, efficient, and secure Node.js applications.