JavaScript and ReactJS (Front-End)

  • What is the difference between var, let, and const in JavaScript?
  • How does the JavaScript event loop work?
  • Explain closures in JavaScript with an example.
  • What are promises, and how do they differ from async/await?
  • What is the difference between shallow and deep copying in JavaScript?
  • Explain the difference between == and === in JavaScript.
  • What is the virtual DOM, and how does React use it?
  • Explain the purpose of React hooks. How does useEffect work?
  • What is the difference between controlled and uncontrolled components in React?
  • What is the significance of key props in React lists?

NodeJS and Backend Development

  • What is event-driven architecture in NodeJS?
  • How does NodeJS handle asynchronous operations?
  • What are middleware functions in ExpressJS?
  • How does authentication work in NodeJS? Explain JWT vs. OAuth.
  • What are WebSockets, and when would you use them?
  • What is the difference between monolithic and microservices architectures?
  • How does NodeJS handle memory management?

Databases and System Design

  • What is the difference between SQL and NoSQL databases?
  • How does indexing work in databases?
  • What are ACID properties in a database?
  • How would you scale an API to handle millions of requests per second?
  • Design an elevator system
  • Design a parking lot
  • What is caching, and how does it improve performance?
  • Design a rate limiter
  • Design a logging system
  • Design a pastebin / code-sharing app

DSA

  • Given an array, find the maximum sum of any contiguous subarray.
  • Find the first non-repeating character in a string.
  • Detect if a linked list contains a cycle.
  • Merge k sorted linked lists into one sorted list.
  • Given an array and a number k, return the max in every sliding window of size k.
  • Return the longest palindromic substring in a given string.
  • Place k cows in n stalls to maximize the minimum distance between any two cows.
  • Given course prerequisites, determine if all courses can be finished.

Technical Interview Questions & Answers

JavaScript and ReactJS (Front-End)

What is the difference between var, let, and const in JavaScript?

var:

  • Function-scoped or globally-scoped
  • Hoisted and initialized with undefined
  • Can be re-declared and re-assigned
  • Creates property on global object when declared globally

let:

  • Block-scoped
  • Hoisted but not initialized (Temporal Dead Zone)
  • Can be re-assigned but not re-declared in same scope
  • No global object property

const:

  • Block-scoped
  • Hoisted but not initialized (Temporal Dead Zone)
  • Cannot be re-assigned or re-declared
  • Must be initialized at declaration
  • Objects/arrays are still mutable

function example() {
  console.log(x); // undefined (hoisted)
  var x = 1;

  console.log(y); // ReferenceError (TDZ)
  let y = 2;

  const z = 3;
  z = 4; // TypeError
}

How does the JavaScript event loop work?

The event loop is JavaScript's concurrency model that handles asynchronous operations:

  1. Call Stack: Executes synchronous code (LIFO)
  2. Web APIs: Handle async operations (setTimeout, DOM events, HTTP requests)
  3. Callback Queue: Stores callbacks from completed async operations (FIFO)
  4. Microtask Queue: Higher priority queue for Promises, queueMicrotask (FIFO)
  5. Event Loop: Moves tasks from queues to call stack when stack is empty

Execution Order:

  1. Execute all synchronous code
  2. Process all microtasks (Promises, async/await)
  3. Process one macrotask (setTimeout, setInterval, I/O)
  4. Process all microtasks again
  5. Repeat steps 3-4

console.log('1'); // Sync

setTimeout(() => console.log('2'), 0); // Macrotask

Promise.resolve().then(() => console.log('3')); // Microtask

console.log('4'); // Sync

// Output: 1, 4, 3, 2

Explain closures in JavaScript with an example.

A closure is a function that retains access to variables from its outer (lexical) scope even after the outer function has returned.

function createCounter() {
  let count = 0;

  return {
    increment: () => ++count,
    decrement: () => --count,
    getValue: () => count
  };
}

const counter = createCounter();
console.log(counter.increment()); // 1
console.log(counter.increment()); // 2
console.log(counter.getValue()); // 2

// count is private and can't be accessed directly
console.log(counter.count); // undefined

Use cases: Module pattern, callbacks, partial application, data privacy.
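The partial-application use case relies on the same mechanism — a closure keeping the preset arguments alive (a minimal sketch; `partial`, `add`, and `addTen` are illustrative names, not a library API):

```javascript
// Partial application via a closure: `preset` stays alive inside
// the returned function even after `partial` has returned.
function partial(fn, ...preset) {
  return (...rest) => fn(...preset, ...rest);
}

const add = (a, b, c) => a + b + c;
const addTen = partial(add, 10);

console.log(addTen(1, 2)); // 13
```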

What are promises, and how do they differ from async/await?

Promises: Objects representing eventual completion/failure of async operations.

States: Pending → Fulfilled/Rejected

// Promise
function fetchData() {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      const success = Math.random() > 0.5;
      success ? resolve('Data') : reject('Error');
    }, 1000);
  });
}

fetchData()
  .then(data => console.log(data))
  .catch(err => console.error(err));

Async/Await: Syntactic sugar over promises, makes async code look synchronous.

// Async/Await
async function getData() {
  try {
    const data = await fetchData();
    console.log(data);
  } catch (err) {
    console.error(err);
  }
}

Key Differences:

  • Async/await is more readable for sequential operations
  • Promises better for parallel operations with Promise.all()
  • Error handling: .catch() vs try/catch
  • Async functions always return promises
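The parallel-vs-sequential point can be made concrete with two stand-in async operations (`fetchA`/`fetchB` are placeholders for real I/O, not actual APIs):

```javascript
// Two stand-in async operations (~50ms each).
const fetchA = () => new Promise(res => setTimeout(() => res('A'), 50));
const fetchB = () => new Promise(res => setTimeout(() => res('B'), 50));

async function sequential() {
  const a = await fetchA(); // waits ~50ms
  const b = await fetchB(); // then ~50ms more → ~100ms total
  return [a, b];
}

function parallel() {
  // Both timers start immediately → ~50ms total
  return Promise.all([fetchA(), fetchB()]);
}

parallel().then(result => console.log(result)); // [ 'A', 'B' ]
```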

What is the difference between shallow and deep copying in JavaScript?

Shallow Copy: Copies only the first level of properties.

const original = { a: 1, b: { c: 2 } };

// Shallow copy methods
const shallow1 = { ...original };
const shallow2 = Object.assign({}, original);
// (Note: Object.create(original) is NOT a copy — it creates an empty object with original as its prototype)

shallow1.b.c = 3;
console.log(original.b.c); // 3 (mutated!)

Deep Copy: Recursively copies all nested objects/arrays.

// Deep copy methods
const deep1 = JSON.parse(JSON.stringify(original)); // Limited
const deep2 = structuredClone(original); // Modern browsers

// Custom implementation
function deepClone(obj) {
  if (obj === null || typeof obj !== 'object') return obj;
  if (obj instanceof Date) return new Date(obj);
  if (obj instanceof Array) return obj.map(deepClone);

  const cloned = {};
  Object.keys(obj).forEach(key => {
    cloned[key] = deepClone(obj[key]);
  });
  return cloned;
}

Explain the difference between == and === in JavaScript.

== (Loose Equality): Performs type coercion before comparing.

=== (Strict Equality): No coercion; both type and value must match.

// Type coercion examples
console.log(5 == '5');    // true
console.log(5 === '5');   // false

console.log(null == undefined);  // true
console.log(null === undefined); // false

console.log(0 == false);   // true
console.log(0 === false);  // false

console.log('' == false);  // true
console.log('' === false); // false

// Always use === unless you specifically need coercion

What is the virtual DOM, and how does React use it?

Virtual DOM: JavaScript representation of the actual DOM. It's a programming concept where a "virtual" representation of UI is kept in memory and synced with the "real" DOM.

React's Process:

  1. State changes trigger re-render
  2. New virtual DOM tree is created
  3. Diffing algorithm compares old vs new virtual DOM
  4. Reconciliation updates only changed parts of real DOM

// Virtual DOM representation
const virtualElement = {
  type: 'div',
  props: {
    className: 'container',
    children: [
      { type: 'h1', props: { children: 'Hello' } },
      { type: 'p', props: { children: 'World' } }
    ]
  }
};

// React Fiber (current implementation) uses:
// - Incremental rendering
// - Prioritization of updates
// - Interruptible work

Benefits:

  • Predictable updates
  • Batch DOM updates
  • Cross-browser compatibility
  • Enables features like time-travel debugging
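The diffing step can be illustrated with a toy comparison over plain `{ type, props }` nodes like the one above — a deliberate simplification, not React's actual reconciliation code:

```javascript
// Toy diff over plain virtual-DOM nodes (strings are text nodes).
// Returns a "patch" describing the change — a simplification,
// NOT React's real algorithm.
function diff(oldNode, newNode) {
  if (oldNode === undefined) return { op: 'CREATE', node: newNode };
  if (newNode === undefined) return { op: 'REMOVE' };
  if (typeof oldNode === 'string' || typeof newNode === 'string') {
    return oldNode === newNode ? { op: 'NONE' } : { op: 'TEXT', text: newNode };
  }
  if (oldNode.type !== newNode.type) return { op: 'REPLACE', node: newNode };
  return { op: 'NONE' }; // same type: React would recurse into props/children
}

console.log(diff({ type: 'h1' }, { type: 'p' }).op); // REPLACE
console.log(diff('Hello', 'World').op);              // TEXT
```

Real reconciliation also recurses into children (using keys, see below) and batches the resulting patches into a single round of real-DOM writes.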

Explain the purpose of React hooks. How does useEffect work?

Hooks: Functions that let you use state and other React features in functional components.

Rules:

  1. Only call at top level (not in loops/conditions)
  2. Only call from React functions

import { useState, useEffect, useCallback, useMemo } from 'react';

function UserProfile({ userId }) {
  const [user, setUser] = useState(null);
  const [loading, setLoading] = useState(true);

  // useEffect for side effects
  useEffect(() => {
    let cancelled = false;

    async function fetchUser() {
      setLoading(true);
      try {
        const userData = await api.getUser(userId);
        if (!cancelled) {
          setUser(userData);
        }
      } catch (error) {
        if (!cancelled) {
          console.error('Failed to fetch user:', error);
        }
      } finally {
        if (!cancelled) {
          setLoading(false);
        }
      }
    }

    fetchUser();

    // Cleanup function
    return () => {
      cancelled = true;
    };
  }, [userId]); // Dependency array

  const handleUpdate = useCallback((updates) => {
    setUser(prev => ({ ...prev, ...updates }));
  }, []);

  const displayName = useMemo(() => {
    return user ? `${user.firstName} ${user.lastName}` : '';
  }, [user]);

  if (loading) return <div>Loading...</div>;

  return (
    <div>
      <h1>{displayName}</h1>
      {/* ... */}
    </div>
  );
}

useEffect Patterns:

  • No deps: runs after every render
  • Empty deps []: runs once after mount
  • With deps [userId]: runs when dependencies change
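Under the hood, React decides whether to re-run an effect by shallow-comparing each dependency with `Object.is` — sketched here as a standalone helper (an illustration, not React's source):

```javascript
// Sketch of React's dependency check (not the real source):
// re-run the effect if any entry differs by Object.is.
function depsChanged(prevDeps, nextDeps) {
  if (prevDeps == null) return true; // no deps array → run after every render
  return nextDeps.length !== prevDeps.length ||
    nextDeps.some((dep, i) => !Object.is(dep, prevDeps[i]));
}

console.log(depsChanged([1, 'a'], [1, 'a'])); // false → effect skipped
console.log(depsChanged([1], [2]));           // true  → effect re-runs
console.log(depsChanged([], []));             // false → "run once" behavior
```

Note that a fresh inline object or array literal is never `Object.is`-equal to the previous render's literal, which is why such values in the deps array make an effect run on every render.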

What is the difference between controlled and uncontrolled components in React?

Controlled Components: React controls the form data via state.

function ControlledForm() {
  const [email, setEmail] = useState('');
  const [password, setPassword] = useState('');

  const handleSubmit = (e) => {
    e.preventDefault();
    console.log({ email, password });
  };

  return (
    <form onSubmit={handleSubmit}>
      <input
        type="email"
        value={email} // Controlled by React state
        onChange={(e) => setEmail(e.target.value)}
      />
      <input
        type="password"
        value={password}
        onChange={(e) => setPassword(e.target.value)}
      />
      <button type="submit">Submit</button>
    </form>
  );
}

Uncontrolled Components: DOM handles the form data, React uses refs.

function UncontrolledForm() {
  const emailRef = useRef();
  const passwordRef = useRef();

  const handleSubmit = (e) => {
    e.preventDefault();
    console.log({
      email: emailRef.current.value,
      password: passwordRef.current.value
    });
  };

  return (
    <form onSubmit={handleSubmit}>
      <input
        type="email"
        ref={emailRef}
        defaultValue="" // Default value, not controlled
      />
      <input
        type="password"
        ref={passwordRef}
        defaultValue=""
      />
      <button type="submit">Submit</button>
    </form>
  );
}

When to use:

  • Controlled: Complex validation, conditional rendering, multiple forms
  • Uncontrolled: Simple forms, integrating with non-React code

What is the significance of key props in React lists?

Keys help React identify which items have changed, added, or removed. They should be stable, predictable, and unique among siblings.

// ❌ Bad - using array index
function BadList({ items }) {
  return (
    <ul>
      {items.map((item, index) => (
        <li key={index}>{item.name}</li> // Problems with reordering
      ))}
    </ul>
  );
}

// ✅ Good - using stable unique identifier
function GoodList({ items }) {
  return (
    <ul>
      {items.map((item) => (
        <li key={item.id}>{item.name}</li>
      ))}
    </ul>
  );
}

// ✅ Complex example with state
function TodoList() {
  const [todos, setTodos] = useState([
    { id: '1', text: 'Learn React', completed: false },
    { id: '2', text: 'Build app', completed: false }
  ]);

  return (
    <ul>
      {todos.map((todo) => (
        <TodoItem
          key={todo.id} // Preserves component state during reorders
          todo={todo}
          onToggle={(id) => {
            setTodos(todos.map(t =>
              t.id === id ? { ...t, completed: !t.completed } : t
            ));
          }}
        />
      ))}
    </ul>
  );
}

Without proper keys: React may reuse components incorrectly, causing state issues and performance problems.


NodeJS and Backend Development

What is event-driven architecture in NodeJS?

Event-driven architecture uses events to trigger and communicate between decoupled services. NodeJS is built around this pattern using the EventEmitter class.

const EventEmitter = require('events');

class OrderService extends EventEmitter {
  async createOrder(orderData) {
    try {
      // Process order
      const order = await this.saveOrder(orderData);

      // Emit events for different services
      this.emit('order.created', order);
      this.emit('inventory.reserve', order.items);
      this.emit('payment.process', order.payment);

      return order;
    } catch (error) {
      this.emit('order.failed', { orderData, error });
      throw error;
    }
  }

  async saveOrder(data) {
    // Database logic
    return { id: Date.now(), ...data };
  }
}

// Service implementations
class InventoryService {
  constructor(orderService) {
    orderService.on('inventory.reserve', this.reserveItems.bind(this));
  }

  async reserveItems(items) {
    console.log('Reserving items:', items);
    // Reserve inventory logic
  }
}

class PaymentService {
  constructor(orderService) {
    orderService.on('payment.process', this.processPayment.bind(this));
  }

  async processPayment(paymentData) {
    console.log('Processing payment:', paymentData);
    // Payment processing logic
  }
}

// Usage
const orderService = new OrderService();
const inventoryService = new InventoryService(orderService);
const paymentService = new PaymentService(orderService);

orderService.createOrder({
  items: [{ id: 1, quantity: 2 }],
  payment: { amount: 100, method: 'card' }
});

Benefits:

  • Loose coupling between services
  • Scalability and maintainability
  • Easy to add new features
  • Natural fit for microservices

How does NodeJS handle asynchronous operations?

NodeJS uses an event-driven, non-blocking I/O model with a single-threaded event loop and thread pool for I/O operations.

const fs = require('fs');            // callback and stream APIs
const fsp = require('fs').promises;  // promise-based API
const { Worker } = require('worker_threads');

// 1. Callback Pattern (older)
function readFileCallback(filename, callback) {
  fs.readFile(filename, (err, data) => {
    if (err) return callback(err);
    callback(null, data.toString());
  });
}

// 2. Promise Pattern
async function readFilePromise(filename) {
  // A try/catch that only rethrows adds nothing — let errors propagate
  const data = await fsp.readFile(filename);
  return data.toString();
}

// 3. Stream Pattern for large files
const stream = require('stream');
const { pipeline } = require('stream/promises');

async function processLargeFile(inputFile, outputFile) {
  const readStream = fs.createReadStream(inputFile);
  const writeStream = fs.createWriteStream(outputFile);

  const transform = new stream.Transform({
    transform(chunk, encoding, callback) {
      // Process chunk
      const processed = chunk.toString().toUpperCase();
      callback(null, processed);
    }
  });

  await pipeline(readStream, transform, writeStream);
}

// 4. CPU-intensive tasks with Worker Threads
function fibonacciWorker(n) {
  return new Promise((resolve, reject) => {
    const worker = new Worker(`
      const { parentPort } = require('worker_threads');

      function fibonacci(n) {
        if (n < 2) return n;
        return fibonacci(n - 1) + fibonacci(n - 2);
      }

      parentPort.on('message', (n) => {
        const result = fibonacci(n);
        parentPort.postMessage(result);
      });
    `, { eval: true });

    worker.postMessage(n);
    worker.on('message', resolve);
    worker.on('error', reject);
  });
}

// Usage example
async function main() {
  // Parallel async operations
  const [file1, file2, fibResult] = await Promise.all([
    readFilePromise('file1.txt'),
    readFilePromise('file2.txt'),
    fibonacciWorker(35)
  ]);

  console.log('All operations completed');
}

Key Components:

  • Event Loop: Manages execution of callbacks
  • Thread Pool: Handles file I/O, DNS, CPU-intensive tasks
  • libuv: C++ library providing async I/O

What are middleware functions in ExpressJS?

Middleware functions execute during the request-response cycle. They can modify req/res objects, end the cycle, or call next middleware.

const express = require('express');
const jwt = require('jsonwebtoken');
const rateLimit = require('express-rate-limit');

const app = express();

// 1. Built-in middleware
app.use(express.json()); // Parse JSON bodies
app.use(express.static('public')); // Serve static files

// 2. Third-party middleware
app.use(rateLimit({
  windowMs: 15 * 60 * 1000, // 15 minutes
  max: 100 // limit each IP to 100 requests per windowMs
}));

// 3. Custom middleware
const logger = (req, res, next) => {
  console.log(`${new Date().toISOString()} - ${req.method} ${req.path}`);
  next(); // Must call next() to continue
};

const authenticate = async (req, res, next) => {
  try {
    const token = req.headers.authorization?.split(' ')[1];
    if (!token) {
      return res.status(401).json({ error: 'No token provided' });
    }

    const decoded = jwt.verify(token, process.env.JWT_SECRET);
    req.user = decoded;
    next();
  } catch (error) {
    res.status(401).json({ error: 'Invalid token' });
  }
};

const authorize = (roles) => {
  return (req, res, next) => {
    if (!roles.includes(req.user.role)) {
      return res.status(403).json({ error: 'Insufficient permissions' });
    }
    next();
  };
};

// Error handling middleware
const errorHandler = (err, req, res, next) => {
  console.error(err.stack);

  if (err.name === 'ValidationError') {
    return res.status(400).json({ error: err.message });
  }

  if (err.name === 'CastError') {
    return res.status(400).json({ error: 'Invalid ID format' });
  }

  res.status(500).json({ error: 'Internal server error' });
};

// Apply middleware
app.use(logger);

// Route-specific middleware
app.get('/protected', authenticate, (req, res) => {
  res.json({ message: 'Access granted', user: req.user });
});

app.get('/admin', authenticate, authorize(['admin']), (req, res) => {
  res.json({ message: 'Admin access granted' });
});

// Error handler must be last
app.use(errorHandler);

Types:

  • Application-level: app.use()
  • Router-level: router.use()
  • Error-handling: 4 parameters (err, req, res, next)
  • Third-party: External packages

How does authentication work in NodeJS? Explain JWT vs. OAuth.

JWT (JSON Web Tokens): Stateless authentication using signed tokens.

const jwt = require('jsonwebtoken');
const bcrypt = require('bcrypt');

// JWT Implementation
class AuthService {
  static generateTokens(user) {
    const payload = { id: user.id, email: user.email, role: user.role };

    const accessToken = jwt.sign(payload, process.env.JWT_SECRET, {
      expiresIn: '15m'
    });

    const refreshToken = jwt.sign(payload, process.env.REFRESH_SECRET, {
      expiresIn: '7d'
    });

    return { accessToken, refreshToken };
  }

  static verifyToken(token) {
    return jwt.verify(token, process.env.JWT_SECRET);
  }

  static async hashPassword(password) {
    return bcrypt.hash(password, 12);
  }

  static async comparePassword(password, hash) {
    return bcrypt.compare(password, hash);
  }
}

// Login route
app.post('/login', async (req, res) => {
  try {
    const { email, password } = req.body;

    const user = await User.findOne({ email });
    if (!user) {
      return res.status(401).json({ error: 'Invalid credentials' });
    }

    const isValid = await AuthService.comparePassword(password, user.password);
    if (!isValid) {
      return res.status(401).json({ error: 'Invalid credentials' });
    }

    const tokens = AuthService.generateTokens(user);

    // Store refresh token in httpOnly cookie
    res.cookie('refreshToken', tokens.refreshToken, {
      httpOnly: true,
      secure: process.env.NODE_ENV === 'production',
      sameSite: 'strict',
      maxAge: 7 * 24 * 60 * 60 * 1000 // 7 days
    });

    res.json({ accessToken: tokens.accessToken, user: { id: user.id, email: user.email } });
  } catch (error) {
    res.status(500).json({ error: 'Internal server error' });
  }
});

OAuth 2.0: Delegated authorization framework.

const passport = require('passport');
const GoogleStrategy = require('passport-google-oauth20').Strategy;

// OAuth with Passport.js
passport.use(new GoogleStrategy({
  clientID: process.env.GOOGLE_CLIENT_ID,
  clientSecret: process.env.GOOGLE_CLIENT_SECRET,
  callbackURL: "/auth/google/callback"
}, async (accessToken, refreshToken, profile, done) => {
  try {
    let user = await User.findOne({ googleId: profile.id });

    if (!user) {
      user = await User.create({
        googleId: profile.id,
        email: profile.emails[0].value,
        name: profile.displayName
      });
    }

    return done(null, user);
  } catch (error) {
    return done(error, null);
  }
}));

// Routes
app.get('/auth/google',
  passport.authenticate('google', { scope: ['profile', 'email'] })
);

app.get('/auth/google/callback',
  passport.authenticate('google', { session: false }),
  (req, res) => {
    const tokens = AuthService.generateTokens(req.user);
    res.redirect(`${process.env.CLIENT_URL}/dashboard?token=${tokens.accessToken}`);
  }
);

JWT vs OAuth:

| Aspect        | JWT                            | OAuth 2.0                    |
|---------------|--------------------------------|------------------------------|
| Purpose       | Authentication & authorization | Authorization framework      |
| Stateless     | Yes                            | Depends on implementation    |
| Token storage | Client-side                    | Server can store tokens      |
| Revocation    | Difficult (use short expiry)   | Easy (revoke at auth server) |
| Use case      | Internal APIs, microservices   | Third-party integration      |
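The client-side-storage row has a practical consequence worth demonstrating: a JWT payload is only base64url-encoded, not encrypted, so anyone holding the token can read it (the token below is hand-built for illustration; the signature is faked):

```javascript
// A JWT is three base64url segments: header.payload.signature.
// The payload is encoded, NOT encrypted — never put secrets in it.
const enc = obj => Buffer.from(JSON.stringify(obj)).toString('base64url');

const header  = enc({ alg: 'HS256', typ: 'JWT' });
const payload = enc({ id: 42, role: 'admin' });
const token   = `${header}.${payload}.fake-signature`; // signature faked for the demo

// Anyone can decode the payload without knowing the secret:
const claims = JSON.parse(Buffer.from(token.split('.')[1], 'base64url').toString());
console.log(claims); // { id: 42, role: 'admin' }
```

The signature only proves integrity and origin; confidentiality requires TLS in transit and care about what goes into the claims.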

What are WebSockets, and when would you use them?

WebSockets provide full-duplex communication between client and server over a single TCP connection.

const WebSocket = require('ws');
const jwt = require('jsonwebtoken');

class WebSocketServer {
  constructor(port) {
    this.wss = new WebSocket.Server({
      port,
      verifyClient: this.authenticateClient.bind(this)
    });

    this.clients = new Map(); // userId -> WebSocket
    this.rooms = new Map();   // roomId -> Set of userIds

    this.wss.on('connection', this.handleConnection.bind(this));
  }

  authenticateClient(info) {
    try {
      const token = new URL(info.req.url, 'http://localhost').searchParams.get('token');
      const user = jwt.verify(token, process.env.JWT_SECRET);
      info.req.user = user;
      return true;
    } catch {
      return false;
    }
  }

  handleConnection(ws, req) {
    const user = req.user;
    console.log(`User ${user.id} connected`);

    // Store client connection
    this.clients.set(user.id, ws);

    ws.on('message', (data) => {
      try {
        const message = JSON.parse(data);
        this.handleMessage(user, message);
      } catch (error) {
        ws.send(JSON.stringify({ error: 'Invalid message format' }));
      }
    });

    ws.on('close', () => {
      console.log(`User ${user.id} disconnected`);
      this.clients.delete(user.id);

      // Remove from all rooms
      for (const [roomId, users] of this.rooms.entries()) {
        users.delete(user.id);
        if (users.size === 0) {
          this.rooms.delete(roomId);
        }
      }
    });

    ws.on('error', (error) => {
      console.error(`WebSocket error for user ${user.id}:`, error);
    });

    // Send welcome message
    ws.send(JSON.stringify({
      type: 'connected',
      message: 'Welcome to the chat!'
    }));
  }

  handleMessage(user, message) {
    switch (message.type) {
      case 'join_room':
        this.joinRoom(user.id, message.roomId);
        break;

      case 'leave_room':
        this.leaveRoom(user.id, message.roomId);
        break;

      case 'chat_message':
        this.broadcastToRoom(message.roomId, {
          type: 'chat_message',
          user: { id: user.id, name: user.name },
          message: message.content,
          timestamp: new Date().toISOString()
        });
        break;

      case 'private_message':
        this.sendPrivateMessage(user.id, message.targetUserId, message.content);
        break;
    }
  }

  joinRoom(userId, roomId) {
    if (!this.rooms.has(roomId)) {
      this.rooms.set(roomId, new Set());
    }

    this.rooms.get(roomId).add(userId);

    // Notify room members
    this.broadcastToRoom(roomId, {
      type: 'user_joined',
      userId,
      roomId
    });
  }

  leaveRoom(userId, roomId) {
    const room = this.rooms.get(roomId);
    if (room) {
      room.delete(userId);

      // Notify room members
      this.broadcastToRoom(roomId, {
        type: 'user_left',
        userId,
        roomId
      });
    }
  }

  broadcastToRoom(roomId, message) {
    const room = this.rooms.get(roomId);
    if (room) {
      room.forEach(userId => {
        const client = this.clients.get(userId);
        if (client && client.readyState === WebSocket.OPEN) {
          client.send(JSON.stringify(message));
        }
      });
    }
  }

  sendPrivateMessage(fromUserId, toUserId, content) {
    const targetClient = this.clients.get(toUserId);
    if (targetClient && targetClient.readyState === WebSocket.OPEN) {
      targetClient.send(JSON.stringify({
        type: 'private_message',
        from: fromUserId,
        content,
        timestamp: new Date().toISOString()
      }));
    }
  }
}

// Start WebSocket server
const wsServer = new WebSocketServer(8080);

// Integration with HTTP server for scaling
const express = require('express');
const http = require('http');

const app = express();
const server = http.createServer(app);

// Attach WebSocket server to HTTP server
const wss = new WebSocket.Server({ server });

server.listen(3000, () => {
  console.log('Server running on port 3000');
});

Use Cases:

  • Real-time chat applications
  • Live gaming
  • Collaborative editing
  • Real-time dashboards
  • Trading platforms
  • Live notifications

When NOT to use WebSockets:

  • Simple request-response patterns
  • Infrequent updates (use polling/SSE)
  • High bandwidth overhead for small messages
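For the infrequent-update case, Server-Sent Events are often enough: a one-way server-to-client stream over plain HTTP. A minimal frame formatter (a hypothetical helper, with the Express wiring it would feed shown in comments):

```javascript
// SSE frames are plain text: an optional "event:" line, a "data:" line,
// and a blank line terminating the frame.
function formatSSE(data, event) {
  const lines = [];
  if (event) lines.push(`event: ${event}`);
  lines.push(`data: ${JSON.stringify(data)}`);
  return lines.join('\n') + '\n\n';
}

console.log(JSON.stringify(formatSSE({ price: 101.5 }, 'tick')));

// In an Express handler you would write:
//   res.writeHead(200, {
//     'Content-Type': 'text/event-stream',
//     'Cache-Control': 'no-cache',
//     'Connection': 'keep-alive'
//   });
//   res.write(formatSSE({ price: 101.5 }, 'tick'));
```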

What is the difference between monolithic and microservices architectures?

Monolithic Architecture: Single deployable unit containing all functionality.

// Monolithic Express App
const express = require('express');
const app = express();

// All services in one application
class UserService {
  static async createUser(userData) {
    // User creation logic
    const user = await db.users.create(userData);

    // Send welcome email (tightly coupled)
    await EmailService.sendWelcomeEmail(user);

    // Update analytics (tightly coupled)
    await AnalyticsService.trackUserRegistration(user);

    return user;
  }
}

class OrderService {
  static async createOrder(orderData) {
    // Order logic
    const order = await db.orders.create(orderData);

    // Process payment (tightly coupled)
    await PaymentService.processPayment(order);

    // Update inventory (tightly coupled)
    await InventoryService.updateStock(order.items);

    return order;
  }
}

// All routes in one app
app.post('/users', async (req, res) => {
  const user = await UserService.createUser(req.body);
  res.json(user);
});

app.post('/orders', async (req, res) => {
  const order = await OrderService.createOrder(req.body);
  res.json(order);
});

app.listen(3000);

Microservices Architecture: Multiple independent services.

// User Service (separate app)
const express = require('express');
const amqp = require('amqplib');

class UserService {
  constructor() {
    this.setupMessageQueue();
  }

  async setupMessageQueue() {
    this.connection = await amqp.connect('amqp://localhost');
    this.channel = await this.connection.createChannel();
    await this.channel.assertExchange('events', 'topic'); // declare the exchange before publishing
  }

  async createUser(userData) {
    const user = await db.users.create(userData);

    // Publish events instead of direct calls
    await this.publishEvent('user.created', user);

    return user;
  }

  async publishEvent(eventType, data) {
    await this.channel.publish('events', eventType, Buffer.from(JSON.stringify(data)));
  }
}

const userService = new UserService();
const app = express();

app.post('/users', async (req, res) => {
  try {
    const user = await userService.createUser(req.body);
    res.json(user);
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});

app.listen(3001);

// Email Service (separate app)
const emailApp = express();

class EmailService {
  constructor() {
    this.setupMessageQueue();
  }

  async setupMessageQueue() {
    this.connection = await amqp.connect('amqp://localhost');
    this.channel = await this.connection.createChannel();

    // Bind a queue to the 'events' exchange, then subscribe
    await this.channel.assertExchange('events', 'topic');
    const { queue } = await this.channel.assertQueue('email-service');
    await this.channel.bindQueue(queue, 'events', 'user.created');
    this.channel.consume(queue, this.handleUserCreated.bind(this));
  }

  async handleUserCreated(message) {
    const user = JSON.parse(message.content.toString());
    await this.sendWelcomeEmail(user);
    this.channel.ack(message);
  }

  async sendWelcomeEmail(user) {
    // Email logic
    console.log(`Sending welcome email to ${user.email}`);
  }
}

new EmailService();
emailApp.listen(3002);

// API Gateway
const gateway = express();
const { createProxyMiddleware } = require('http-proxy-middleware');

// Route to appropriate services
gateway.use('/api/users', createProxyMiddleware({
  target: 'http://localhost:3001',
  changeOrigin: true,
  pathRewrite: { '^/api/users': '/users' }
}));

gateway.use('/api/orders', createProxyMiddleware({
  target: 'http://localhost:3003',
  changeOrigin: true,
  pathRewrite: { '^/api/orders': '/orders' }
}));

gateway.listen(3000);

Comparison:

| Aspect        | Monolithic               | Microservices                  |
|---------------|--------------------------|--------------------------------|
| Deployment    | Single unit              | Independent services           |
| Scalability   | Scale entire app         | Scale individual services      |
| Technology    | Single stack             | Different stacks per service   |
| Data          | Shared database          | Database per service           |
| Communication | In-process               | Network calls (HTTP/messaging) |
| Complexity    | Lower initially          | Higher operational complexity  |
| Development   | Easier coordination      | Independent teams              |
| Testing       | Simpler integration      | Complex distributed testing    |
| Failure       | Single point of failure  | Fault isolation                |

How does NodeJS handle memory management?

NodeJS uses V8's garbage collector with automatic memory management, but understanding memory usage is crucial for performance.

// Memory monitoring and optimization
const v8 = require('v8');
const process = require('process');

class MemoryMonitor {
  static getMemoryUsage() {
    const usage = process.memoryUsage();
    const heapStats = v8.getHeapStatistics();

    return {
      // Process memory usage
      rss: `${Math.round(usage.rss / 1024 / 1024)}MB`, // Resident Set Size
      heapUsed: `${Math.round(usage.heapUsed / 1024 / 1024)}MB`,
      heapTotal: `${Math.round(usage.heapTotal / 1024 / 1024)}MB`,
      external: `${Math.round(usage.external / 1024 / 1024)}MB`,

      // V8 heap statistics
      totalHeapSize: `${Math.round(heapStats.total_heap_size / 1024 / 1024)}MB`,
      usedHeapSize: `${Math.round(heapStats.used_heap_size / 1024 / 1024)}MB`,
      heapSizeLimit: `${Math.round(heapStats.heap_size_limit / 1024 / 1024)}MB`,
    };
  }

  static startMonitoring(intervalMs = 5000) {
    setInterval(() => {
      const memory = this.getMemoryUsage();
      console.log('Memory Usage:', memory);

      // Alert if memory usage is high
      const heapUsedMB = parseInt(memory.heapUsed);
      const heapLimitMB = parseInt(memory.heapSizeLimit);

      if (heapUsedMB / heapLimitMB > 0.8) {
        console.warn('HIGH MEMORY USAGE DETECTED!');

        // Force garbage collection (only in development)
        if (global.gc && process.env.NODE_ENV !== 'production') {
          global.gc();
          console.log('Garbage collection forced');
        }
      }
    }, intervalMs);
  }
}

// Memory leak examples and fixes

// ❌ Memory leak: Event listeners not removed
class BadEventHandler {
  constructor() {
    this.data = new Array(1000000).fill('data');

    process.on('SIGUSR1', this.handleSignal.bind(this));
    // Missing cleanup!
  }

  handleSignal() {
    console.log('Signal received');
  }
}

// ✅ Good: Proper cleanup
class GoodEventHandler {
  constructor() {
    this.data = new Array(1000000).fill('data');
    this.handleSignal = this.handleSignal.bind(this);

    process.on('SIGUSR1', this.handleSignal);
  }

  handleSignal() {
    console.log('Signal received');
  }

  destroy() {
    process.removeListener('SIGUSR1', this.handleSignal);
    this.data = null;
  }
}

// ❌ Memory leak: Closures holding references
function createBadProcessor() {
  const largeData = new Array(1000000).fill('data');

  return function process(input) {
    // This closure keeps largeData in memory even if not used
    return input.toUpperCase();
  };
}

// ✅ Good: Don't capture unnecessary variables
function createGoodProcessor() {
  return function process(input) {
    return input.toUpperCase();
  };
}

// Memory-efficient data processing
class StreamProcessor {
  static async processLargeFile(filePath, processLine) {
    const fs = require('fs');
    const readline = require('readline');

    const fileStream = fs.createReadStream(filePath);
    const rl = readline.createInterface({
      input: fileStream,
      crlfDelay: Infinity
    });

    let lineCount = 0;
    for await (const line of rl) {
      await processLine(line, lineCount++);

      // Yield control occasionally
      if (lineCount % 1000 === 0) {
        await new Promise(resolve => setImmediate(resolve));
      }
    }
  }

  // Object pooling for frequently created objects
  static createObjectPool(createFn, resetFn, size = 10) {
    const pool = [];

    for (let i = 0; i < size; i++) {
      pool.push(createFn());
    }

    return {
      acquire() {
        return pool.pop() || createFn();
      },

      release(obj) {
        if (pool.length < size) {
          resetFn(obj);
          pool.push(obj);
        }
      }
    };
  }
}

// Usage examples
const bufferPool = StreamProcessor.createObjectPool(
  () => Buffer.allocUnsafe(1024),
  (buffer) => buffer.fill(0),
  50
);

// Memory optimization tips implementation
class MemoryOptimizedCache {
  constructor(maxSize = 1000, ttl = 300000) { // 5 minutes
    this.cache = new Map();
    this.maxSize = maxSize;
    this.ttl = ttl;

    // Periodic cleanup
    setInterval(() => this.cleanup(), ttl / 2);
  }

  set(key, value) {
    // LRU eviction
    if (this.cache.size >= this.maxSize && !this.cache.has(key)) {
      const firstKey = this.cache.keys().next().value;
      this.cache.delete(firstKey);
    }

    this.cache.set(key, {
      value,
      timestamp: Date.now()
    });
  }

  get(key) {
    const item = this.cache.get(key);
    if (!item) return null;

    // Check TTL
    if (Date.now() - item.timestamp > this.ttl) {
      this.cache.delete(key);
      return null;
    }

    // Move to end (LRU)
    this.cache.delete(key);
    this.cache.set(key, item);

    return item.value;
  }

  cleanup() {
    const now = Date.now();
    for (const [key, item] of this.cache) {
      if (now - item.timestamp > this.ttl) {
        this.cache.delete(key);
      }
    }
  }
}

// Start monitoring
MemoryMonitor.startMonitoring();

module.exports = { MemoryMonitor, StreamProcessor, MemoryOptimizedCache };

Key Concepts:

  • Heap: Where objects are allocated
  • Stack: Function calls and local variables
  • Garbage Collection: Automatic memory cleanup
  • Memory Leaks: References preventing GC

Common Memory Issues:

  1. Event listeners not removed
  2. Closures holding references
  3. Global variables
  4. Circular references
  5. Large objects in memory
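To confirm a suspected leak without external tooling, measure heap growth across repeated runs of the suspect operation. This sketch leaks on purpose so the growth is visible (running with `--expose-gc` gives cleaner numbers, but is optional):

```javascript
// Minimal leak check: run the suspect operation repeatedly and watch heapUsed.
// Growth that survives garbage collection points to retained references.
const retained = []; // deliberate leak: references are never released

function suspectOperation() {
  retained.push(new Array(100000).fill('x')); // ~0.8MB of pointers per call
}

function heapUsedMB() {
  if (global.gc) global.gc(); // settle the heap first (needs --expose-gc)
  return process.memoryUsage().heapUsed / 1024 / 1024;
}

const before = heapUsedMB();
for (let i = 0; i < 20; i++) suspectOperation();
const after = heapUsedMB();

console.log(`heapUsed grew by ~${(after - before).toFixed(1)}MB`);
```

If the same loop is run against code that releases its references, the before/after numbers should converge instead of climbing.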

Databases and System Design

What is the difference between SQL and NoSQL databases?

SQL (Relational) Databases: Structured data with predefined schemas, ACID properties.

NoSQL Databases: Flexible schemas, designed for scalability and performance.

-- SQL Example (PostgreSQL)
-- Schema definition
CREATE TABLE users (
  id SERIAL PRIMARY KEY,
  email VARCHAR(255) UNIQUE NOT NULL,
  name VARCHAR(255) NOT NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE posts (
  id SERIAL PRIMARY KEY,
  user_id INTEGER REFERENCES users(id),
  title VARCHAR(255) NOT NULL,
  content TEXT,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE comments (
  id SERIAL PRIMARY KEY,
  post_id INTEGER REFERENCES posts(id),
  user_id INTEGER REFERENCES users(id),
  content TEXT NOT NULL,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

-- Complex joins
SELECT
  u.name,
  p.title,
  COUNT(c.id) as comment_count
FROM users u
LEFT JOIN posts p ON u.id = p.user_id
LEFT JOIN comments c ON p.id = c.post_id
WHERE u.created_at >= '2024-01-01'
GROUP BY u.id, u.name, p.id, p.title
HAVING COUNT(c.id) > 5
ORDER BY comment_count DESC;

// NoSQL Example (MongoDB)
// Flexible document structure
const userSchema = {
  _id: ObjectId,
  email: String,
  name: String,
  profile: {
    avatar: String,
    bio: String,
    preferences: {
      theme: String,
      notifications: Boolean
    }
  },
  posts: [{
    title: String,
    content: String,
    tags: [String],
    comments: [{
      author: {
        name: String,
        avatar: String
      },
      content: String,
      createdAt: Date,
      replies: [{
        author: String,
        content: String,
        createdAt: Date
      }]
    }],
    createdAt: Date
  }],
  createdAt: Date
};

// MongoDB queries
// Find users with posts containing specific tags
db.users.find({
  "posts.tags": { $in: ["javascript", "nodejs"] },
  "createdAt": { $gte: ISODate("2024-01-01") }
}, {
  name: 1,
  email: 1,
  "posts.$": 1  // Project only matching posts
});

// Aggregation pipeline
db.users.aggregate([
  { $match: { "posts.comments.2": { $exists: true } } }, // Users with posts having 3+ comments
  { $unwind: "$posts" },
  { $match: { "posts.comments.2": { $exists: true } } },
  {
    $group: {
      _id: "$_id",
      name: { $first: "$name" },
      totalComments: { $sum: { $size: "$posts.comments" } }
    }
  },
  { $sort: { totalComments: -1 } }
]);

Comparison:

Aspect          SQL                        NoSQL
Schema          Fixed, predefined          Flexible, dynamic
Scalability     Vertical (scale up)        Horizontal (scale out)
ACID            Full ACID compliance       Eventually consistent
Joins           Complex joins supported    Limited join capabilities
Query Language  Standardized SQL           Database-specific
Use Cases       Financial, CRM, ERP        Big data, real-time, content management

NoSQL Types:

// 1. Document (MongoDB, CouchDB)
const documentStore = {
  _id: "user123",
  name: "John Doe",
  addresses: [
    { type: "home", street: "123 Main St", city: "NYC" },
    { type: "work", street: "456 Office Blvd", city: "NYC" }
  ]
};

// 2. Key-Value (Redis, DynamoDB)
const keyValueStore = {
  "user:123:profile": JSON.stringify({ name: "John", age: 30 }),
  "user:123:sessions": JSON.stringify(["session1", "session2"]),
  "cache:api:weather:nyc": JSON.stringify({ temp: 72, humidity: 60 })
};

// 3. Column-Family (Cassandra, HBase)
const columnFamily = {
  rowKey: "user123",
  columns: {
    "profile:name": "John Doe",
    "profile:email": "john@example.com",
    "activity:2024-01-15": "login",
    "activity:2024-01-16": "purchase"
  }
};

// 4. Graph (Neo4j, Amazon Neptune)
const graphData = {
  nodes: [
    { id: "user1", label: "User", properties: { name: "John" } },
    { id: "user2", label: "User", properties: { name: "Jane" } },
    { id: "post1", label: "Post", properties: { title: "Hello World" } }
  ],
  relationships: [
    { from: "user1", to: "user2", type: "FOLLOWS" },
    { from: "user1", to: "post1", type: "CREATED" },
    { from: "user2", to: "post1", type: "LIKED" }
  ]
};

How does indexing work in databases?

Database indexes are data structures that improve query performance by creating shortcuts to data.

-- SQL Indexing Examples
-- B-Tree Index (most common)
CREATE INDEX idx_users_email ON users(email);
CREATE INDEX idx_posts_user_id ON posts(user_id);

-- Composite Index
CREATE INDEX idx_posts_user_date ON posts(user_id, created_at);

-- Partial Index
CREATE INDEX idx_active_users ON users(email) WHERE active = true;

-- Unique Index
CREATE UNIQUE INDEX idx_users_email_unique ON users(email);

-- Functional Index
CREATE INDEX idx_users_email_lower ON users(LOWER(email));

-- Query execution with indexes
EXPLAIN ANALYZE SELECT * FROM users WHERE email = 'john@example.com';
-- Index Scan using idx_users_email (cost=0.28..8.30 rows=1)

EXPLAIN ANALYZE SELECT * FROM posts WHERE user_id = 123 ORDER BY created_at DESC;
-- Index Scan using idx_posts_user_date (cost=0.29..15.32 rows=10)

// MongoDB Indexing
// Single field index
db.users.createIndex({ email: 1 }); // 1 = ascending, -1 = descending

// Compound index
db.posts.createIndex({ user_id: 1, created_at: -1 });

// Text index for full-text search
db.posts.createIndex({ title: "text", content: "text" });

// Geospatial index
db.locations.createIndex({ coordinates: "2dsphere" });

// Partial index
db.users.createIndex(
  { email: 1 },
  { partialFilterExpression: { active: true } }
);

// TTL index (automatic expiration)
db.sessions.createIndex(
  { createdAt: 1 },
  { expireAfterSeconds: 3600 } // 1 hour
);

// Query performance analysis
db.users.find({ email: "john@example.com" }).explain("executionStats");

Index Implementation:

// Simplified B-Tree implementation for understanding
class BTreeNode {
  constructor(isLeaf = false) {
    this.keys = [];
    this.values = []; // For leaf nodes
    this.children = []; // For internal nodes
    this.isLeaf = isLeaf;
  }
}

class BTreeIndex {
  constructor(degree = 3) {
    this.root = new BTreeNode(true);
    this.degree = degree; // Minimum degree
  }

  search(key, node = this.root) {
    let i = 0;

    // Find the position where key might exist
    while (i < node.keys.length && key > node.keys[i]) {
      i++;
    }

    // If key found
    if (i < node.keys.length && key === node.keys[i]) {
      return node.isLeaf ? node.values[i] : this.search(key, node.children[i + 1]);
    }

    // If leaf node and key not found
    if (node.isLeaf) {
      return null;
    }

    // Recurse on appropriate child
    return this.search(key, node.children[i]);
  }

  insert(key, value) {
    // Simplified sketch: splitChild() and insertNonFull() implement the
    // standard B-tree split-and-descend logic and are omitted here for brevity
    if (this.root.keys.length === (2 * this.degree) - 1) {
      const newRoot = new BTreeNode();
      newRoot.children.push(this.root);
      this.splitChild(newRoot, 0);
      this.root = newRoot;
    }

    this.insertNonFull(this.root, key, value);
  }

  // Range query support
  rangeQuery(startKey, endKey, node = this.root, results = []) {
    if (!node) return results;

    for (let i = 0; i < node.keys.length; i++) {
      if (node.keys[i] >= startKey && node.keys[i] <= endKey) {
        if (node.isLeaf) {
          results.push({ key: node.keys[i], value: node.values[i] });
        }
      }

      // Child i holds keys below keys[i]; descend only if the range can reach it
      if (!node.isLeaf && node.keys[i] >= startKey) {
        this.rangeQuery(startKey, endKey, node.children[i], results);
      }
    }

    if (!node.isLeaf) {
      this.rangeQuery(startKey, endKey, node.children[node.keys.length], results);
    }

    return results;
  }
}

// Hash Index implementation
class HashIndex {
  constructor(size = 1000) {
    this.buckets = new Array(size).fill(null).map(() => []);
    this.size = size;
  }

  hash(key) {
    let hash = 0;
    for (let i = 0; i < key.length; i++) {
      const char = key.charCodeAt(i);
      hash = ((hash << 5) - hash) + char;
      hash = hash & hash; // Convert to 32-bit integer
    }
    return Math.abs(hash) % this.size;
  }

  insert(key, value) {
    const index = this.hash(key);
    const bucket = this.buckets[index];

    // Check if key already exists
    for (let i = 0; i < bucket.length; i++) {
      if (bucket[i].key === key) {
        bucket[i].value = value;
        return;
      }
    }

    bucket.push({ key, value });
  }

  search(key) {
    const index = this.hash(key);
    const bucket = this.buckets[index];

    for (const item of bucket) {
      if (item.key === key) {
        return item.value;
      }
    }

    return null;
  }
}

// Index usage analysis
class QueryOptimizer {
  static analyzeQuery(query, availableIndexes) {
    const analysis = {
      suggestedIndex: null,
      estimatedCost: Infinity,
      explanation: ''
    };

    // Simple rule-based optimization
    if (query.where) {
      for (const [field, condition] of Object.entries(query.where)) {
        const index = availableIndexes.find(idx =>
          idx.fields.includes(field) ||
          (idx.fields[0] === field && idx.type === 'btree')
        );

        if (index) {
          let cost = 1; // Base cost for index lookup

          if (condition.operator === 'range') {
            cost += Math.log2(index.cardinality); // B-tree range scan
          } else if (condition.operator === 'equality') {
            cost = 1; // Direct lookup
          }

          if (cost < analysis.estimatedCost) {
            analysis.suggestedIndex = index.name;
            analysis.estimatedCost = cost;
            analysis.explanation = `Use ${index.name} for efficient ${condition.operator} lookup on ${field}`;
          }
        }
      }
    }

    return analysis;
  }
}

Index Types & Use Cases:

  1. B-Tree: General purpose, range queries, sorting
  2. Hash: Equality lookups only, very fast
  3. Bitmap: Low cardinality columns, analytics
  4. GiST/GIN: Full-text search, arrays, JSON
  5. R-Tree: Geospatial data

Index Trade-offs:

  • Performance: Faster reads, slower writes
  • Storage: Additional space overhead
  • Maintenance: Kept in sync with data changes
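The first trade-off above can be shown in miniature. `IndexedTable` here is a hypothetical in-memory structure, not a real storage engine: each write pays to maintain the index, and each indexed read skips the full scan:

```javascript
// Each write pays for index maintenance; each indexed read skips the O(n) scan.
class IndexedTable {
  constructor() {
    this.rows = [];
    this.byEmail = new Map(); // secondary index: email -> row
  }

  insert(row) {
    this.rows.push(row);              // base table write
    this.byEmail.set(row.email, row); // extra write per index (the overhead)
  }

  findByEmail(email) {
    return this.byEmail.get(email) ?? null; // O(1) average, via the index
  }

  scanByEmail(email) {
    return this.rows.find(r => r.email === email) ?? null; // what unindexed queries do
  }
}

const table = new IndexedTable();
table.insert({ id: 1, email: 'john@example.com' });
table.insert({ id: 2, email: 'jane@example.com' });

console.log(table.findByEmail('jane@example.com').id); // 2
console.log(table.scanByEmail('nobody@example.com'));  // null
```

Add a second index (say, on `name`) and every insert does three writes instead of two; this is why heavily indexed tables are slower to write.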

What are ACID properties in a database?

ACID ensures database transactions maintain data integrity even in the presence of errors, power failures, or concurrent access.

// ACID Implementation Examples

// 1. ATOMICITY - All or nothing
class BankingService {
  async transferMoney(fromAccount, toAccount, amount) {
    const transaction = await db.beginTransaction();

    try {
      // Both operations must succeed or both fail
      const fromBalance = await db.query(
        'SELECT balance FROM accounts WHERE id = $1 FOR UPDATE',
        [fromAccount],
        { transaction }
      );

      if (fromBalance[0].balance < amount) {
        throw new Error('Insufficient funds');
      }

      await db.query(
        'UPDATE accounts SET balance = balance - $1 WHERE id = $2',
        [amount, fromAccount],
        { transaction }
      );

      await db.query(
        'UPDATE accounts SET balance = balance + $1 WHERE id = $2',
        [amount, toAccount],
        { transaction }
      );

      // Record transaction history
      await db.query(
        'INSERT INTO transactions (from_account, to_account, amount, type) VALUES ($1, $2, $3, $4)',
        [fromAccount, toAccount, amount, 'transfer'],
        { transaction }
      );

      await transaction.commit(); // All operations succeed
      return { success: true, transactionId: transaction.id };

    } catch (error) {
      await transaction.rollback(); // All operations fail
      throw error;
    }
  }
}

// 2. CONSISTENCY - Data integrity rules maintained
class UserRegistrationService {
  async createUser(userData) {
    const transaction = await db.beginTransaction();

    try {
      // Check constraints before inserting
      const existingUser = await db.query(
        'SELECT id FROM users WHERE email = $1',
        [userData.email],
        { transaction }
      );

      if (existingUser.length > 0) {
        throw new Error('Email already exists'); // Maintain uniqueness constraint
      }

      // Validate data integrity
      if (!this.isValidEmail(userData.email)) {
        throw new Error('Invalid email format');
      }

      if (userData.age < 13) {
        throw new Error('Age must be 13 or older'); // Business rule
      }

      const user = await db.query(
        'INSERT INTO users (email, name, age) VALUES ($1, $2, $3) RETURNING *',
        [userData.email, userData.name, userData.age],
        { transaction }
      );

      // Create associated profile (referential integrity)
      await db.query(
        'INSERT INTO user_profiles (user_id, created_at) VALUES ($1, $2)',
        [user[0].id, new Date()],
        { transaction }
      );

      await transaction.commit();
      return user[0];

    } catch (error) {
      await transaction.rollback();
      throw error;
    }
  }

  isValidEmail(email) {
    return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email);
  }
}

// 3. ISOLATION - Concurrent transactions don't interfere
class InventoryService {
  async purchaseItem(itemId, quantity, userId) {
    // Use appropriate isolation level
    const transaction = await db.beginTransaction({
      isolationLevel: 'READ_COMMITTED' // or SERIALIZABLE for strict isolation
    });

    try {
      // Lock row to prevent concurrent modifications
      const item = await db.query(
        'SELECT * FROM inventory WHERE id = $1 FOR UPDATE',
        [itemId],
        { transaction }
      );

      if (!item.length) {
        throw new Error('Item not found');
      }

      if (item[0].stock < quantity) {
        throw new Error('Insufficient stock');
      }

      // Update inventory
      await db.query(
        'UPDATE inventory SET stock = stock - $1, updated_at = NOW() WHERE id = $2',
        [quantity, itemId],
        { transaction }
      );

      // Create order
      const order = await db.query(
        'INSERT INTO orders (user_id, item_id, quantity, status) VALUES ($1, $2, $3, $4) RETURNING *',
        [userId, itemId, quantity, 'pending'],
        { transaction }
      );

      await transaction.commit();
      return order[0];

    } catch (error) {
      await transaction.rollback();
      throw error;
    }
  }

  // Demonstration of isolation levels
  async demonstrateIsolationLevels() {
    // READ UNCOMMITTED - Can see uncommitted changes (dirty reads)
    const readUncommitted = await db.beginTransaction({
      isolationLevel: 'READ_UNCOMMITTED'
    });

    // READ COMMITTED - Only see committed changes (default in most DBs)
    const readCommitted = await db.beginTransaction({
      isolationLevel: 'READ_COMMITTED'
    });

    // REPEATABLE READ - Same data throughout transaction
    const repeatableRead = await db.beginTransaction({
      isolationLevel: 'REPEATABLE_READ'
    });

    // SERIALIZABLE - Strongest isolation, prevents all anomalies
    const serializable = await db.beginTransaction({
      isolationLevel: 'SERIALIZABLE'
    });
  }
}

// 4. DURABILITY - Committed changes persist
class AuditLogger {
  async logCriticalAction(action, userId, details) {
    const transaction = await db.beginTransaction();

    try {
      // Write to audit log
      await db.query(
        'INSERT INTO audit_log (action, user_id, details, timestamp) VALUES ($1, $2, $3, $4)',
        [action, userId, JSON.stringify(details), new Date()],
        { transaction }
      );

      // Durability comes from Write-Ahead Logging: with PostgreSQL's default
      // synchronous_commit = on, the WAL is fsynced to disk at COMMIT, so no
      // extra flush call is needed here

      await transaction.commit();

      // Additional durability measures
      await this.writeToSecondaryStorage(action, userId, details);

      return true;
    } catch (error) {
      await transaction.rollback();
      throw error;
    }
  }

  async writeToSecondaryStorage(action, userId, details) {
    // Write to file system, S3, or another database for extra durability
    const fs = require('fs').promises;
    const logEntry = {
      timestamp: new Date().toISOString(),
      action,
      userId,
      details
    };

    await fs.appendFile(
      '/var/log/critical-actions.log',
      JSON.stringify(logEntry) + '\n'
    );
  }
}

// Practical ACID implementation with connection pooling
class DatabaseManager {
  constructor() {
    this.pool = new Pool({
      host: 'localhost',
      database: 'myapp',
      user: 'postgres',
      password: 'password',
      max: 20, // Max connections
      idleTimeoutMillis: 30000,
      connectionTimeoutMillis: 2000,
    });
  }

  async executeTransaction(operations) {
    const client = await this.pool.connect();

    try {
      await client.query('BEGIN');

      const results = [];
      for (const operation of operations) {
        const result = await operation(client);
        results.push(result);
      }

      await client.query('COMMIT');
      return results;

    } catch (error) {
      await client.query('ROLLBACK');
      throw error;
    } finally {
      client.release();
    }
  }

  // Saga pattern for distributed transactions
  async executeSaga(steps) {
    const compensations = [];
    const results = [];

    try {
      for (const step of steps) {
        const result = await step.execute();
        results.push(result);

        if (step.compensate) {
          compensations.unshift(step.compensate); // LIFO order
        }
      }

      return results;

    } catch (error) {
      // Execute compensations in reverse order
      for (const compensate of compensations) {
        try {
          await compensate();
        } catch (compensateError) {
          console.error('Compensation failed:', compensateError);
        }
      }

      throw error;
    }
  }
}

// Usage example
const dbManager = new DatabaseManager();

async function complexBusinessOperation() {
  return await dbManager.executeTransaction([
    // Each operation receives the client connection
    async (client) => {
      return await client.query(
        'INSERT INTO orders (user_id, total) VALUES ($1, $2) RETURNING id',
        [userId, total]
      );
    },
    async (client) => {
      return await client.query(
        'UPDATE inventory SET stock = stock - $1 WHERE product_id = $2',
        [quantity, productId]
      );
    },
    async (client) => {
      return await client.query(
        'INSERT INTO order_items (order_id, product_id, quantity) VALUES ($1, $2, $3)',
        [orderId, productId, quantity]
      );
    }
  ]);
}

ACID in Different Database Types:

Traditional RDBMS (PostgreSQL, MySQL):

  • Full ACID compliance
  • Strong consistency
  • Complex transactions

NoSQL Approaches:

  • MongoDB: ACID for single documents, multi-document transactions available
  • Cassandra: Eventual consistency, tunable consistency levels
  • Redis: ACID for single operations, transactions with MULTI/EXEC

CAP Theorem Trade-offs:

  • Consistency: All nodes see the same data simultaneously
  • Availability: System remains operational
  • Partition Tolerance: System continues despite network failures

You can only guarantee 2 out of 3 properties.

How would you scale an API to handle millions of requests per second?

Scaling to millions of RPS requires multiple strategies across different layers.

// 1. Load Balancing & Traffic Distribution
const cluster = require('cluster');
const os = require('os');

if (cluster.isPrimary) { // cluster.isMaster in Node < 16
  // Primary process - spawn workers
  const numCPUs = os.cpus().length;

  for (let i = 0; i < numCPUs; i++) {
    cluster.fork();
  }

  cluster.on('exit', (worker) => {
    console.log(`Worker ${worker.process.pid} died, restarting...`);
    cluster.fork();
  });
} else {
  // Worker process
  const express = require('express');
  const app = express();

  // Optimize Express for high performance
  app.set('trust proxy', true);
  app.disable('x-powered-by');

  // Health check endpoint
  app.get('/health', (req, res) => {
    res.json({ status: 'ok', pid: process.pid });
  });

  app.listen(3000, () => {
    console.log(`Worker ${process.pid} started on port 3000`);
  });
}

// 2. Caching Strategy Implementation
const Redis = require('ioredis'); // options below follow the ioredis API
const LRU = require('lru-cache');

class CachingService {
  constructor() {
    // Multi-layer caching
    this.l1Cache = new LRU({
      max: 10000,
      ttl: 1000 * 60 * 5 // 5 minutes
    });

    this.redis = new Redis({
      host: 'redis-cluster',
      enableReadyCheck: false,
      lazyConnect: true,
      retryStrategy: (times) => Math.min(times * 100, 2000) // backoff in ms
    });
  }

  async get(key) {
    // L1 Cache (in-memory)
    let value = this.l1Cache.get(key);
    if (value) return value;

    // L2 Cache (Redis)
    try {
      const raw = await this.redis.get(key);
      if (raw) {
        value = JSON.parse(raw); // parse once, reuse for both layers
        this.l1Cache.set(key, value);
        return value;
      }
    } catch (error) {
      console.error('Redis error:', error);
    }

    return null;
  }

  async set(key, value, ttl = 3600) {
    // Set in both layers
    this.l1Cache.set(key, value);

    try {
      await this.redis.setex(key, ttl, JSON.stringify(value));
    } catch (error) {
      console.error('Redis set error:', error);
    }
  }

  // Cache warming for predictable traffic
  async warmCache(popularKeys) {
    const promises = popularKeys.map(async (key) => {
      const value = await this.fetchFromDatabase(key);
      if (value) {
        await this.set(key, value);
      }
    });

    await Promise.all(promises);
  }
}

// 3. Database Optimizations
class DatabasePool {
  constructor() {
    // Read replicas for scaling reads
    this.readPools = [
      new Pool({ host: 'read-replica-1', max: 50 }),
      new Pool({ host: 'read-replica-2', max: 50 }),
      new Pool({ host: 'read-replica-3', max: 50 })
    ];

    this.writePool = new Pool({
      host: 'master-db',
      max: 20,
      acquireTimeoutMillis: 3000
    });

    this.currentReadIndex = 0;
  }

  getReadConnection() {
    // Round-robin load balancing
    const pool = this.readPools[this.currentReadIndex];
    this.currentReadIndex = (this.currentReadIndex + 1) % this.readPools.length;
    return pool.connect();
  }

  getWriteConnection() {
    return this.writePool.connect();
  }

  // Connection health monitoring
  async healthCheck() {
    const checks = this.readPools.map(async (pool, index) => {
      try {
        const client = await pool.connect();
        await client.query('SELECT 1');
        client.release();
        return { replica: index, status: 'healthy' };
      } catch (error) {
        return { replica: index, status: 'unhealthy', error: error.message };
      }
    });

    return Promise.all(checks);
  }
}

// 4. Rate Limiting & Throttling
class RateLimiter {
  constructor(redisClient) {
    this.redis = redisClient;
  }

  // Sliding window rate limiter
  async checkRateLimit(key, limit, windowMs) {
    const now = Date.now();
    const pipeline = this.redis.pipeline();

    // Remove expired entries
    pipeline.zremrangebyscore(key, 0, now - windowMs);

    // Count current requests
    pipeline.zcard(key);

    // Add current request
    pipeline.zadd(key, now, `${now}-${Math.random()}`);

    // Set expiration
    pipeline.expire(key, Math.ceil(windowMs / 1000));

    const results = await pipeline.exec();
    const currentCount = results[1][1];

    return {
      allowed: currentCount < limit,
      remaining: Math.max(0, limit - currentCount - 1),
      resetTime: now + windowMs
    };
  }

  // Token bucket algorithm
  async tokenBucket(key, capacity, refillRate, tokensRequested = 1) {
    const script = `
      local key = KEYS[1]
      local capacity = tonumber(ARGV[1])
      local refillRate = tonumber(ARGV[2])
      local tokensRequested = tonumber(ARGV[3])
      local now = tonumber(ARGV[4])

      local bucket = redis.call('HMGET', key, 'tokens', 'lastRefill')
      local tokens = tonumber(bucket[1]) or capacity
      local lastRefill = tonumber(bucket[2]) or now

      -- Calculate tokens to add based on time passed
      local tokensToAdd = math.floor((now - lastRefill) / 1000 * refillRate)
      tokens = math.min(capacity, tokens + tokensToAdd)

      if tokens >= tokensRequested then
        tokens = tokens - tokensRequested
        redis.call('HMSET', key, 'tokens', tokens, 'lastRefill', now)
        redis.call('EXPIRE', key, 3600)
        return {1, tokens}
      else
        redis.call('HMSET', key, 'tokens', tokens, 'lastRefill', now)
        redis.call('EXPIRE', key, 3600)
        return {0, tokens}
      end
    `;

    const result = await this.redis.eval(
      script, 1, key, capacity, refillRate, tokensRequested, Date.now()
    );

    return {
      allowed: result[0] === 1,
      remainingTokens: result[1]
    };
  }
}

// 5. API Gateway Implementation
const express = require('express');
const { createProxyMiddleware: httpProxy } = require('http-proxy-middleware');

class APIGateway {
  constructor() {
    this.app = express();
    this.rateLimiter = new RateLimiter(redis); // assumes a shared Redis client in scope
    this.cache = new CachingService();
    this.setupMiddleware();
    this.setupRoutes();
  }

  setupMiddleware() {
    // Request logging
    this.app.use((req, res, next) => {
      req.startTime = Date.now();
      next();
    });

    // Rate limiting
    this.app.use(async (req, res, next) => {
      const clientId = req.ip;
      const rateLimitResult = await this.rateLimiter.checkRateLimit(
        `rate_limit:${clientId}`,
        1000, // 1000 requests
        60000 // per minute
      );

      if (!rateLimitResult.allowed) {
        return res.status(429).json({
          error: 'Rate limit exceeded',
          retryAfter: Math.ceil(rateLimitResult.resetTime / 1000)
        });
      }

      res.setHeader('X-RateLimit-Remaining', rateLimitResult.remaining);
      next();
    });

    // Response caching
    this.app.use(async (req, res, next) => {
      if (req.method === 'GET') {
        const cacheKey = `cache:${req.originalUrl}`;
        const cachedResponse = await this.cache.get(cacheKey);

        if (cachedResponse) {
          return res.json(cachedResponse);
        }

        // Override res.json to cache the response
        const cache = this.cache;
        const originalJson = res.json.bind(res);
        res.json = (body) => {
          cache.set(cacheKey, body, 300); // Cache for 5 minutes
          return originalJson(body);
        };
      }

      next();
    });

    // Metrics collection
    this.app.use((req, res, next) => {
      res.on('finish', () => {
        const duration = Date.now() - req.startTime;
        // Send metrics to monitoring system
        this.recordMetrics({
          method: req.method,
          route: req.route?.path || req.path,
          statusCode: res.statusCode,
          duration,
          timestamp: Date.now()
        });
      });

      next();
    });
  }

  setupRoutes() {
    // Service discovery and load balancing
    const serviceRegistry = {
      'user-service': [
        'http://user-service-1:3001',
        'http://user-service-2:3001',
        'http://user-service-3:3001'
      ],
      'order-service': [
        'http://order-service-1:3002',
        'http://order-service-2:3002'
      ],
      'inventory-service': [
        'http://inventory-service-1:3003'
      ]
    };

    Object.entries(serviceRegistry).forEach(([serviceName, instances]) => {
      let currentIndex = 0;

      this.app.use(`/api/${serviceName}`, httpProxy({
        target: instances[0], // default; `router` picks the real target per request
        router: () => {
          const target = instances[currentIndex];
          currentIndex = (currentIndex + 1) % instances.length;
          return target;
        },
        changeOrigin: true,
        pathRewrite: { [`^/api/${serviceName}`]: '' },
        onError: (err, req, res) => {
          console.error(`Proxy error for ${serviceName}:`, err);
          res.status(503).json({ error: 'Service temporarily unavailable' });
        },
        timeout: 5000,
        proxyTimeout: 5000
      }));
    });
  }

  recordMetrics(metrics) {
    // Send to monitoring system (Prometheus, DataDog, etc.)
    console.log('Metrics:', metrics);
  }

  start(port = 8080) {
    this.app.listen(port, () => {
      console.log(`API Gateway running on port ${port}`);
    });
  }
}

// 6. Circuit Breaker Pattern
class CircuitBreaker {
  constructor(options = {}) {
    this.failureThreshold = options.failureThreshold || 5;
    this.resetTimeout = options.resetTimeout || 60000;
    this.state = 'CLOSED'; // CLOSED, OPEN, HALF_OPEN
    this.failureCount = 0;
    this.nextAttempt = Date.now();
  }

  async call(serviceCall) {
    if (this.state === 'OPEN') {
      if (Date.now() < this.nextAttempt) {
        throw new Error('Circuit breaker is OPEN');
      }
      this.state = 'HALF_OPEN';
    }

    try {
      const result = await serviceCall();
      this.onSuccess();
      return result;
    } catch (error) {
      this.onFailure();
      throw error;
    }
  }

  onSuccess() {
    this.failureCount = 0;
    this.state = 'CLOSED';
  }

  onFailure() {
    this.failureCount++;
    if (this.failureCount >= this.failureThreshold) {
      this.state = 'OPEN';
      this.nextAttempt = Date.now() + this.resetTimeout;
    }
  }
}
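
To see the state machine in action, here is a small standalone demo. It inlines a trimmed copy of the breaker above (same logic, fewer options) so the snippet runs on its own:

```javascript
// Trimmed copy of the CircuitBreaker above, inlined so this demo is standalone.
class Breaker {
  constructor({ failureThreshold = 3, resetTimeout = 1000 } = {}) {
    this.failureThreshold = failureThreshold;
    this.resetTimeout = resetTimeout;
    this.state = 'CLOSED';
    this.failureCount = 0;
    this.nextAttempt = Date.now();
  }

  async call(serviceCall) {
    if (this.state === 'OPEN') {
      if (Date.now() < this.nextAttempt) throw new Error('Circuit breaker is OPEN');
      this.state = 'HALF_OPEN';
    }
    try {
      const result = await serviceCall();
      this.failureCount = 0;
      this.state = 'CLOSED';
      return result;
    } catch (error) {
      if (++this.failureCount >= this.failureThreshold) {
        this.state = 'OPEN';
        this.nextAttempt = Date.now() + this.resetTimeout;
      }
      throw error;
    }
  }
}

// Drive the breaker open with a service that always fails
async function demo() {
  const breaker = new Breaker({ failureThreshold: 3 });
  const alwaysFails = () => Promise.reject(new Error('boom'));
  for (let i = 0; i < 3; i++) {
    await breaker.call(alwaysFails).catch(() => {});
  }
  return breaker.state;
}

demo().then((state) => console.log(state)); // 'OPEN' after three consecutive failures
```

Once OPEN, further calls fail fast without touching the downstream service; after `resetTimeout` elapses, one trial request is allowed through (HALF_OPEN) to probe whether the service has recovered.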

// 7. Message Queue for Async Processing
class MessageProcessor {
  constructor() {
    this.queues = new Map();
  }

  async addToQueue(queueName, message, priority = 0) {
    if (!this.queues.has(queueName)) {
      this.queues.set(queueName, []);
    }

    const queue = this.queues.get(queueName);
    queue.push({ message, priority, timestamp: Date.now() });

    // Sort by priority (higher number = higher priority)
    queue.sort((a, b) => b.priority - a.priority);
  }

  async processQueue(queueName, processingFunction, concurrency = 5) {
    const queue = this.queues.get(queueName);
    if (!queue || queue.length === 0) return;

    const batch = queue.splice(0, concurrency);
    const promises = batch.map(async (item) => {
      try {
        await processingFunction(item.message);
      } catch (error) {
        console.error('Failed to process message:', error);
        // Dead letter queue handling
        await this.addToDeadLetterQueue(queueName, item);
      }
    });

    await Promise.all(promises);

    // Continue processing if more items exist
    if (queue.length > 0) {
      setImmediate(() => this.processQueue(queueName, processingFunction, concurrency));
    }
  }

  async addToDeadLetterQueue(queueName, item) {
    // Park failed messages in a side queue for later inspection or retry
    await this.addToQueue(`${queueName}:dead-letter`, item.message);
  }
}

// Usage Example
const gateway = new APIGateway();
gateway.start(8080);

Scaling Strategies Summary:

  1. Horizontal Scaling: Add more servers
  2. Vertical Scaling: Increase server resources
  3. Caching: Multi-layer caching (CDN, Redis, in-memory)
  4. Database: Read replicas, sharding, connection pooling
  5. Load Balancing: Distribute traffic across instances
  6. Rate Limiting: Protect against abuse
  7. Circuit Breakers: Prevent cascade failures
  8. Async Processing: Queue heavy operations
  9. Monitoring: Real-time metrics and alerts
  10. Auto-scaling: Dynamic resource allocation
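
As a concrete sketch of strategy 3, here is a minimal in-memory TTL cache; a Redis-backed layer follows the same set-with-expiry / get contract. The `TTLCache` name and `ttlMs` parameter are illustrative, not from any library:

```javascript
// Minimal in-memory TTL cache, illustrating the innermost caching layer.
class TTLCache {
  constructor() {
    this.store = new Map(); // key -> { value, expiresAt }
  }

  set(key, value, ttlMs) {
    this.store.set(key, { value, expiresAt: Date.now() + ttlMs });
  }

  get(key) {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() >= entry.expiresAt) {
      this.store.delete(key); // lazy expiry on read
      return undefined;
    }
    return entry.value;
  }
}

const cache = new TTLCache();
cache.set('user:42', { name: 'Ada' }, 5000);
console.log(cache.get('user:42')); // { name: 'Ada' } while the entry is fresh
```

Lazy expiry keeps writes O(1); a production cache would also sweep expired entries periodically to bound memory.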

Performance Optimizations:

  • HTTP/2 and HTTP/3 support
  • Compression (gzip, brotli)
  • Connection pooling
  • Keep-alive connections
  • Minimize middleware overhead
  • Use CDN for static assets
  • Database query optimization
  • Implement proper indexing

With these patterns implemented correctly and tuned continuously against monitoring data, this architecture can scale to handle millions of requests per second.


Data Structures and Algorithms (DSA)

Given an array, find the maximum sum of any contiguous subarray.

Kadane's Algorithm - O(n) time, O(1) space

function maxSubarraySum(nums) {
  if (!nums.length) return 0;

  let maxSoFar = nums[0];
  let maxEndingHere = nums[0];

  for (let i = 1; i < nums.length; i++) {
    maxEndingHere = Math.max(nums[i], maxEndingHere + nums[i]);
    maxSoFar = Math.max(maxSoFar, maxEndingHere);
  }

  return maxSoFar;
}

// Examples
console.log(maxSubarraySum([-2, 1, -3, 4, -1, 2, 1, -5, 4])); // 6
console.log(maxSubarraySum([5, 4, -1, 7, 8])); // 23
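
A common follow-up asks for the subarray itself, not just the sum. The same recurrence works if you also track where the current window starts (the function name and return shape here are illustrative):

```javascript
// Kadane's algorithm, extended to report the subarray boundaries.
function maxSubarrayWithIndices(nums) {
  if (!nums.length) return { sum: 0, start: -1, end: -1 };

  let maxSoFar = nums[0];
  let maxEndingHere = nums[0];
  let start = 0, end = 0, windowStart = 0;

  for (let i = 1; i < nums.length; i++) {
    if (nums[i] > maxEndingHere + nums[i]) {
      // Starting fresh at i beats extending the previous window
      maxEndingHere = nums[i];
      windowStart = i;
    } else {
      maxEndingHere += nums[i];
    }
    if (maxEndingHere > maxSoFar) {
      maxSoFar = maxEndingHere;
      start = windowStart;
      end = i;
    }
  }

  return { sum: maxSoFar, start, end };
}

console.log(maxSubarrayWithIndices([-2, 1, -3, 4, -1, 2, 1, -5, 4]));
// { sum: 6, start: 3, end: 6 }, i.e. the subarray [4, -1, 2, 1]
```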

Complete technical interview guide covering JavaScript/ReactJS, NodeJS, Databases, System Design, and DSA with practical implementations and examples.
