2026-03-25 13:14:13
Redis is the de facto standard for application-layer caching, used by companies from startups to hyperscalers to reduce database load, cut API response times from hundreds of milliseconds to single digits, and absorb traffic spikes that would otherwise overwhelm backend infrastructure. But caching is deceptively subtle — the wrong strategy leads to stale data, cache stampedes, or a cache that provides no benefit at all.
This guide covers the essential Redis caching patterns for Node.js applications: when to use each one, how to implement it correctly, and how to avoid the common pitfalls that cause production incidents.
Redis is an in-memory data structure store. Its primary caching advantage over Memcached is richer data types (strings, hashes, lists, sets, sorted sets) and atomic operations. For caching in Node.js, a popular, battle-tested client is ioredis:
npm install ioredis
npm install -D @types/ioredis # For older versions; ioredis v5+ ships types
// src/lib/redis.ts
import Redis from 'ioredis';
import { env } from '../config/env';
import { logger } from './logger';
// Singleton pattern — reuse connections
let redis: Redis | null = null;
export function getRedis(): Redis {
if (!redis) {
redis = new Redis({
host: env.REDIS_HOST,
port: env.REDIS_PORT,
password: env.REDIS_PASSWORD,
db: env.REDIS_DB,
// Connection resilience
retryStrategy: (times) => Math.min(times * 100, 3000),
maxRetriesPerRequest: 3,
enableReadyCheck: true,
lazyConnect: true,
});
// Raise the EventEmitter listener cap to avoid spurious leak warnings.
// (maxListeners is not an ioredis constructor option; set it on the instance.)
redis.setMaxListeners(20);
redis.on('error', (err) => logger.error({ msg: 'Redis error', err }));
redis.on('connect', () => logger.info('Redis connected'));
}
return redis;
}
Cache-aside (also called lazy loading) is the most common caching pattern. The application code manages the cache explicitly: check the cache first, if miss fetch from the database and populate the cache, then return the result.
// The cache-aside pattern
async function getUserById(userId: string): Promise<User> {
const cacheKey = `user:${userId}`;
const redis = getRedis();
// 1. Check cache
const cached = await redis.get(cacheKey);
if (cached) {
return JSON.parse(cached) as User;
}
// 2. Cache miss — fetch from DB
const user = await db.user.findUnique({ where: { id: userId } });
if (!user) throw new NotFoundError('User', userId);
// 3. Populate cache with TTL
await redis.setex(cacheKey, 3600, JSON.stringify(user)); // TTL: 1 hour
return user;
}
Cache keys should be deterministic, descriptive, and namespaced to avoid collisions:
// Good key patterns:
'user:abc123' // Single resource
'user:abc123:profile' // Sub-resource
'posts:list:page:2:limit:20' // Paginated list
'posts:list:userId:abc123:page:1' // Filtered list
'search:query:hashofparams' // Search results (hash the params)
// Cache key builder helper
function cacheKey(...parts: (string | number)[]): string {
return parts.join(':');
}
// Version prefix — lets you invalidate all cache by incrementing version
const CACHE_VERSION = 'v1';
function versionedKey(...parts: (string | number)[]): string {
return `${CACHE_VERSION}:${parts.join(':')}`;
}
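For the "hash the params" case above, a small helper can canonicalize and hash the parameters so that equivalent queries map to the same key regardless of property order. A sketch (the helper name and the 16-character hash length are illustrative choices, not part of the article's API):

```typescript
import { createHash } from 'node:crypto';

// Build a deterministic cache key for search/query parameters.
// Sorting the entries first means { a: 1, b: 2 } and { b: 2, a: 1 }
// canonicalize to the same string and therefore the same hash.
function searchKey(prefix: string, params: Record<string, string | number>): string {
  const canonical = Object.entries(params)
    .sort(([a], [b]) => a.localeCompare(b))
    .map(([k, v]) => `${k}=${v}`)
    .join('&');
  const hash = createHash('sha1').update(canonical).digest('hex').slice(0, 16);
  return `${prefix}:${hash}`;
}
```

Hashing keeps key length bounded even for long query strings, at the cost of keys no longer being human-readable in `redis-cli`.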
// src/lib/cache.ts
import { getRedis } from './redis';
export class CacheService {
private redis = getRedis();
async get<T>(key: string): Promise<T | null> {
const value = await this.redis.get(key);
return value ? (JSON.parse(value) as T) : null;
}
async set<T>(key: string, value: T, ttlSeconds: number): Promise<void> {
await this.redis.setex(key, ttlSeconds, JSON.stringify(value));
}
async del(key: string): Promise<void> {
await this.redis.del(key);
}
async getOrSet<T>(
key: string,
fetcher: () => Promise<T>,
ttlSeconds: number
): Promise<T> {
const cached = await this.get<T>(key);
if (cached !== null) return cached;
const value = await fetcher();
await this.set(key, value, ttlSeconds);
return value;
}
// Delete all keys matching a pattern.
// Uses SCAN (via scanStream) instead of KEYS so Redis is never blocked.
async delByPattern(pattern: string): Promise<number> {
let deleted = 0;
const stream = this.redis.scanStream({ match: pattern, count: 100 });
for await (const keys of stream) {
if (keys.length > 0) deleted += await this.redis.del(...keys);
}
return deleted;
}
}
export const cache = new CacheService();
// Usage:
const user = await cache.getOrSet(
`user:${userId}`,
() => db.user.findUniqueOrThrow({ where: { id: userId } }),
3600
);
In write-through caching, every write to the database also updates the cache. This keeps the cache always consistent with the database, at the cost of write latency.
async function updateUser(userId: string, data: UpdateUserInput): Promise<User> {
// Write to DB
const user = await db.user.update({
where: { id: userId },
data,
omit: { password: true },
});
// Immediately update cache (write-through)
await cache.set(`user:${userId}`, user, 3600);
// Invalidate any list caches that included this user
await cache.del(`users:list`);
return user;
}
async function deleteUser(userId: string): Promise<void> {
await db.user.delete({ where: { id: userId } });
// Invalidate cache entry
await cache.del(`user:${userId}`);
await cache.del(`users:list`);
}
Write-through is better than cache-aside when you have multiple readers of a resource and need consistency. The trade-off: writes are slower because they must update both the database and the cache atomically (or accept brief inconsistency).
A cache stampede (also called "thundering herd") happens when a popular cache key expires and many concurrent requests simultaneously find a cache miss — all of them hit the database at once. On high-traffic sites, this can bring down your database.
import { Mutex } from 'async-mutex'; // npm install async-mutex
// One mutex per cache key. Note: this guards a single Node.js process only;
// multiple app instances would need a distributed lock (e.g. Redlock).
const mutexMap = new Map<string, Mutex>();
function getMutex(key: string): Mutex {
if (!mutexMap.has(key)) {
mutexMap.set(key, new Mutex());
}
return mutexMap.get(key)!;
}
async function getWithStampedeProtection<T>(
key: string,
fetcher: () => Promise<T>,
ttlSeconds: number
): Promise<T> {
// First check — no lock needed
const cached = await cache.get<T>(key);
if (cached !== null) return cached;
// Acquire mutex for this cache key
const mutex = getMutex(key);
const release = await mutex.acquire();
try {
// Double-check after acquiring lock (another request may have filled the cache)
const cachedAfterLock = await cache.get<T>(key);
if (cachedAfterLock !== null) return cachedAfterLock;
// We're the "winner" — fetch from DB and populate cache
const value = await fetcher();
await cache.set(key, value, ttlSeconds);
return value;
} finally {
release();
}
}
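A related in-process technique, not shown in the snippet above, is to coalesce concurrent misses into a single in-flight promise per key: the first miss starts the fetch and every concurrent caller awaits the same promise. The map entry is removed when the fetch settles, so the map stays bounded. A sketch, with `dedupedFetch` as a hypothetical name:

```typescript
// In-flight fetches, keyed by cache key.
const inFlight = new Map<string, Promise<unknown>>();

async function dedupedFetch<T>(key: string, fetcher: () => Promise<T>): Promise<T> {
  const existing = inFlight.get(key);
  if (existing) return existing as Promise<T>; // join the in-flight fetch

  const promise = fetcher().finally(() => inFlight.delete(key));
  inFlight.set(key, promise);
  return promise;
}
```

Like the mutex approach, this only protects a single process; with N replicas you still get at most N fetches per expired key, which is usually enough to protect the database.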
The XFetch algorithm proactively refreshes cache entries slightly before they expire, avoiding the stampede window entirely:
interface CacheEntry<T> {
value: T;
expiresAt: number; // Unix timestamp
delta: number; // Time it took to compute (seconds)
}
async function xfetchGet<T>(
key: string,
fetcher: () => Promise<T>,
ttlSeconds: number,
beta = 1.0 // Higher = more eager recomputation
): Promise<T> {
const redis = getRedis();
const raw = await redis.get(key);
if (raw) {
const entry = JSON.parse(raw) as CacheEntry<T>;
const now = Date.now() / 1000;
const remaining = entry.expiresAt - now;
// Early recomputation probability increases as TTL decreases
const shouldRecompute = -entry.delta * beta * Math.log(Math.random()) >= remaining;
if (!shouldRecompute) return entry.value;
}
// Fetch and store with metadata
const start = Date.now();
const value = await fetcher();
const delta = (Date.now() - start) / 1000;
const entry: CacheEntry<T> = {
value,
expiresAt: Date.now() / 1000 + ttlSeconds,
delta,
};
await redis.setex(key, ttlSeconds + Math.ceil(delta * 5), JSON.stringify(entry));
return value;
}
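The probabilistic decision at the heart of XFetch can be isolated into a pure function, which also makes it easy to reason about: with `delta` and `remaining` in the same units, recomputation becomes more likely as the remaining TTL shrinks. The `rand` parameter is injectable here purely so the rule can be exercised deterministically:

```typescript
// XFetch early-recompute rule: recompute when
//   -delta * beta * ln(rand) >= remaining
// Since ln(rand) < 0 for rand in (0, 1), the left side is a positive
// random quantity scaled by how long the last recompute took.
function shouldRecompute(
  deltaSeconds: number,      // duration of the last recompute
  remainingSeconds: number,  // TTL left on the cached entry
  beta = 1.0,                // higher = more eager recomputation
  rand: () => number = Math.random
): boolean {
  return -deltaSeconds * beta * Math.log(rand()) >= remainingSeconds;
}
```

With `rand` pinned at `e^-1`, the left side equals `delta * beta` exactly, so a 1-second computation triggers a refresh when less than 1 second of TTL remains.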
"There are only two hard things in computer science: cache invalidation and naming things." — Phil Karlton
The simplest strategy: cache entries expire after a fixed time. Use when some staleness is acceptable:
// Product catalog: update every hour (acceptable staleness)
await cache.set(`product:${id}`, product, 3600);
// Stock prices: update every second (very low tolerance for staleness)
await cache.set(`price:${symbol}`, price, 1);
// User session: extend TTL on each access (sliding expiration)
async function getUserSession(sessionId: string) {
const redis = getRedis();
const key = `session:${sessionId}`;
const session = await cache.get(key);
if (session) {
await redis.expire(key, 1800); // Extend TTL on access
return session;
}
return null;
}
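A further TTL refinement worth pairing with fixed expirations: add random jitter so that keys written in the same burst (a deploy, a cache warm-up) don't all expire in the same instant and recreate a stampede window. A small sketch (`jitteredTtl` is an illustrative helper, not part of the cache service above):

```typescript
// Spread expirations: base TTL plus up to `spreadSeconds` extra seconds.
// Keys cached together now expire at slightly different times.
function jitteredTtl(baseSeconds: number, spreadSeconds = 60): number {
  return baseSeconds + Math.floor(Math.random() * spreadSeconds);
}

// e.g. await cache.set(`product:${id}`, product, jitteredTtl(3600));
```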
Invalidate cache entries when the underlying data changes, rather than waiting for TTL:
// After any write operation, invalidate related cache keys
class UserService {
async update(id: string, data: UpdateUserInput): Promise<User> {
const user = await db.user.update({ where: { id }, data });
await this.invalidateUserCache(id);
return user;
}
async invalidateUserCache(id: string): Promise<void> {
await Promise.all([
cache.del(`user:${id}`),
cache.del(`user:${id}:profile`),
cache.del(`user:${id}:posts`),
// Wildcard invalidation for list caches
cache.delByPattern(`users:list:*`),
]);
}
}
Tag-based invalidation lets you invalidate all cache entries associated with a resource:
async function setWithTags<T>(
key: string,
value: T,
ttl: number,
tags: string[]
): Promise<void> {
const redis = getRedis();
const pipeline = redis.pipeline();
// Store the value
pipeline.setex(key, ttl, JSON.stringify(value));
// Add the key to each tag set
tags.forEach((tag) => {
pipeline.sadd(`tag:${tag}`, key);
pipeline.expire(`tag:${tag}`, ttl + 60); // Tag lives slightly longer
});
await pipeline.exec();
}
async function invalidateTag(tag: string): Promise<void> {
const redis = getRedis();
const tagKey = `tag:${tag}`;
// Get all cache keys for this tag
const keys = await redis.smembers(tagKey);
if (keys.length === 0) return;
// Delete all tagged keys and the tag itself
await redis.del(...keys, tagKey);
}
// Usage:
await setWithTags(`post:${postId}`, post, 3600, [`post:${postId}`, `user:${post.userId}:posts`]);
// Invalidate all of a user's post caches at once
await invalidateTag(`user:${userId}:posts`);
// Cache complete HTTP responses for public, read-heavy endpoints
import { Request, Response, NextFunction } from 'express';
export function httpCache(ttlSeconds: number) {
return async (req: Request, res: Response, next: NextFunction) => {
// Only cache GET requests
if (req.method !== 'GET') return next();
const key = `http:${req.originalUrl}`;
const cached = await cache.get<{ body: unknown; headers: Record<string, string> }>(key);
if (cached) {
res.set(cached.headers);
res.set('X-Cache', 'HIT');
return res.json(cached.body);
}
// Intercept the response
const originalJson = res.json.bind(res);
res.json = (body: unknown) => {
res.set('X-Cache', 'MISS');
const headers = { 'Content-Type': 'application/json' };
cache.set(key, { body, headers }, ttlSeconds).catch(console.error);
return originalJson(body);
};
next();
};
}
// Apply to specific routes
router.get('/products', httpCache(300), productsController.list);
// Cache expensive aggregations
async function getDashboardStats(userId: string) {
const key = `dashboard:${userId}:stats`;
return cache.getOrSet(key, async () => {
// Expensive multi-table aggregation
const [postCount, commentCount, viewTotal] = await Promise.all([
db.post.count({ where: { authorId: userId } }),
db.comment.count({ where: { authorId: userId } }),
db.post.aggregate({
where: { authorId: userId },
_sum: { viewCount: true },
}),
]);
return {
postCount,
commentCount,
totalViews: viewTotal._sum.viewCount ?? 0,
};
}, 300); // 5-minute cache for dashboard stats
}
Track hit rate, latency, and memory usage to ensure your cache is actually helping:
// Wrap cache operations to emit metrics
class InstrumentedCache extends CacheService {
private hits = 0;
private misses = 0;
async get<T>(key: string): Promise<T | null> {
const value = await super.get<T>(key);
if (value !== null) {
this.hits++;
} else {
this.misses++;
}
return value;
}
getHitRate(): number {
const total = this.hits + this.misses;
return total === 0 ? 0 : this.hits / total;
}
}
// Log Redis INFO STATS periodically
async function logRedisMetrics() {
const redis = getRedis();
const info = await redis.info('stats');
// Parse keyspace_hits, keyspace_misses, instantaneous_ops_per_sec
logger.info({ type: 'redis_metrics', info });
}
One operational warning: KEYS scans the entire keyspace and blocks Redis while it runs; prefer SCAN for pattern matching in production. And when Redis itself is unavailable, degrade gracefully to the database rather than failing the request:
// Graceful degradation when Redis is unavailable
async function getWithFallback<T>(
key: string,
fetcher: () => Promise<T>,
ttl: number
): Promise<T> {
try {
return await cache.getOrSet(key, fetcher, ttl);
} catch (redisError) {
logger.warn({ msg: 'Cache unavailable, falling through to DB', key, err: redisError });
return fetcher(); // Always returns data, just slower
}
}
Effective Redis caching can reduce database load by 90%+ and cut response times from hundreds of milliseconds to single digits. The key patterns to master are: cache-aside for general reads, write-through where readers need consistency, stampede protection for hot keys, TTL plus event-based invalidation for freshness, and tag-based invalidation for groups of related entries.
If you found this article helpful, check out DevToolkit — 40+ free browser-based developer tools with no signup required.
Popular tools: JSON Formatter · Regex Tester · JWT Decoder · Base64 Encoder
🛒 Get the DevToolkit Starter Kit on Gumroad — source code, deployment guide, and customization templates.
2026-03-25 13:13:08
React Hooks, introduced in React 16.8, fundamentally changed how we write React components. They let you use state and other React features in functional components — no classes required. This guide covers every built-in hook with real examples, common patterns, and the pitfalls to avoid.
Before hooks, stateful logic was locked inside class components, which led to several problems: complex lifecycle methods that forced unrelated logic together, difficulty reusing stateful logic between components, and confusing this bindings. Hooks solve all of these by letting you extract stateful logic into reusable functions.
The most fundamental hook. useState adds state to a functional component and returns the current value plus a function to update it.
import { useState } from 'react';
function Counter() {
const [count, setCount] = useState(0); // initial value: 0
return (
<div>
<p>Count: {count}</p>
<button onClick={() => setCount(count + 1)}>+</button>
<button onClick={() => setCount(count - 1)}>-</button>
<button onClick={() => setCount(0)}>Reset</button>
</div>
);
}
When the new state depends on the previous state, use the functional form of the setter. This is important in async contexts where the state value might be stale.
// ❌ Potentially stale
setCount(count + 1);
// ✅ Always uses the latest state
setCount(prevCount => prevCount + 1);
// Practical example: toggling
const [isOpen, setIsOpen] = useState(false);
const toggle = () => setIsOpen(prev => !prev);
// Always replace, never mutate
const [user, setUser] = useState({ name: 'Alice', age: 30 });
// ❌ Wrong — mutates state directly
user.age = 31;
setUser(user);
// ✅ Correct — create a new object
setUser(prev => ({ ...prev, age: 31 }));
// Array state
const [items, setItems] = useState([]);
const addItem = (item) => setItems(prev => [...prev, item]);
const removeItem = (id) => setItems(prev => prev.filter(i => i.id !== id));
const updateItem = (id, changes) => setItems(prev =>
prev.map(i => i.id === id ? { ...i, ...changes } : i)
);
// If initial state is expensive to compute, pass a function
// The function only runs once on mount
const [data, setData] = useState(() => {
const stored = localStorage.getItem('data');
return stored ? JSON.parse(stored) : defaultData;
});
useEffect lets you perform side effects in functional components: data fetching, subscriptions, DOM manipulation, timers.
import { useState, useEffect } from 'react';
function UserProfile({ userId }) {
const [user, setUser] = useState(null);
const [loading, setLoading] = useState(true);
useEffect(() => {
// This runs after every render where userId changed
let cancelled = false;
async function fetchUser() {
setLoading(true);
try {
const response = await fetch(`/api/users/${userId}`);
const data = await response.json();
if (!cancelled) setUser(data);
} finally {
if (!cancelled) setLoading(false);
}
}
fetchUser();
// Cleanup function: runs when component unmounts or userId changes
return () => { cancelled = true; };
}, [userId]); // dependency array: re-run when userId changes
if (loading) return <p>Loading...</p>;
return <p>{user?.name}</p>;
}
// Run on every render (no dependency array)
useEffect(() => { console.log('rendered'); });
// Run only once on mount (empty dependency array)
useEffect(() => {
initAnalytics();
return () => cleanup();
}, []);
// Run when specific values change
useEffect(() => {
document.title = `${count} notifications`;
}, [count]);
// Cleanup patterns
useEffect(() => {
const subscription = subscribe(topic, handler);
return () => subscription.unsubscribe(); // runs on unmount
}, [topic]);
useEffect(() => {
const id = setInterval(tick, 1000);
return () => clearInterval(id);
}, []);
useEffect(() => {
window.addEventListener('resize', handleResize);
return () => window.removeEventListener('resize', handleResize);
}, []);
useContext lets you consume React context without nesting render props or consumer components.
import { createContext, useContext, useState } from 'react';
// 1. Create context
const ThemeContext = createContext('light');
// 2. Provide context
function App() {
const [theme, setTheme] = useState('light');
return (
<ThemeContext.Provider value={{ theme, setTheme }}>
<Main />
</ThemeContext.Provider>
);
}
// 3. Consume context anywhere in the tree
function ThemedButton() {
const { theme, setTheme } = useContext(ThemeContext);
const toggle = () => setTheme(t => t === 'light' ? 'dark' : 'light');
return (
<button
style={{ background: theme === 'light' ? '#fff' : '#333' }}
onClick={toggle}
>
Current: {theme}
</button>
);
}
// Custom hook for context (recommended pattern)
function useTheme() {
const context = useContext(ThemeContext);
if (!context) throw new Error('useTheme must be inside ThemeProvider');
return context;
}
useRef returns a mutable ref object that persists across renders without causing re-renders. Use it for DOM references and storing mutable values.
import { useRef, useEffect } from 'react';
function TextInput() {
const inputRef = useRef(null);
useEffect(() => {
inputRef.current.focus(); // auto-focus on mount
}, []);
return <input ref={inputRef} type="text" />;
}
// Store previous value
function usePrevious(value) {
const ref = useRef();
useEffect(() => { ref.current = value; });
return ref.current; // returns previous value before this render
}
// Store a mutable value that doesn't trigger re-renders
function Timer() {
const [isRunning, setIsRunning] = useState(false);
const intervalRef = useRef(null);
const start = () => {
setIsRunning(true);
intervalRef.current = setInterval(() => tick(), 1000);
};
const stop = () => {
setIsRunning(false);
clearInterval(intervalRef.current);
};
return (
<div>
{isRunning
? <button onClick={stop}>Stop</button>
: <button onClick={start}>Start</button>
}
</div>
);
}
useMemo memoizes an expensive computation, only recalculating when dependencies change.
import { useMemo } from 'react';
function ProductList({ products, filter, sortBy }) {
// Only recomputed when products, filter, or sortBy changes
const processedProducts = useMemo(() => {
return products
.filter(p => p.category === filter || filter === 'all')
.sort((a, b) => {
if (sortBy === 'price') return a.price - b.price;
if (sortBy === 'name') return a.name.localeCompare(b.name);
return 0;
});
}, [products, filter, sortBy]);
return (
<ul>
{processedProducts.map(p => (
<li key={p.id}>{p.name} — ${p.price}</li>
))}
</ul>
);
}
Don't overuse useMemo. The memoization itself has a cost. Only use it for genuinely expensive computations (complex data transformations, sorting large arrays) or when referential equality matters for child component re-renders.
useCallback memoizes a function itself. It's most useful when passing callbacks to optimized child components that rely on reference equality to skip re-renders.
import { useCallback, memo } from 'react';
// Child component wrapped in memo — only re-renders if props change
const Button = memo(({ onClick, children }) => {
console.log('Button rendered');
return <button onClick={onClick}>{children}</button>;
});
function Parent() {
const [count, setCount] = useState(0);
const [text, setText] = useState('');
// Without useCallback: new function reference on every render
// → Button re-renders every time text changes
// With useCallback: stable reference unless count changes
const handleClick = useCallback(() => {
setCount(prev => prev + 1);
}, []); // no dependencies — function never changes
return (
<div>
<input value={text} onChange={e => setText(e.target.value)} />
<Button onClick={handleClick}>Count: {count}</Button>
</div>
);
}
useReducer is an alternative to useState for complex state logic with multiple sub-values or when the next state depends on the previous state in non-trivial ways.
import { useReducer } from 'react';
const initialState = {
items: [],
total: 0,
loading: false,
error: null,
};
function cartReducer(state, action) {
switch (action.type) {
case 'ADD_ITEM':
return {
...state,
items: [...state.items, action.payload],
total: state.total + action.payload.price,
};
case 'REMOVE_ITEM': {
const removed = state.items.find(i => i.id === action.payload);
return {
...state,
items: state.items.filter(i => i.id !== action.payload),
total: state.total - (removed?.price ?? 0),
};
}
case 'SET_LOADING':
return { ...state, loading: action.payload };
case 'SET_ERROR':
return { ...state, error: action.payload, loading: false };
case 'CLEAR':
return initialState;
default:
throw new Error(`Unknown action: ${action.type}`);
}
}
function Cart() {
const [state, dispatch] = useReducer(cartReducer, initialState);
const addToCart = (product) =>
dispatch({ type: 'ADD_ITEM', payload: product });
const removeFromCart = (id) =>
dispatch({ type: 'REMOVE_ITEM', payload: id });
return (
<div>
<h2>Cart ({state.items.length} items)</h2>
<p>Total: ${state.total.toFixed(2)}</p>
</div>
);
}
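Because `cartReducer` is a pure function, you can unit-test it without rendering anything: each dispatch is just a function call returning the next state. A sketch, with a condensed copy of the two item actions so the snippet stands alone:

```typescript
// Condensed copy of the ADD_ITEM / REMOVE_ITEM cases from cartReducer above.
type Item = { id: string; price: number };
type State = { items: Item[]; total: number };
type Action =
  | { type: 'ADD_ITEM'; payload: Item }
  | { type: 'REMOVE_ITEM'; payload: string };

function cartReducer(state: State, action: Action): State {
  switch (action.type) {
    case 'ADD_ITEM':
      return {
        ...state,
        items: [...state.items, action.payload],
        total: state.total + action.payload.price,
      };
    case 'REMOVE_ITEM': {
      const removed = state.items.find((i) => i.id === action.payload);
      return {
        ...state,
        items: state.items.filter((i) => i.id !== action.payload),
        total: state.total - (removed?.price ?? 0),
      };
    }
  }
}

// State transitions are plain values — no component needed.
const s0: State = { items: [], total: 0 };
const s1 = cartReducer(s0, { type: 'ADD_ITEM', payload: { id: 'a', price: 9.99 } });
const s2 = cartReducer(s1, { type: 'REMOVE_ITEM', payload: 'a' });
```

This testability is one of the main reasons to reach for useReducer over several intertwined useState calls.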
Custom hooks let you extract and reuse stateful logic. A custom hook is just a function that starts with use and calls other hooks.
// useFetch — data fetching with loading/error states
function useFetch(url) {
const [data, setData] = useState(null);
const [loading, setLoading] = useState(true);
const [error, setError] = useState(null);
useEffect(() => {
let cancelled = false;
setLoading(true);
setError(null);
fetch(url)
.then(res => {
if (!res.ok) throw new Error(`HTTP ${res.status}`);
return res.json();
})
.then(data => { if (!cancelled) setData(data); })
.catch(err => { if (!cancelled) setError(err.message); })
.finally(() => { if (!cancelled) setLoading(false); });
return () => { cancelled = true; };
}, [url]);
return { data, loading, error };
}
// Usage
function UserList() {
const { data: users, loading, error } = useFetch('/api/users');
if (loading) return <p>Loading...</p>;
if (error) return <p>Error: {error}</p>;
return <ul>{users.map(u => <li key={u.id}>{u.name}</li>)}</ul>;
}
// useLocalStorage — persist state to localStorage
function useLocalStorage(key, initialValue) {
const [value, setValue] = useState(() => {
try {
const item = localStorage.getItem(key);
return item ? JSON.parse(item) : initialValue;
} catch {
return initialValue;
}
});
const setStoredValue = (newValue) => {
setValue(newValue);
localStorage.setItem(key, JSON.stringify(newValue));
};
return [value, setStoredValue];
}
// useDebounce — debounce a value
function useDebounce(value, delay) {
const [debouncedValue, setDebouncedValue] = useState(value);
useEffect(() => {
const timer = setTimeout(() => setDebouncedValue(value), delay);
return () => clearTimeout(timer);
}, [value, delay]);
return debouncedValue;
}
// useWindowSize — track window dimensions
function useWindowSize() {
const [size, setSize] = useState({
width: window.innerWidth,
height: window.innerHeight,
});
useEffect(() => {
const handleResize = () => setSize({
width: window.innerWidth,
height: window.innerHeight,
});
window.addEventListener('resize', handleResize);
return () => window.removeEventListener('resize', handleResize);
}, []);
return size;
}
React enforces two rules (and the ESLint plugin eslint-plugin-react-hooks enforces them automatically): only call hooks at the top level of a component, never inside loops, conditions, or nested functions; and only call hooks from React function components or from custom hooks.
// ❌ Wrong — conditional hook call
function Component({ condition }) {
if (condition) {
const [val, setVal] = useState(0); // breaks rules!
}
}
// ✅ Correct — condition inside the hook logic
function Component({ condition }) {
const [val, setVal] = useState(0);
const displayVal = condition ? val : null;
}
useState(initial) // local state
useEffect(fn, deps) // side effects, subscriptions
useContext(Context) // consume context
useRef(initialValue) // DOM refs, mutable values
useMemo(() => value, deps) // memoize expensive computation
useCallback(fn, deps) // memoize function reference
useReducer(reducer, init) // complex state logic
useId() // generate unique IDs (React 18)
useTransition() // mark updates as non-urgent (React 18)
useDeferredValue(value) // defer re-rendering (React 18)
useLayoutEffect(fn, deps) // like useEffect but synchronous
useImperativeHandle // customize ref values
useDebugValue // custom label in DevTools
Hooks take practice, and getting useEffect right is where most people stumble at first, but once they click, they make React development significantly more pleasant. Start with useState and useEffect, then add useContext and useRef as you need them. Build custom hooks for anything you find yourself copy-pasting between components.
2026-03-25 13:11:47
Building a REST API with Node.js is straightforward. Building one that's maintainable at scale — with proper typing, input validation, authentication, error handling, and testability — takes significantly more thought. TypeScript eliminates whole categories of bugs that plague JavaScript APIs. Combined with modern tooling like Zod for runtime validation, Prisma for database access, and structured error handling, you get an API codebase that a team can confidently work in for years.
This guide walks through building a production-ready REST API from scratch. You can test the endpoints we build with our API Tester and format response payloads with the JSON Formatter.
mkdir my-api && cd my-api
npm init -y
# TypeScript + build tools
npm install -D typescript ts-node nodemon @types/node
npm install -D @types/express
# Core dependencies
npm install express
npm install zod # Runtime validation
npm install prisma @prisma/client # Database ORM
npm install jsonwebtoken @types/jsonwebtoken # JWT auth
npm install bcryptjs @types/bcryptjs # Password hashing
npm install helmet cors # Security middleware
npm install express-rate-limit # Rate limiting
npm install winston # Structured logging
npm install -D jest @types/jest ts-jest supertest @types/supertest # Testing
// tsconfig.json
{
"compilerOptions": {
"target": "ES2022",
"module": "CommonJS",
"lib": ["ES2022"],
"outDir": "./dist",
"rootDir": "./src",
"strict": true,
"noUncheckedIndexedAccess": true,
"exactOptionalPropertyTypes": true,
"noImplicitReturns": true,
"noFallthroughCasesInSwitch": true,
"esModuleInterop": true,
"skipLibCheck": true,
"forceConsistentCasingInFileNames": true,
"resolveJsonModule": true,
"declaration": true,
"declarationMap": true,
"sourceMap": true
},
"include": ["src"],
"exclude": ["node_modules", "dist"]
}
src/
├── app.ts # Express app setup (no server.listen)
├── server.ts # Entry point (server.listen)
├── config/
│ └── env.ts # Validated environment variables
├── middleware/
│ ├── auth.ts # JWT verification middleware
│ ├── errorHandler.ts # Central error handler
│ └── validate.ts # Zod validation middleware
├── modules/
│ └── users/
│ ├── users.router.ts
│ ├── users.controller.ts
│ ├── users.service.ts
│ └── users.schema.ts
├── lib/
│ ├── db.ts # Prisma client singleton
│ ├── logger.ts # Winston logger
│ └── errors.ts # Custom error classes
└── types/
└── express.d.ts # Express type augmentation
Never trust process.env — validate and type it at startup:
// src/config/env.ts
import { z } from 'zod';
const envSchema = z.object({
NODE_ENV: z.enum(['development', 'test', 'production']).default('development'),
PORT: z.coerce.number().int().positive().default(3000),
DATABASE_URL: z.string().url(),
JWT_SECRET: z.string().min(32, 'JWT_SECRET must be at least 32 characters'),
JWT_EXPIRES_IN: z.string().default('7d'),
BCRYPT_ROUNDS: z.coerce.number().int().min(10).max(14).default(12),
RATE_LIMIT_WINDOW_MS: z.coerce.number().default(15 * 60 * 1000), // 15 min
RATE_LIMIT_MAX: z.coerce.number().default(100),
CORS_ORIGIN: z.string().default('*'),
});
// Parse and validate — throws at startup if invalid
const parsed = envSchema.safeParse(process.env);
if (!parsed.success) {
console.error('❌ Invalid environment variables:');
console.error(parsed.error.flatten().fieldErrors);
process.exit(1);
}
export const env = parsed.data;
export type Env = z.infer<typeof envSchema>;
// src/lib/errors.ts
export class AppError extends Error {
constructor(
public message: string,
public statusCode: number,
public code: string,
public details?: unknown
) {
super(message);
this.name = 'AppError';
Error.captureStackTrace(this, this.constructor);
}
}
export class NotFoundError extends AppError {
constructor(resource: string, id?: string | number) {
super(
id ? `${resource} with id '${id}' not found` : `${resource} not found`,
404,
'NOT_FOUND'
);
}
}
export class UnauthorizedError extends AppError {
constructor(message = 'Authentication required') {
super(message, 401, 'UNAUTHORIZED');
}
}
export class ForbiddenError extends AppError {
constructor(message = 'Insufficient permissions') {
super(message, 403, 'FORBIDDEN');
}
}
export class ConflictError extends AppError {
constructor(message: string) {
super(message, 409, 'CONFLICT');
}
}
export class ValidationError extends AppError {
constructor(details: unknown) {
super('Validation failed', 422, 'VALIDATION_ERROR', details);
}
}
// src/middleware/validate.ts
import { Request, Response, NextFunction } from 'express';
import { AnyZodObject, ZodError } from 'zod';
import { ValidationError } from '../lib/errors';
type RequestSchema = {
body?: AnyZodObject;
query?: AnyZodObject;
params?: AnyZodObject;
};
export function validate(schema: RequestSchema) {
return async (req: Request, res: Response, next: NextFunction) => {
try {
if (schema.body) req.body = await schema.body.parseAsync(req.body);
if (schema.query) req.query = await schema.query.parseAsync(req.query);
if (schema.params) req.params = await schema.params.parseAsync(req.params);
next();
} catch (err) {
if (err instanceof ZodError) {
next(new ValidationError(err.flatten().fieldErrors));
} else {
next(err);
}
}
};
}
// src/types/express.d.ts — augment Express Request
import { User } from '@prisma/client';
declare global {
namespace Express {
interface Request {
user?: Omit<User, 'password'>;
}
}
}
// src/middleware/auth.ts
import { Request, Response, NextFunction } from 'express';
import jwt from 'jsonwebtoken';
import { env } from '../config/env';
import { UnauthorizedError } from '../lib/errors';
import { db } from '../lib/db';
interface JwtPayload {
sub: string; // user id
iat: number;
exp: number;
}
export async function requireAuth(req: Request, res: Response, next: NextFunction) {
try {
const authHeader = req.headers.authorization;
if (!authHeader?.startsWith('Bearer ')) {
throw new UnauthorizedError('Missing or invalid authorization header');
}
const token = authHeader.slice(7);
let payload: JwtPayload;
try {
payload = jwt.verify(token, env.JWT_SECRET) as JwtPayload;
} catch {
throw new UnauthorizedError('Invalid or expired token');
}
const user = await db.user.findUnique({
where: { id: payload.sub },
omit: { password: true },
});
if (!user) throw new UnauthorizedError('User not found');
req.user = user;
next();
} catch (err) {
next(err);
}
}
// Optional auth — attaches user if token present but doesn't require it
export async function optionalAuth(req: Request, res: Response, next: NextFunction) {
const authHeader = req.headers.authorization;
if (!authHeader?.startsWith('Bearer ')) return next();
try {
const token = authHeader.slice(7);
const payload = jwt.verify(token, env.JWT_SECRET) as JwtPayload;
const user = await db.user.findUnique({
where: { id: payload.sub },
omit: { password: true },
});
if (user) req.user = user;
} catch {
// Ignore invalid tokens for optional auth
}
next();
}
// src/modules/users/users.schema.ts
import { z } from 'zod';
export const createUserSchema = {
body: z.object({
email: z.string().email(),
password: z.string().min(8).max(100),
name: z.string().min(1).max(100).trim(),
}),
};
export const loginSchema = {
body: z.object({
email: z.string().email(),
password: z.string(),
}),
};
export const updateUserSchema = {
params: z.object({ id: z.string().cuid() }),
body: z.object({
name: z.string().min(1).max(100).trim().optional(),
email: z.string().email().optional(),
}).strict(), // Reject unknown fields
};
export type CreateUserInput = z.infer<typeof createUserSchema.body>;
export type LoginInput = z.infer<typeof loginSchema.body>;
export type UpdateUserInput = z.infer<typeof updateUserSchema.body>;
// src/modules/users/users.service.ts
import bcrypt from 'bcryptjs';
import jwt from 'jsonwebtoken';
import { db } from '../../lib/db';
import { env } from '../../config/env';
import { NotFoundError, ConflictError, UnauthorizedError } from '../../lib/errors';
import type { CreateUserInput, LoginInput, UpdateUserInput } from './users.schema';
export async function createUser(input: CreateUserInput) {
const existing = await db.user.findUnique({ where: { email: input.email } });
if (existing) throw new ConflictError('Email already registered');
const hashedPassword = await bcrypt.hash(input.password, env.BCRYPT_ROUNDS);
const user = await db.user.create({
data: { ...input, password: hashedPassword },
omit: { password: true },
});
return user;
}
export async function login(input: LoginInput) {
const user = await db.user.findUnique({ where: { email: input.email } });
if (!user) throw new UnauthorizedError('Invalid credentials');
const valid = await bcrypt.compare(input.password, user.password);
if (!valid) throw new UnauthorizedError('Invalid credentials');
const token = jwt.sign({ sub: user.id }, env.JWT_SECRET, {
expiresIn: env.JWT_EXPIRES_IN,
});
const { password: _, ...userWithoutPassword } = user;
return { user: userWithoutPassword, token };
}
export async function getUserById(id: string) {
const user = await db.user.findUnique({
where: { id },
omit: { password: true },
});
if (!user) throw new NotFoundError('User', id);
return user;
}
export async function updateUser(id: string, input: UpdateUserInput) {
await getUserById(id); // Ensures user exists, throws 404 if not
return db.user.update({
where: { id },
data: input,
omit: { password: true },
});
}
export async function deleteUser(id: string) {
await getUserById(id);
await db.user.delete({ where: { id } });
}
// src/modules/users/users.controller.ts
import { Request, Response } from 'express';
import * as usersService from './users.service';
export async function createUser(req: Request, res: Response) {
const user = await usersService.createUser(req.body);
res.status(201).json({ data: user });
}
export async function login(req: Request, res: Response) {
const result = await usersService.login(req.body);
res.json({ data: result });
}
export async function getMe(req: Request, res: Response) {
res.json({ data: req.user });
}
export async function getUserById(req: Request, res: Response) {
const user = await usersService.getUserById(req.params.id!);
res.json({ data: user });
}
export async function updateUser(req: Request, res: Response) {
const user = await usersService.updateUser(req.params.id!, req.body);
res.json({ data: user });
}
export async function deleteUser(req: Request, res: Response) {
await usersService.deleteUser(req.params.id!);
res.status(204).send();
}
// src/modules/users/users.router.ts
import { Router } from 'express';
import { validate } from '../../middleware/validate';
import { requireAuth } from '../../middleware/auth';
import * as usersController from './users.controller';
import { createUserSchema, loginSchema, updateUserSchema } from './users.schema';
const router = Router();
// Public routes
router.post('/register', validate(createUserSchema), usersController.createUser);
router.post('/login', validate(loginSchema), usersController.login);
// Protected routes
router.get('/me', requireAuth, usersController.getMe);
router.get('/:id', requireAuth, usersController.getUserById);
router.patch('/:id', requireAuth, validate(updateUserSchema), usersController.updateUser);
router.delete('/:id', requireAuth, usersController.deleteUser);
export { router as usersRouter };
// src/middleware/errorHandler.ts
import { Request, Response, NextFunction } from 'express';
import { AppError } from '../lib/errors';
import { logger } from '../lib/logger';
import { env } from '../config/env';
export function errorHandler(
err: Error,
req: Request,
res: Response,
next: NextFunction
) {
if (err instanceof AppError) {
// Known, intentional errors
if (err.statusCode >= 500) {
logger.error({ err, req: { method: req.method, url: req.url } });
}
res.status(err.statusCode).json({
error: {
code: err.code,
message: err.message,
...(err.details ? { details: err.details } : {}),
...(env.NODE_ENV === 'development' ? { stack: err.stack } : {}),
},
});
} else {
// Unknown errors — log everything, expose little
logger.error({ err, req: { method: req.method, url: req.url } });
res.status(500).json({
error: {
code: 'INTERNAL_SERVER_ERROR',
message: 'An unexpected error occurred',
...(env.NODE_ENV === 'development' ? { originalError: err.message } : {}),
},
});
}
}
Express 4 doesn't catch async errors automatically. Wrap every async route handler:
// src/lib/asyncHandler.ts
import { Request, Response, NextFunction, RequestHandler } from 'express';
type AsyncRequestHandler = (req: Request, res: Response, next: NextFunction) => Promise<void>;
export function asyncHandler(fn: AsyncRequestHandler): RequestHandler {
return (req, res, next) => {
fn(req, res, next).catch(next);
};
}
// Usage in router:
router.get('/:id', requireAuth, asyncHandler(usersController.getUserById));
// Note: Express 5 (stable since late 2024) forwards rejected promises from
// route handlers to the error middleware automatically, making asyncHandler
// unnecessary once you upgrade.
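You can sanity-check the wrapper without spinning up Express: wrap a handler that rejects and confirm the error reaches next. (The fake req/res objects here are just placeholders.)

```javascript
// Minimal sanity check for the asyncHandler wrapper — no Express needed.
function asyncHandler(fn) {
  return (req, res, next) => {
    fn(req, res, next).catch(next);
  };
}

const boom = new Error('db exploded');
const failingHandler = asyncHandler(async () => {
  throw boom; // simulates a rejected DB call inside a route handler
});

// Fake req/res; next captures whatever error reaches it
let captured = null;
failingHandler({}, {}, (err) => { captured = err; });

// The rejection is delivered asynchronously (on the microtask queue),
// so inspect it after the current tick.
queueMicrotask(() => {
  console.log(captured === boom); // the same Error object reaches next()
});
```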
// src/app.ts
import express from 'express';
import helmet from 'helmet';
import cors from 'cors';
import rateLimit from 'express-rate-limit';
import { env } from './config/env';
import { errorHandler } from './middleware/errorHandler';
import { usersRouter } from './modules/users/users.router';
import { logger } from './lib/logger';
export const app = express();
// Security middleware
app.use(helmet());
app.use(cors({ origin: env.CORS_ORIGIN, credentials: true }));
app.use(rateLimit({
windowMs: env.RATE_LIMIT_WINDOW_MS,
max: env.RATE_LIMIT_MAX,
standardHeaders: true,
legacyHeaders: false,
}));
// Body parsing
app.use(express.json({ limit: '10mb' }));
app.use(express.urlencoded({ extended: true }));
// Request logging
app.use((req, res, next) => {
const start = Date.now();
res.on('finish', () => {
logger.info({
method: req.method,
url: req.url,
status: res.statusCode,
duration: Date.now() - start,
});
});
next();
});
// Routes
app.get('/health', (req, res) => res.json({ status: 'ok', uptime: process.uptime() }));
app.use('/api/v1/users', usersRouter);
// 404 handler
app.use((req, res) => {
res.status(404).json({ error: { code: 'NOT_FOUND', message: 'Route not found' } });
});
// Central error handler (must be last)
app.use(errorHandler);
// src/modules/users/__tests__/users.test.ts
import request from 'supertest';
import { app } from '../../../app';
import { db } from '../../../lib/db';
beforeEach(async () => {
// Clean database before each test (use transactions for speed)
await db.user.deleteMany();
});
afterAll(async () => {
await db.$disconnect();
});
describe('POST /api/v1/users/register', () => {
it('creates a new user', async () => {
const res = await request(app)
.post('/api/v1/users/register')
.send({ email: '[email protected]', password: 'Password123!', name: 'Test User' });
expect(res.status).toBe(201);
expect(res.body.data).toMatchObject({
email: '[email protected]',
name: 'Test User',
});
expect(res.body.data.password).toBeUndefined(); // Never return password
});
it('returns 409 for duplicate email', async () => {
const user = { email: '[email protected]', password: 'Password123!', name: 'Test' };
await request(app).post('/api/v1/users/register').send(user);
const res = await request(app).post('/api/v1/users/register').send(user);
expect(res.status).toBe(409);
expect(res.body.error.code).toBe('CONFLICT');
});
it('validates email format', async () => {
const res = await request(app)
.post('/api/v1/users/register')
.send({ email: 'notanemail', password: 'Password123!', name: 'Test' });
expect(res.status).toBe(422);
expect(res.body.error.code).toBe('VALIDATION_ERROR');
});
});
Never return password fields in API responses — strip them with Prisma's omit option or an explicit select.
If you found this article helpful, check out DevToolkit — 40+ free browser-based developer tools with no signup required.
Popular tools: JSON Formatter · Regex Tester · JWT Decoder · Base64 Encoder
🛒 Get the DevToolkit Starter Kit on Gumroad — source code, deployment guide, and customization templates.
2026-03-25 13:10:45
Monorepos — single Git repositories containing multiple projects or packages — have become the default architecture at companies like Google, Meta, and Microsoft. In the JavaScript ecosystem, the rise of monorepos has spawned a competitive tooling landscape: Nx, Turborepo, and Lerna are the three most prominent options, each bringing a different philosophy to the same core problem: making large codebases fast and maintainable.
This guide compares all three in depth — their architecture, caching strategies, task orchestration, ecosystem integrations, and the scenarios where each shines.
Before comparing tools, it's worth understanding what problem they solve. A monorepo containing 20 packages has 20 sets of dependencies, 20 build pipelines, and 20 test suites. Without tooling, running npm install at the root and then building every package in sequence can take 30+ minutes on any non-trivial codebase.
Monorepo tools solve this with three mechanisms: task orchestration (running tasks in dependency order, in parallel wherever possible), caching (skipping tasks whose inputs haven't changed, locally and, with remote caching, across a whole team), and change detection (running only the tasks affected by a given change).
Lerna was the first major JavaScript monorepo tool, released in 2015. It pioneered the concept of managing multiple npm packages in a single repository and automating versioning and publishing. For years, "monorepo" and "Lerna" were nearly synonymous in the JS community.
Lerna works on top of a package manager's workspace protocol (npm workspaces, yarn workspaces, or pnpm workspaces). It provides two key capabilities:
# lerna.json
{
"version": "independent",
"npmClient": "pnpm",
"useWorkspaces": true,
"packages": ["packages/*", "apps/*"],
"command": {
"publish": {
"conventionalCommits": true,
"message": "chore(release): publish"
}
}
}
# Run build across all changed packages
npx lerna run build --since origin/main
# Publish all changed packages
npx lerna publish --conventional-commits
In 2022, Nrwl (the company behind Nx) took over Lerna's maintenance and integrated Nx's task runner as Lerna's default engine. Modern Lerna (v6+) uses Nx under the hood for caching and task orchestration, meaning you get Lerna's publishing automation with Nx's performance.
# lerna.json — enable Nx integration
{
"version": "independent",
"useWorkspaces": true,
"$schema": "node_modules/lerna/schemas/lerna-schema.json",
"useNx": true, // Enable Nx task runner
"npmClient": "npm"
}
Lerna's sweet spot is open-source libraries that publish multiple packages to npm. If your primary concern is versioning coordination and publishing automation across packages (e.g., a component library shipping 20 independent packages), Lerna's conventional commits integration and changelog generation are best-in-class. For pure task running and caching, Turborepo or Nx are stronger.
Turborepo was built by Jared Palmer and acquired by Vercel in 2021. Its philosophy is radical simplicity: a minimal configuration file, zero opinions about your project structure, and an obsessive focus on making builds fast through caching.
Turborepo reads your turbo.json to understand task dependencies and caching rules, then orchestrates task execution with maximum parallelism:
// turbo.json
{
"$schema": "https://turbo.build/schema.json",
"tasks": {
"build": {
"dependsOn": ["^build"], // ^ means "all dependencies' build first"
"inputs": ["src/**", "package.json", "tsconfig.json"],
"outputs": ["dist/**", ".next/**"]
},
"test": {
"dependsOn": ["build"],
"inputs": ["src/**", "tests/**"],
"outputs": []
},
"lint": {
"inputs": ["src/**", ".eslintrc*"],
"outputs": []
},
"dev": {
"persistent": true,
"cache": false // Never cache dev servers
}
}
}
With this config, turbo build will build each package's dependencies first (that's the ^build in dependsOn), hash each task's declared inputs, and skip any task whose hash matches a previous run, restoring its cached outputs instead.
Turborepo's killer feature is remote caching — sharing the build cache across your entire team and CI/CD pipeline. When developer A builds @acme/ui, the result is pushed to Vercel's remote cache. When developer B runs the same build with identical inputs, they get a cache hit and skip the build entirely.
# Link to Vercel remote cache
npx turbo login
npx turbo link
# Now every turbo run is potentially a cache hit from any team member's build
turbo build
# Output:
# @acme/ui:build: cache hit, replaying output...
# @acme/utils:build: cache hit, replaying output...
# @acme/web:build: >>> FULL TURBO (all dependencies were cached)
# Tasks: 12 successful, 12 total
# Cached: 12 cached, 12 total
# Time: 847ms >>> FULL TURBO
You can also self-host the remote cache using open-source implementations such as turborepo-remote-cache.
my-monorepo/
├── apps/
│ ├── web/ # Next.js app
│ └── docs/ # Docusaurus site
├── packages/
│ ├── ui/ # Shared React components
│ ├── config/ # Shared configs (eslint, tsconfig)
│ └── utils/ # Shared utilities
├── turbo.json
└── package.json # Workspace root with workspaces field
Turborepo is ideal for application-focused monorepos where speed and simplicity are the priorities. If you're building a suite of Next.js apps with shared packages and want to set up a monorepo in 30 minutes with excellent Vercel deployment integration, Turborepo is the natural choice. Its minimal configuration surface means less to learn and maintain.
Nx, developed by Nrwl, is the most feature-rich of the three tools. It goes well beyond task running to provide a full development platform: code generation, project graph visualization, automated migrations, and first-class integrations for React, Angular, Node.js, Next.js, and dozens of other frameworks.
Nx builds a project graph by analyzing your source code — not just package.json imports, but actual TypeScript/JavaScript import statements. This means it understands dependencies even in non-npm-linked scenarios:
# nx.json
{
"$schema": "./node_modules/nx/schemas/nx-schema.json",
"namedInputs": {
"default": ["{projectRoot}/**/*", "sharedGlobals"],
"production": [
"default",
"!{projectRoot}/**/*.spec.ts",
"!{projectRoot}/jest.config.ts"
]
},
"targetDefaults": {
"build": {
"dependsOn": ["^build"],
"inputs": ["production", "^production"],
"cache": true
},
"test": {
"inputs": ["default", "^production", "{workspaceRoot}/jest.preset.js"],
"cache": true
}
}
}
Run npx nx graph to open an interactive visualization of your entire monorepo's dependency graph. This is invaluable for understanding the impact of changes:
# Show what's affected by changes since main branch
npx nx affected:graph
# Run only tests for affected projects
npx nx affected --target=test --base=main
# Run specific project's build
npx nx build my-app
# Run with task runner parallelism
npx nx run-many --target=build --all --parallel=4
Nx's code generators (formerly called "schematics") are one of its most powerful differentiators. They automate creating consistent new apps, libraries, and components:
# Add a new React app to the monorepo
npx nx g @nx/react:app my-new-app
# Add a shared library
npx nx g @nx/react:lib ui-components --directory=shared
# Add a component to a library
npx nx g @nx/react:component Button --project=shared-ui-components
# Generate a Node.js API
npx nx g @nx/node:app my-api --framework=express
Each generator creates files, updates imports, and registers the project in the workspace — ensuring consistent structure across your entire codebase without manual copy-pasting.
Nx Cloud (the paid cloud offering) provides remote caching similar to Turborepo, plus Distributed Task Execution (DTE) — a unique feature that distributes CI tasks across multiple agents in parallel:
// In CI: distribute the affected test suites across 5 parallel agents
npx nx-cloud start-ci-run --distribute-on="5 linux-medium-js"
npx nx affected --target=test
npx nx-cloud stop-all-agents
Distributing tasks this way can cut a 30-minute CI run down to a few minutes. The free tier of Nx Cloud covers most small-to-medium teams.
Nx is best for large enterprise monorepos, teams that want strong conventions enforced by tooling, and mixed-technology workspaces (e.g., Angular frontends + Node.js backends + shared TypeScript libraries). Its generator ecosystem and migration tools ("Nx migrate") make upgrading framework versions (React 18 → 19, Next.js 14 → 15) much less painful.
Configuration and onboarding differ sharply. Turborepo needs only a single turbo.json with task definitions; new developers are productive in an hour. Lerna gets full caching by enabling useNx: true; standalone Lerna has basic task hashing.

If you're converting an existing multi-repo setup to a monorepo, or migrating between tools:

- Turborepo: add a turbo.json, define the task pipeline, run turbo build. Usually done in an afternoon.
- Nx: run npx nx@latest init in an existing monorepo — Nx's init command auto-detects projects and generates configuration. Full migration can take days for large repos.
- Lerna: run npx lerna init in a workspace root. The primary migration concern is configuring your versioning/publishing strategy.

Choose Turborepo if you want speed and minimal configuration for an application-focused monorepo, especially one deployed on Vercel.

Choose Nx if you run a large or mixed-technology monorepo, want strong conventions enforced by tooling, or need features like code generators and distributed task execution.

Choose Lerna if your priority is coordinated versioning and publishing of multiple packages to npm.
The monorepo tooling ecosystem has matured rapidly. Turborepo wins on simplicity and developer experience for application monorepos. Nx wins on features, scalability, and enterprise-grade tooling. Lerna wins specifically for publishing-focused open-source library repositories.
Notably, these tools aren't mutually exclusive — modern Lerna uses Nx internally, and you can run Turborepo alongside other tools. The most important choice is picking one and using it consistently across your team.
2026-03-25 13:10:40
The MD5 vs SHA-256 debate has been mostly settled in security circles for years — but that doesn't stop the question from coming up constantly in real projects. Developers reach for MD5 out of habit, encounter it in legacy codebases, or genuinely wonder whether it's "good enough" for a non-security use case. This guide gives you a clear technical comparison, explains why MD5 is broken for security purposes, and maps out exactly when each algorithm — or a stronger alternative — belongs in your stack.
MD5 (Message-Digest Algorithm 5) was designed by Ron Rivest in 1991. It processes input in 512-bit blocks and produces a fixed 128-bit (16-byte) hash, typically displayed as a 32-character hex string.
// Node.js
const crypto = require('crypto');
const md5 = crypto.createHash('md5').update('hello').digest('hex');
console.log(md5); // '5d41402abc4b2a76b9719d911017c592'
The algorithm runs four rounds of bitwise operations over each block, mixing the data through 64 transformation steps. It was designed for speed, and it achieves it — MD5 can hash several gigabytes per second on modern hardware.
SHA-256 is a member of the SHA-2 family, standardised by NIST in 2001. It produces a 256-bit (32-byte) digest — displayed as a 64-character hex string — and processes input in 512-bit blocks through 64 rounds of compression using a mix of bitwise operations, modular additions, and logical functions drawn from the first 64 prime numbers.
// Node.js
const sha256 = crypto.createHash('sha256').update('hello').digest('hex');
console.log(sha256);
// '2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824'
The longer digest and more complex round function make SHA-256 significantly harder to reverse or find collisions in.
MD5 is cryptographically broken in two fundamental ways:
A collision is when two different inputs produce the same hash output. In 2004, researchers demonstrated practical MD5 collisions. By 2008, researchers had created a rogue Certificate Authority certificate by exploiting MD5 collisions in SSL certificates. Today, generating an MD5 collision takes seconds on consumer hardware using tools like HashClash.
// These two different inputs produce the same MD5 hash
// (real example from the 2004 Wang et al. paper — truncated for illustration)
// Input A: d131dd02c5e6eec4...
// Input B: d131dd02c5e6eec5...
// Both hash to: 79054025255fb1a26e4bc422aef54eb4
While full preimage attacks on MD5 aren't yet practical for arbitrary inputs, theoretical weaknesses have been demonstrated, and the small 128-bit output space means brute-force attacks are increasingly feasible with GPU clusters. Rainbow tables covering huge portions of common MD5 hashes exist publicly and can crack unsalted passwords in milliseconds.
As of 2026, no practical collision attacks against SHA-256 exist. NIST continues to recommend SHA-256 as a general-purpose cryptographic hash. SHA-256 provides 128 bits of collision resistance (birthday bound) and 256 bits of preimage resistance — both well beyond current computational capabilities.
MD5 is genuinely faster than SHA-256 when both are computed in pure software. However, on modern hardware with SHA-NI instructions (available on most x86-64 processors since ~2017), SHA-256 throughput is competitive with MD5. The performance gap that once justified MD5 in non-security contexts has largely closed.
MD5 remains appropriate in a narrow set of non-security scenarios: detecting accidental corruption in file transfers, generating cache keys or ETags, deduplicating content, and partitioning data by hash — anywhere no adversary is trying to forge a matching digest.
The key test: could an adversary benefit from a collision or preimage attack here? If yes, MD5 is off the table.
For anything where an adversary matters, use SHA-256. A common example is HMAC-SHA256 for signing API requests:
// HMAC-SHA256 for API signing
const hmac = crypto.createHmac('sha256', process.env.SECRET_KEY);
hmac.update(requestBody);
const signature = hmac.digest('hex');
SHA-512 produces a 512-bit digest and offers a larger security margin, but SHA-256 is sufficient for nearly all current applications. In software implementations on 64-bit platforms, SHA-512 is actually faster than SHA-256 thanks to its wider internal word size (though SHA-NI hardware acceleration favours SHA-256), making it a reasonable choice for hashing large files. Both are considered secure in 2026.
Use the DevPlaybook Hash Generator to compute MD5, SHA-256, SHA-512, and other digests side by side. It's useful for verifying file checksums, testing HMAC signatures, and understanding how digest length varies across algorithms.
Want these tools available offline? The DevToolkit Bundle ($9 on Gumroad) packages 40+ developer tools into a single downloadable kit — no internet required.
2026-03-25 13:10:29
Asynchronous programming is at the heart of JavaScript. Whether you're fetching data from an API, reading files, or querying a database, your code needs to handle operations that take time without blocking the rest of the program. Promises and async/await are the modern, clean way to do this — and understanding them deeply will make you a significantly better JavaScript developer.
Before Promises, JavaScript used callbacks. They work fine for simple cases, but nesting them creates deeply indented, hard-to-read code:
// Callback hell — don't do this
getData(function(err, data) {
if (err) return handleError(err);
processData(data, function(err, result) {
if (err) return handleError(err);
saveResult(result, function(err, saved) {
if (err) return handleError(err);
notifyUser(saved, function(err) {
if (err) return handleError(err);
console.log('Done!');
});
});
});
});
Promises were introduced in ES2015 to solve this. Async/await, added in ES2017, is syntactic sugar on top of Promises that makes asynchronous code look synchronous.
A Promise is an object representing the eventual completion or failure of an asynchronous operation. It has three states: pending (the initial state, neither fulfilled nor rejected), fulfilled (the operation succeeded), and rejected (the operation failed).
Once a promise settles (fulfills or rejects), it cannot change state.
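The settle-once rule means any later resolve or reject calls are silently ignored:

```javascript
const p = new Promise((resolve, reject) => {
  resolve('first');
  resolve('second');             // ignored — already fulfilled
  reject(new Error('too late')); // also ignored
});

p.then(value => {
  console.log(value); // 'first'
});
```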
const myPromise = new Promise((resolve, reject) => {
// Simulate async work
setTimeout(() => {
const success = Math.random() > 0.5;
if (success) {
resolve('Operation succeeded!');
} else {
reject(new Error('Operation failed'));
}
}, 1000);
});
The function passed to new Promise() is called the executor. It runs immediately and receives two functions: resolve (call when done) and reject (call on failure).
myPromise
.then(result => {
console.log('Success:', result);
return result.toUpperCase(); // return value is passed to next .then()
})
.then(upperResult => {
console.log('Transformed:', upperResult);
})
.catch(error => {
console.error('Error:', error.message);
})
.finally(() => {
console.log('This runs regardless of success or failure');
});
Key rules for promise chaining:

- Every .then() returns a new promise
- The return value of a .then() callback becomes the resolved value of the next promise
- If you return a promise from .then(), the chain waits for it to settle
- A single .catch() at the end handles errors from any step in the chain

// Fetching user data then their posts
fetch('/api/users/1')
.then(response => {
if (!response.ok) {
throw new Error(`HTTP error: ${response.status}`);
}
return response.json();
})
.then(user => {
console.log('User:', user.name);
return fetch(`/api/users/${user.id}/posts`);
})
.then(response => response.json())
.then(posts => {
console.log(`Found ${posts.length} posts`);
})
.catch(error => {
console.error('Failed:', error.message);
});
Async/await lets you write promise-based code that looks and reads like synchronous code. Under the hood, it's still promises — just with cleaner syntax.
Adding async before a function makes it return a Promise automatically. Even if you return a plain value, it gets wrapped in Promise.resolve().
async function greet() {
return 'Hello!';
}
// These are equivalent:
greet().then(msg => console.log(msg)); // 'Hello!'
Promise.resolve('Hello!').then(msg => console.log(msg));
await pauses execution of the async function until the promise settles. It can only be used inside an async function (or at the top level in ES modules).
async function fetchUser(id) {
const response = await fetch(`/api/users/${id}`);
if (!response.ok) {
throw new Error(`HTTP error: ${response.status}`);
}
const user = await response.json();
return user;
}
// Call it
fetchUser(1)
.then(user => console.log(user.name))
.catch(err => console.error(err.message));
// Or use await at the top level (ES modules / Node.js 14.8+)
const user = await fetchUser(1);
console.log(user.name);
The same chain from earlier, rewritten with async/await:
async function loadUserPosts(userId) {
const userResponse = await fetch(`/api/users/${userId}`);
if (!userResponse.ok) throw new Error(`HTTP error: ${userResponse.status}`);
const user = await userResponse.json();
console.log('User:', user.name);
const postsResponse = await fetch(`/api/users/${user.id}/posts`);
const posts = await postsResponse.json();
console.log(`Found ${posts.length} posts`);
return posts;
}
async function safeFetch(url) {
try {
const response = await fetch(url);
if (!response.ok) {
throw new Error(`HTTP ${response.status}: ${response.statusText}`);
}
return await response.json();
} catch (error) {
if (error.name === 'TypeError') {
// Network error or invalid URL
console.error('Network error:', error.message);
} else {
// HTTP error
console.error('Request failed:', error.message);
}
throw error; // Re-throw so callers can handle it
}
}
// Using the function
async function main() {
try {
const data = await safeFetch('/api/data');
console.log(data);
} catch (error) {
// Handle at the call site
showErrorToUser(error.message);
}
}
When you need to run multiple async operations, JavaScript provides several static methods on the Promise class.
Runs all promises in parallel. Resolves when all of them succeed, or rejects immediately if any of them fail.
async function loadDashboard(userId) {
// All three requests run in parallel — much faster than sequential
const [user, posts, notifications] = await Promise.all([
fetch(`/api/users/${userId}`).then(r => r.json()),
fetch(`/api/users/${userId}/posts`).then(r => r.json()),
fetch(`/api/users/${userId}/notifications`).then(r => r.json()),
]);
return { user, posts, notifications };
}
// Without Promise.all (sequential — 3x slower)
async function loadDashboardSlow(userId) {
const user = await fetch(`/api/users/${userId}`).then(r => r.json());
const posts = await fetch(`/api/users/${userId}/posts`).then(r => r.json());
const notifications = await fetch(`/api/users/${userId}/notifications`).then(r => r.json());
return { user, posts, notifications };
}
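To make the all-or-nothing semantics concrete, here is a toy re-implementation of Promise.all. It is illustrative only, but it captures the two key behaviors: results keep input order, and the first rejection wins:

```javascript
// Toy Promise.all: resolves with results in input order,
// rejects as soon as any input rejects.
function promiseAll(items) {
  return new Promise((resolve, reject) => {
    const results = new Array(items.length);
    let remaining = items.length;
    if (remaining === 0) return resolve(results);
    items.forEach((item, i) => {
      Promise.resolve(item).then(value => {
        results[i] = value;            // keep input order, not finish order
        if (--remaining === 0) resolve(results);
      }, reject);                      // first rejection wins
    });
  });
}

promiseAll([Promise.resolve(1), 2, Promise.resolve(3)])
  .then(values => console.log(values)); // [1, 2, 3]
```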
Like Promise.all(), but waits for all promises to settle (succeed or fail) and never rejects. Returns an array of result objects.
const results = await Promise.allSettled([
fetch('/api/endpoint-1').then(r => r.json()),
fetch('/api/endpoint-2').then(r => r.json()),
fetch('/api/endpoint-that-might-fail').then(r => r.json()),
]);
results.forEach((result, index) => {
if (result.status === 'fulfilled') {
console.log(`Request ${index} succeeded:`, result.value);
} else {
console.error(`Request ${index} failed:`, result.reason.message);
}
});
Resolves or rejects as soon as the first promise settles. Useful for timeouts.
function fetchWithTimeout(url, timeoutMs) {
const fetchPromise = fetch(url).then(r => r.json());
const timeoutPromise = new Promise((_, reject) =>
setTimeout(() => reject(new Error('Request timed out')), timeoutMs)
);
return Promise.race([fetchPromise, timeoutPromise]);
}
try {
const data = await fetchWithTimeout('/api/slow-endpoint', 5000);
console.log(data);
} catch (err) {
console.error(err.message); // 'Request timed out' or fetch error
}
Resolves with the first fulfilled promise (ignores rejections unless all fail). Useful for trying multiple sources.
// Try multiple CDN mirrors, use the first that responds
const data = await Promise.any([
fetch('https://cdn1.example.com/data.json').then(r => r.json()),
fetch('https://cdn2.example.com/data.json').then(r => r.json()),
fetch('https://cdn3.example.com/data.json').then(r => r.json()),
]);
console.log('Got data from fastest source:', data);
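When every input rejects, Promise.any rejects with an AggregateError whose errors array collects the individual failures:

```javascript
Promise.any([
  Promise.reject(new Error('mirror 1 down')),
  Promise.reject(new Error('mirror 2 down')),
]).catch(err => {
  console.log(err instanceof AggregateError); // true
  console.log(err.errors.length);             // 2
  console.log(err.errors[0].message);         // 'mirror 1 down'
});
```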
const userIds = [1, 2, 3, 4, 5];
// ❌ Sequential — each request waits for the previous
for (const id of userIds) {
const user = await fetchUser(id); // waits 1s × 5 = 5s total
console.log(user.name);
}
// ✅ Parallel — all requests fire at once
const users = await Promise.all(userIds.map(id => fetchUser(id))); // ~1s total
users.forEach(user => console.log(user.name));
// ❌ Bug: not awaiting — data will be a Promise, not the value
async function getData() {
const data = fetchUser(1); // missing await!
console.log(data); // Promise { <pending> }
return data;
}
// ✅ Correct
async function getData() {
const data = await fetchUser(1);
console.log(data); // { id: 1, name: 'Alice', ... }
return data;
}
// ❌ Swallowing errors silently
async function bad() {
try {
return await riskyOperation();
} catch (err) {
// Silent! Caller doesn't know it failed
}
}
// ✅ Handle what you can, re-throw the rest
async function good() {
try {
return await riskyOperation();
} catch (err) {
logError(err); // log for observability
throw err; // re-throw so caller can react
}
}
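A related pattern: use try/finally (or `.finally`) so cleanup runs whether the operation succeeds or throws. The `withResource` helper below is a sketch of ours, not a standard API:

```javascript
// Acquire a resource, run a task with it, and always release it —
// even when the task throws.
async function withResource(acquire, release, task) {
  const resource = await acquire();
  try {
    return await task(resource);
  } finally {
    await release(resource); // runs on both success and failure
  }
}
```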
// CommonJS or older environments without top-level await
(async () => {
try {
const data = await fetchSomething();
console.log(data);
} catch (err) {
console.error(err);
process.exit(1);
}
})();
const { promisify } = require('util');
const fs = require('fs');
const readFile = promisify(fs.readFile);
// Now you can use it with async/await
async function readConfig() {
const content = await readFile('./config.json', 'utf8');
return JSON.parse(content);
}
// Or wrap a callback-style API in a promise by hand
function delay(ms) {
return new Promise(resolve => setTimeout(resolve, ms));
}
async function main() {
console.log('Starting...');
await delay(1000);
console.log('1 second later');
}
For async generators and streams, for await...of lets you iterate over async iterables cleanly.
// Reading a stream line by line
async function processLargeFile(filename) {
const { createReadStream } = require('fs');
const { createInterface } = require('readline');
const stream = createReadStream(filename);
const rl = createInterface({ input: stream });
for await (const line of rl) {
processLine(line);
}
}
// Async generator
async function* generateSequence(start, end) {
for (let i = start; i <= end; i++) {
await delay(100); // simulate async work
yield i;
}
}
for await (const num of generateSequence(1, 5)) {
console.log(num); // 1, 2, 3, 4, 5 (with delays)
}
// Create
new Promise((resolve, reject) => { ... })
Promise.resolve(value)
Promise.reject(error)
// Consume
promise.then(onFulfilled, onRejected)
promise.catch(onRejected)
promise.finally(onFinally)
// Combinators
Promise.all([p1, p2, p3]) // all must succeed
Promise.allSettled([p1, p2, p3]) // wait for all, never rejects
Promise.race([p1, p2, p3]) // first to settle wins
Promise.any([p1, p2, p3]) // first fulfilled wins
// Async/await
async function fn() { ... } // returns a Promise
const result = await promise; // pause until resolved
try { await fn() } catch (err) {} // error handling