
Redis Caching Patterns: A Complete Guide

Learn essential Redis caching patterns including cache-aside, write-through, and write-behind strategies with code examples and best practices.

Eric Morris
April 3, 2026
redis, caching, performance, patterns

Caching is one of the most effective ways to improve application performance. In this comprehensive guide, we'll explore the most important Redis caching patterns and when to use each one.

Why Caching Matters

Before diving into specific patterns, let's understand why caching is critical for modern applications:

  • Reduced latency: Cache hits return data in microseconds instead of milliseconds
  • Lower database load: Fewer queries to your primary database
  • Cost savings: Reduce expensive database reads
  • Better user experience: Faster page loads and API responses

Cache-Aside Pattern

The cache-aside (also called lazy loading) pattern is the most common caching strategy. The application code is responsible for loading data into the cache.

How It Works

  1. Application checks the cache for data
  2. If data exists (cache hit), return it
  3. If data doesn't exist (cache miss), fetch from database
  4. Store the data in cache for future requests
  5. Return the data to the application

Implementation

async function getUser(userId: string) {
  const cacheKey = `user:${userId}`;

  // Try to get from cache
  const cached = await redis.get(cacheKey);
  if (cached) {
    return JSON.parse(cached);
  }

  // Cache miss - fetch from database
  const user = await db.users.findById(userId);
  if (!user) {
    return null; // don't cache lookups for nonexistent users
  }

  // Store in cache with a 1-hour TTL
  await redis.setex(cacheKey, 3600, JSON.stringify(user));

  return user;
}

When to Use

  • Read-heavy workloads: When data is read far more often than written
  • Expensive queries: When database queries are slow or complex
  • Infrequent updates: When data doesn't change often

Pros and Cons

Pros:

  • Simple to implement
  • Cache only contains requested data
  • Failure-tolerant (cache failures don't break the app)

Cons:

  • Cache miss penalty (first request is slower)
  • Stale data possible if cache isn't invalidated
  • Requires cache invalidation logic
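The invalidation logic mentioned above is often as simple as deleting the key whenever the underlying row changes, so the next read repopulates it. A minimal sketch, using an in-memory `Map` to stand in for the Redis client so it runs standalone (the `saveUserAndInvalidate` helper and key scheme mirror the article's examples but are illustrative):

```typescript
// Stand-in for the Redis client so this sketch runs without a server.
const cache = new Map<string, string>();

// Hypothetical write path that pairs with cache-aside reads:
// persist the change, then delete (not update) the cached copy.
function saveUserAndInvalidate(userId: string, user: object): void {
  // 1. Persist to the source of truth (db.users.update in the article).
  // await db.users.update(userId, user);

  // 2. Delete the cached entry. Deletion is idempotent and avoids
  //    racing with concurrent readers holding a stale value.
  cache.delete(`user:${userId}`);
}

cache.set('user:42', JSON.stringify({ name: 'old' }));
saveUserAndInvalidate('42', { name: 'new' });
console.log(cache.has('user:42')); // false - next read repopulates
```

Deleting rather than overwriting keeps the write path simple: the cache-aside read path already knows how to rebuild the entry.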

Write-Through Pattern

In the write-through pattern, data is written to the database and the cache synchronously, as part of the same write path, so the cache never lags behind the source of truth. (In the textbook version the cache sits in front of the database; with a sideline cache like Redis, the application performs both writes itself.)

How It Works

  1. Application writes data to the database
  2. Application updates the cache in the same request
  3. The write is confirmed only after both succeed
  4. Subsequent reads are served from the fresh cache entry

Implementation

async function updateUser(userId: string, data: UserUpdate) {
  const cacheKey = `user:${userId}`;

  // Update database
  const user = await db.users.update(userId, data);

  // Update cache immediately
  await redis.setex(cacheKey, 3600, JSON.stringify(user));

  return user;
}

When to Use

  • Write-heavy workloads: When data changes frequently
  • Consistency requirements: When cache must always be fresh
  • Simple data models: When updates are straightforward

Pros and Cons

Pros:

  • Cache is always up-to-date
  • No cache invalidation logic needed
  • Read latency is always low

Cons:

  • Write latency is higher
  • Cache contains unused data
  • More complex implementation

Write-Behind Pattern

The write-behind (write-back) pattern writes data to cache immediately and to the database asynchronously.

How It Works

  1. Application writes data to the cache
  2. The cache write is confirmed immediately
  3. A background worker asynchronously persists the data to the database
  4. The worker can batch writes and retry failures

Implementation

async function updateMetrics(userId: string, metrics: Metrics) {
  const cacheKey = `metrics:${userId}`;

  // Write to cache immediately so recent metrics are readable right away
  await redis.lpush(cacheKey, JSON.stringify(metrics));

  // Queue the database write; a background worker drains it asynchronously
  await queue.add('save-metrics', { userId, metrics });

  return { success: true };
}

// Background worker
queue.process('save-metrics', async (job) => {
  const { userId, metrics } = job.data;
  await db.metrics.insert({ userId, ...metrics });
});

When to Use

  • High write throughput: When you need to handle many writes
  • Batch operations: When writes can be batched
  • Analytics/metrics: When losing some writes is acceptable

Pros and Cons

Pros:

  • Very low write latency
  • Can batch writes for efficiency
  • Reduces database load

Cons:

  • Risk of data loss if cache fails
  • Eventual consistency only
  • Complex error handling

Best Practices

1. Set Appropriate TTLs

// Short TTL for frequently changing data
await redis.setex('prices:BTC', 60, price);

// Long TTL for stable data (key includes the user id)
await redis.setex(`user:${userId}:profile`, 86400, JSON.stringify(profile));

// No TTL for data that is explicitly invalidated on change
await redis.set('config:app', JSON.stringify(config));

2. Handle Cache Stampede

When a popular cache key expires, many requests might hit the database simultaneously. Use locking to prevent this:

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function getUserSafe(userId: string) {
  const cacheKey = `user:${userId}`;
  const lockKey = `lock:${cacheKey}`;

  const cached = await redis.get(cacheKey);
  if (cached) return JSON.parse(cached);

  // Try to acquire lock (NX = only if not held, EX = 10s safety TTL)
  const locked = await redis.set(lockKey, '1', 'EX', 10, 'NX');

  if (locked) {
    try {
      const user = await db.users.findById(userId);
      await redis.setex(cacheKey, 3600, JSON.stringify(user));
      return user;
    } finally {
      await redis.del(lockKey);
    }
  } else {
    // Another request is rebuilding the cache - wait briefly and retry
    // (production code would cap the number of retries)
    await sleep(100);
    return getUserSafe(userId);
  }
}
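A complementary mitigation is to add random jitter to TTLs so that hot keys written at the same time don't all expire in lockstep. A small helper (the base/jitter split is an assumption, not part of the article's API):

```typescript
// Spread expirations: base TTL plus up to jitterSeconds extra seconds,
// so identical keys written together don't all expire together.
function ttlWithJitter(baseSeconds: number, jitterSeconds: number): number {
  return baseSeconds + Math.floor(Math.random() * jitterSeconds);
}

const ttl = ttlWithJitter(3600, 300); // pass to redis.setex(key, ttl, value)
console.log(ttl >= 3600 && ttl < 3900); // true
```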

3. Monitor Cache Hit Rates

Track your cache effectiveness:

async function getCachedData(key: string) {
  const value = await redis.get(key);

  if (value) {
    await metrics.increment('cache.hits');
    return value;
  } else {
    await metrics.increment('cache.misses');
    return null;
  }
}
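From those two counters, the hit rate is simply hits / (hits + misses); tracking it over time tells you whether your keys and TTLs are pulling their weight. A minimal calculation (guarding the zero-traffic case):

```typescript
// Derive the hit rate from the counters tracked above.
function hitRate(hits: number, misses: number): number {
  const total = hits + misses;
  return total === 0 ? 0 : hits / total;
}

console.log(hitRate(900, 100)); // 0.9
```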

Conclusion

Choosing the right caching pattern depends on your specific use case:

  • Use cache-aside for most read-heavy applications
  • Use write-through when consistency is critical
  • Use write-behind for high write throughput

Start with cache-aside and evolve your strategy as your application grows. With SwiftCache, implementing any of these patterns is straightforward and reliable.

Ready to implement caching in your application? Start with SwiftCache today.


Eric Morris

Part of the SwiftCache engineering team, passionate about distributed systems and performance optimization.