Redis Caching Strategies

backend · TypeScript · optimization · strict_senior · Remix

Implement effective caching patterns with Redis for performance optimization.

12/8/2025

Prompt

Redis Caching Implementation

Implement Redis caching for [Application] to improve performance and reduce database load.

Requirements

1. Caching Strategy

Choose and implement appropriate patterns:

  • Cache-Aside (Lazy Loading) - For read-heavy data
  • Write-Through - For data that must stay synchronized
  • Write-Behind - For high-write scenarios
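As one possible sketch of the write-through pattern above: the cache is updated in the same code path as the database write, so readers never observe stale data. `RedisLike` is a minimal stand-in for the subset of the ioredis client this uses, and `saveToDb` is a hypothetical persistence callback — both are assumptions, not part of the original template.

```typescript
// Write-through sketch: persist first, then cache the saved value.
// `RedisLike` mirrors the one ioredis method used here.
interface RedisLike {
  setex(key: string, seconds: number, value: string): Promise<unknown>
}

async function writeThrough<T>(
  redis: RedisLike,
  saveToDb: (value: T) => Promise<T>, // hypothetical persistence layer
  key: string,
  value: T,
  ttlSeconds: number
): Promise<T> {
  const saved = await saveToDb(value)                        // 1. write to the database
  await redis.setex(key, ttlSeconds, JSON.stringify(saved))  // 2. then refresh the cache
  return saved
}
```

Injecting the client as a parameter keeps the sketch testable against an in-memory stub.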

2. Cache Structure

Implement caching for:

  • [Data Type 1] - Individual records (e.g., user profiles)
  • [Data Type 2] - Collections/lists (e.g., recent items)
  • [Data Type 3] - Sets (e.g., followers, tags)
  • [Data Type 4] - Sorted sets (e.g., leaderboards)

3. Key Patterns to Implement

Single Value Caching

  • Use namespaced keys: entity:id format
  • Set appropriate TTL (Time To Live)
  • Handle cache misses gracefully
  • Serialize complex objects to JSON
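The four points above can be sketched together: a key helper enforcing the entity:id convention, plus a cache-aside read that serializes to JSON and tolerates misses. `fetchFromDb` is a hypothetical loader; the `RedisLike` interface is an assumed minimal slice of ioredis.

```typescript
interface RedisLike {
  get(key: string): Promise<string | null>
  setex(key: string, seconds: number, value: string): Promise<unknown>
}

// Namespaced key in entity:id format, e.g. cacheKey('user', '42') -> 'user:42'
function cacheKey(entity: string, id: string): string {
  return `${entity}:${id}`
}

async function getCached<T>(
  redis: RedisLike,
  entity: string,
  id: string,
  ttlSeconds: number,
  fetchFromDb: (id: string) => Promise<T | null> // hypothetical loader
): Promise<T | null> {
  const key = cacheKey(entity, id)
  const hit = await redis.get(key)
  if (hit !== null) return JSON.parse(hit) as T  // cache hit

  const data = await fetchFromDb(id)             // cache miss: go to the database
  if (data !== null) {
    // Only cache real records, so a missing row isn't stored as "null"
    await redis.setex(key, ttlSeconds, JSON.stringify(data))
  }
  return data
}
```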

List Caching

  • Use Redis Lists for ordered collections
  • Implement pagination
  • Limit list size with LTRIM
  • Use for recent items, activity feeds
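A possible shape for the list points above: LPUSH newest-first, LTRIM to cap the list, LRANGE for pagination. The `MAX_FEED` cap and key names are illustrative assumptions.

```typescript
interface RedisLike {
  lpush(key: string, value: string): Promise<number>
  ltrim(key: string, start: number, stop: number): Promise<unknown>
  lrange(key: string, start: number, stop: number): Promise<string[]>
}

const MAX_FEED = 100 // hypothetical cap on feed length

async function pushRecent(redis: RedisLike, key: string, item: object) {
  await redis.lpush(key, JSON.stringify(item)) // newest item at the head
  await redis.ltrim(key, 0, MAX_FEED - 1)      // drop anything beyond the cap
}

// Zero-based pages of `pageSize` items, newest first.
async function getRecentPage(
  redis: RedisLike, key: string, page: number, pageSize: number
): Promise<object[]> {
  const start = page * pageSize
  const raw = await redis.lrange(key, start, start + pageSize - 1)
  return raw.map(s => JSON.parse(s))
}
```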

Set Operations

  • Use Redis Sets for unique collections
  • Implement follower/following relationships
  • Tag systems
  • Set intersection/union for recommendations
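One way the follower and intersection ideas above fit together: SADD records the relationship on both sides, and SINTER finds users followed by both parties. The `following:`/`followers:` key names are hypothetical.

```typescript
interface RedisLike {
  sadd(key: string, ...members: string[]): Promise<number>
  sinter(...keys: string[]): Promise<string[]>
}

// Record both directions of the relationship.
async function follow(redis: RedisLike, follower: string, followee: string) {
  await redis.sadd(`following:${follower}`, followee)
  await redis.sadd(`followers:${followee}`, follower)
}

// Users that both `a` and `b` follow — a simple recommendation signal.
async function mutualFollowing(redis: RedisLike, a: string, b: string): Promise<string[]> {
  return redis.sinter(`following:${a}`, `following:${b}`)
}
```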

Rate Limiting

  • Implement per-user rate limits
  • Use INCR with EXPIRE
  • Sliding window rate limiting
  • Different limits for different endpoints
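The INCR + EXPIRE approach appears in the implementation pattern below; the sliding-window variant mentioned above can be sketched with a sorted set, scoring each request by timestamp and trimming expired entries before counting. Key naming and the unique-member scheme are assumptions.

```typescript
interface RedisLike {
  zadd(key: string, score: number, member: string): Promise<unknown>
  zremrangebyscore(key: string, min: number, max: number): Promise<unknown>
  zcard(key: string): Promise<number>
  expire(key: string, seconds: number): Promise<unknown>
}

async function allowRequest(
  redis: RedisLike,
  userId: string,
  windowMs: number,
  maxRequests: number,
  now: number = Date.now()
): Promise<boolean> {
  const key = `ratelimit:sliding:${userId}`
  await redis.zremrangebyscore(key, 0, now - windowMs) // drop requests outside the window
  const count = await redis.zcard(key)
  if (count >= maxRequests) return false
  await redis.zadd(key, now, `${now}:${Math.random()}`) // unique member per request
  await redis.expire(key, Math.ceil(windowMs / 1000))   // let idle keys expire
  return true
}
```

Per-endpoint limits fall out naturally by folding the endpoint name into the key and parameters.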

4. TTL (Expiration) Strategy

Set expiration times based on data volatility:

  • Frequently changing data: [Short TTL] (e.g., 5-15 minutes)
  • Stable data: [Medium TTL] (e.g., 1-6 hours)
  • Rarely changing data: [Long TTL] (e.g., 24 hours)
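The tiers above can be centralized in one pure helper so TTLs stay consistent across the codebase. The tier names and exact numbers here are illustrative, not prescriptive.

```typescript
type Volatility = 'hot' | 'stable' | 'static' // hypothetical tier names

const TTL_SECONDS: Record<Volatility, number> = {
  hot: 10 * 60,      // frequently changing: 10 minutes
  stable: 3 * 3600,  // stable: 3 hours
  static: 24 * 3600, // rarely changing: 24 hours
}

function ttlFor(volatility: Volatility): number {
  return TTL_SECONDS[volatility]
}
```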

5. Cache Invalidation

Implement:

  • Single key deletion on updates
  • Pattern-based deletion for related data
  • Tag-based invalidation for grouped data
  • Event-driven invalidation on data changes
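Pattern-based deletion can be sketched with SCAN rather than KEYS, so a large keyspace doesn't block the server. The COUNT hint of 100 is an arbitrary choice; `RedisLike` is again an assumed slice of the ioredis client.

```typescript
interface RedisLike {
  scan(cursor: string, matchToken: 'MATCH', pattern: string,
       countToken: 'COUNT', count: number): Promise<[string, string[]]>
  del(...keys: string[]): Promise<number>
}

// Delete every key matching a glob pattern, e.g. 'user:*'.
async function invalidatePattern(redis: RedisLike, pattern: string): Promise<number> {
  let cursor = '0'
  let deleted = 0
  do {
    // SCAN walks the keyspace incrementally; '0' marks a completed iteration.
    const [next, keys] = await redis.scan(cursor, 'MATCH', pattern, 'COUNT', 100)
    if (keys.length > 0) deleted += await redis.del(...keys)
    cursor = next
  } while (cursor !== '0')
  return deleted
}
```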

6. Performance Optimization

Include:

  • Pipeline multiple operations
  • Use connection pooling
  • Monitor cache hit/miss ratios
  • Set memory limits and eviction policies
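Pipelining, the first point above, can be sketched as batching N GETs into one round trip; ioredis-style pipelines return `[error, result]` pairs from `exec()`. (For plain GETs, a single MGET is simpler — pipelines earn their keep when mixing different commands.)

```typescript
interface Pipeline {
  get(key: string): Pipeline
  exec(): Promise<Array<[Error | null, unknown]>>
}

interface RedisLike {
  pipeline(): Pipeline
}

// Fetch many keys in one network round trip; failed reads come back as null.
async function mgetPipelined(redis: RedisLike, keys: string[]): Promise<(string | null)[]> {
  const pipe = redis.pipeline()
  for (const key of keys) pipe.get(key) // queue commands client-side
  const results = await pipe.exec()     // one round trip for the whole batch
  return results.map(([err, value]) => (err ? null : (value as string | null)))
}
```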

Implementation Pattern

import Redis from 'ioredis'

const redis = new Redis(process.env.REDIS_URL)

// Cache-aside pattern
async function get[Entity](id: string) {
  const cacheKey = `[entity]:${id}`
  
  // Try cache first
  const cached = await redis.get(cacheKey)
  if (cached) return JSON.parse(cached)
  
  // Cache miss - fetch from database
  const data = await db.[entity].findUnique({ where: { id } })
  
  // Store in cache (skip missing records so "null" is never cached)
  if (data) {
    await redis.setex(cacheKey, [TTL_SECONDS], JSON.stringify(data))
  }
  
  return data
}

// Rate limiting
async function checkRateLimit(userId: string): Promise<boolean> {
  const key = `ratelimit:${userId}`
  const count = await redis.incr(key)
  
  if (count === 1) {
    await redis.expire(key, [WINDOW_SECONDS])
  }
  
  return count <= [MAX_REQUESTS]
}

// Cache invalidation
async function invalidate[Entity](id: string) {
  await redis.del(`[entity]:${id}`)
  // Invalidate related caches if needed
}

Best Practices

  • Always handle cache failures gracefully
  • Use consistent key naming conventions
  • Monitor cache hit rates
  • Implement cache warming for critical data
  • Use Redis transactions for atomic operations
  • Set appropriate max memory and eviction policies
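The first practice above — handling cache failures gracefully — can be sketched as a small wrapper: if the cache path throws (Redis down, timeout), fall back to the slower source instead of failing the request.

```typescript
// Degrade to the fallback path on any cache error; never surface it to callers.
async function withCacheFallback<T>(
  tryCache: () => Promise<T>,
  fallback: () => Promise<T>
): Promise<T> {
  try {
    return await tryCache()
  } catch {
    // A cache outage should cost latency, not correctness.
    return fallback()
  }
}
```

In practice you would also log the swallowed error so outages show up in monitoring.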

Tags

redis
caching
performance
optimization

Tested Models

gpt-4
claude-3-5-sonnet
