Caching

Caching strategies, cache invalidation patterns, and when to use different caching layers.

3 min read · 2026-03-22 · Tags: easy, caching, performance, redis

What is Caching?

Caching stores frequently accessed data in a fast-access storage layer to reduce the load on slower backend systems. Think of it as a shortcut — instead of computing or fetching data every time, you retrieve it from a nearby, fast store.

Request → Cache Hit? ──Yes──→ Return cached data (fast!)
                   │
                  No
                   │
                   ▼
         Fetch from DB → Store in Cache → Return data

The 80/20 Rule

In most applications, 20% of the data serves 80% of the requests. Caching that 20% can dramatically improve performance.

Caching Layers

From closest to the user to farthest:

  1. Browser Cache — Static assets, API responses
  2. CDN — Geographically distributed static content
  3. Application Cache — Redis/Memcached for computed results
  4. Database Cache — Query cache, buffer pool

Cache Strategies

Read Strategies

Common read strategies are cache-aside (the application checks the cache first and loads from the database on a miss, as in the diagram above) and read-through (the cache itself fetches from the database on a miss, keeping that logic out of the application).

Write Strategies

Common write strategies are write-through (write to the cache and the database together, keeping them consistent at the cost of write latency) and write-behind (write to the cache first and flush to the database asynchronously — faster, but data can be lost on a crash).

Cache Invalidation

The Hardest Problem

Cache invalidation is notoriously difficult. Stale data can cause subtle bugs that are hard to diagnose.

Common Strategies

Strategy            | How it Works                      | Best For
TTL (Time-To-Live)  | Data expires after set time       | Mostly-static data
Event-driven        | Invalidate on write/update events | Real-time systems
Version-based       | Append version to cache key       | API responses
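The version-based strategy can be sketched by embedding a version number in the cache key: bumping the version makes all old entries unreachable, and they simply age out via TTL or eviction. The names below are illustrative, not a real library API:

```python
_cache = {}
_versions = {}  # current version per logical resource

def cache_key(resource, version):
    return f"{resource}:v{version}"

def get_cached(resource):
    v = _versions.get(resource, 1)
    return _cache.get(cache_key(resource, v))

def set_cached(resource, value):
    v = _versions.get(resource, 1)
    _cache[cache_key(resource, v)] = value

def invalidate(resource):
    # Bump the version: entries under the old key are never read
    # again, so no explicit deletion is needed.
    _versions[resource] = _versions.get(resource, 1) + 1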

Cache Eviction Policies

When the cache is full, which items to remove?

  • LRU (Least Recently Used) — Most common, removes the item unused for the longest time
  • LFU (Least Frequently Used) — Removes least-accessed item
  • FIFO (First In, First Out) — Removes oldest item regardless of access
  • Random — Surprisingly effective in some workloads
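An LRU cache is straightforward to sketch in Python with `collections.OrderedDict`, which keeps insertion order and lets you move entries to the end on access:

```python
from collections import OrderedDict

class LRUCache:
    """Tiny LRU cache: evicts the least recently used entry when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self._data = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # evict least recently used
```

For example, in a cache of capacity 2 holding `a` and `b`, reading `a` and then inserting `c` evicts `b`, because `b` is now the least recently used entry.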

Default Choice

When in doubt, use LRU. It works well for most real-world access patterns; Redis, for example, offers an approximated LRU eviction policy (allkeys-lru).
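In Redis, the eviction policy is configured with the `maxmemory-policy` directive (Redis ships with `noeviction` until you opt in). A minimal redis.conf fragment enabling approximated LRU, with an illustrative memory limit:

```
maxmemory 256mb
maxmemory-policy allkeys-lru
```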

Redis vs Memcached

Feature           | Redis                               | Memcached
Data structures   | Rich (strings, lists, sets, hashes) | Simple key-value
Persistence       | Yes (RDB, AOF)                      | No
Replication       | Yes                                 | No
Memory efficiency | Lower                               | Higher
Best for          | Feature-rich caching, pub/sub       | Simple, high-throughput caching

Cache Thundering Herd

When a popular cache key expires, many requests simultaneously hit the database:

Thundering Herd Problem

If 1000 requests arrive for an expired key, all 1000 will query the database simultaneously, potentially causing a cascade failure.

Solutions:

  1. Locking: Only one request fetches from DB, others wait
  2. Stale-while-revalidate: Serve stale data while refreshing in background
  3. Pre-warming: Refresh cache before TTL expires
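The locking approach (solution 1) can be sketched with a per-key lock: the first thread to miss acquires the lock and fetches from the database, while the others block and then reuse the freshly cached value. The cache, lock registry, and loader below are hypothetical stand-ins:

```python
import threading

_cache = {}
_locks = {}
_locks_guard = threading.Lock()

def _lock_for(key):
    # One lock per cache key, created lazily under a global guard.
    with _locks_guard:
        return _locks.setdefault(key, threading.Lock())

def get_or_load(key, loader):
    value = _cache.get(key)
    if value is not None:
        return value
    # Miss: only one caller runs the loader; the rest block here,
    # then find the freshly cached value on the re-check.
    with _lock_for(key):
        value = _cache.get(key)  # re-check after acquiring the lock
        if value is None:
            value = loader()  # the single expensive DB fetch
            _cache[key] = value
        return value
```

The re-check inside the lock is the crucial step: without it, every waiting thread would still hit the database once the lock was released.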
