
Caching Strategies Simulator

Learn caching fundamentals with an interactive simulator. Visualize cache hits, misses, eviction policies (LRU, LFU, FIFO), and understand write strategies.

[Interactive simulator: pick an eviction policy (e.g. LRU, which removes the item accessed longest ago) and tap items to request them. Requests served from the cache take 1-10ms; misses fall through to the database at 50-200ms. A stats panel tracks hits, misses, hit rate, and total time, and the cache view shows which items are currently cached.]

Why Use Caching?

  • Without cache: every request means a slow database call.
  • With cache: repeat requests are served almost instantly.
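The with/without contrast above is the classic cache-aside pattern: check the cache first, and only query the database on a miss. A minimal sketch in Python, with `time.sleep` standing in for database latency (the `db`, `cache`, and `get_user` names are illustrative, not from the simulator):

```python
import time

db = {"user:1": "Ada"}   # stand-in for a slow backing store
cache = {}

def get_user(key):
    """Cache-aside read: check the cache first, fall back to the database."""
    if key in cache:
        return cache[key]        # hit: served from memory
    time.sleep(0.05)             # simulate a ~50ms database query
    value = db[key]
    cache[key] = value           # populate the cache for next time
    return value

start = time.perf_counter()
get_user("user:1")               # miss: pays the database latency
first = time.perf_counter() - start

start = time.perf_counter()
get_user("user:1")               # hit: no database call
second = time.perf_counter() - start
```

The second call skips the simulated database round trip entirely, which is exactly the "repeat requests are instant" effect the simulator visualizes.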

The eviction policy decides what to remove when cache is full.

Understanding Caching Strategies

Eviction Policies

  • LRU (Least Recently Used): Evicts items not accessed recently. Most common in practice.
  • LFU (Least Frequently Used): Evicts items accessed least often. Great for identifying hot data.
  • FIFO (First In, First Out): Simple queue-based approach, evicts oldest items first.
  • TTL (Time To Live): Evicts items based on expiration time. Common for sessions.
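LRU, the most common of these policies, can be sketched in a few lines using Python's `collections.OrderedDict`, which remembers insertion order and lets us move a key to the end on every access (a toy sketch; production caches like Redis use approximated LRU for efficiency):

```python
from collections import OrderedDict

class LRUCache:
    """Evicts the least recently used key when capacity is exceeded."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None                    # miss
        self.data.move_to_end(key)         # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict the least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")       # "a" is now most recently used
cache.put("c", 3)    # cache is full, so "b" (least recently used) is evicted
```

Swapping the eviction line is all it takes to get FIFO (evict in insertion order, i.e. never call `move_to_end`) or LFU (track an access count per key and evict the minimum).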

Write Strategies

  • Write-Through: Writes to cache and database simultaneously. Strong consistency but higher latency.
  • Write-Back: Writes to cache first, async to database. Better performance but risk of data loss.
  • Write-Around: Writes directly to database, bypasses cache. Reduces cache pollution.
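The trade-off between the first two write strategies shows up clearly in code. A minimal sketch, assuming a dict-like `db` as the backing store (class and method names are illustrative):

```python
class WriteThroughCache:
    """Write-through: update cache and database in the same call."""
    def __init__(self, db):
        self.db = db
        self.cache = {}

    def write(self, key, value):
        self.cache[key] = value
        self.db[key] = value      # synchronous: consistent, but adds latency

class WriteBackCache:
    """Write-back: update the cache now, persist to the database later."""
    def __init__(self, db):
        self.db = db
        self.cache = {}
        self.dirty = set()        # keys not yet persisted

    def write(self, key, value):
        self.cache[key] = value
        self.dirty.add(key)       # fast, but lost if we crash before flush()

    def flush(self):
        for key in self.dirty:
            self.db[key] = self.cache[key]
        self.dirty.clear()
```

With write-through, the database is up to date the moment `write` returns; with write-back, reads of `db` can see stale data until `flush` runs, which is precisely the data-loss window the bullet above warns about.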

Key Concepts

  • Hit rate: Percentage of requests served from cache (higher is better).
  • Cache size: Balance between memory usage and hit rate.
  • Hot data: Frequently accessed items that benefit most from caching.
  • Cache invalidation: One of the hardest problems in computer science.
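Hit rate, cache size, and hot data interact: a skewed workload lets even a tiny cache achieve a high hit rate. A small simulation sketch (the workload shape and FIFO policy are assumptions for illustration):

```python
import random

def hit_rate(requests, cache_size):
    """Hit rate of a simple FIFO cache of the given size over a request stream."""
    cache = []
    hits = 0
    for key in requests:
        if key in cache:
            hits += 1
        else:
            cache.append(key)
            if len(cache) > cache_size:
                cache.pop(0)  # FIFO eviction: drop the oldest entry
    return hits / len(requests)

random.seed(42)
# Skewed workload: 80% of requests go to 2 hot keys,
# the rest are spread over 50 cold keys.
hot = ["h1", "h2"]
cold = [f"c{i}" for i in range(50)]
stream = [random.choice(hot) if random.random() < 0.8 else random.choice(cold)
          for _ in range(1000)]

rate = hit_rate(stream, 4)
print(f"hit rate with a 4-slot cache: {rate:.0%}")
```

Because the two hot keys dominate the stream, a cache holding just 4 of the 52 possible keys still serves well over half the requests from memory.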
