
Cache

A cache is fast storage, typically in memory (like RAM), used to retrieve frequently accessed data more quickly than from the primary data store.

Terminology
 Capacity
 Cache Hit / Cache Miss
 Cache coherence
 Cache Invalidation and Updates
 TTL – Time to Live
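To make the hit/miss, invalidation, and TTL terms concrete, here is a minimal in-memory sketch (all class and variable names are illustrative, not from any specific library):

```python
import time

class SimpleCache:
    """Toy cache illustrating hits, misses, TTL expiry, and invalidation."""
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}  # key -> (value, expiry_timestamp)

    def get(self, key):
        entry = self.store.get(key)
        if entry is None:
            return None, "miss"            # cache miss: key was never stored
        value, expires_at = entry
        if time.time() >= expires_at:      # TTL elapsed: entry is stale
            del self.store[key]            # cache invalidation
            return None, "miss"
        return value, "hit"                # cache hit

    def put(self, key, value):
        self.store[key] = (value, time.time() + self.ttl)

cache = SimpleCache(ttl_seconds=60)
cache.put("user:1", {"name": "Alice"})
value, status = cache.get("user:1")   # hit: stored and not yet expired
_, status2 = cache.get("user:2")      # miss: never stored
```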
Caching Best Practices
 Validity – cached data must stay consistent with the source of truth
 High Hit Rate – choose keys and TTLs so most reads are served from the cache
 Cache Miss handling – a miss should fall back to the source without overwhelming it
 TTL – pick a time-to-live that balances freshness against hit rate
Feature Estimation
1. ~1 Terabyte of data
2. 50K to 1M QPS (Queries Per Second)
3. ~1 ms latency
4. Eviction Policy
5. ~100% Availability
6. Scalable
Cache Eviction Policies
 LRU – Least Recently Used
 LFU – Least Frequently Used
 FIFO
 LIFO
Cache Write Strategies
 Write Through Cache
 Write Back
 Write Around
1. Write through

The application writes data to the cache, and the cache immediately writes the data
to the database.

This approach ensures data consistency between the cache and the database with
no data loss if the cache crashes.

The downside is the higher latency for write operations.
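A minimal in-memory sketch of write-through, with a dictionary standing in for the real database (names are illustrative):

```python
class WriteThroughCache:
    """Sketch of write-through: every write goes to cache AND database synchronously."""
    def __init__(self):
        self.cache = {}
        self.database = {}  # stand-in for a real DB

    def write(self, key, value):
        self.cache[key] = value      # 1. write to the cache
        self.database[key] = value   # 2. synchronously write to the DB (adds write latency)

    def read(self, key):
        if key in self.cache:            # cache hit
            return self.cache[key]
        value = self.database.get(key)   # miss: fall back to the DB
        if value is not None:
            self.cache[key] = value
        return value

c = WriteThroughCache()
c.write("user:1", "Alice")   # both cache and DB now hold the value
```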

2. Write back

The application writes data to the cache, and the cache asynchronously writes the data to the database.

This is a good strategy for write-heavy workloads, since it ensures lower latency compared to write-through. It also reduces the load on the database and is more tolerant of database failures.

The downside is that data can be lost if the cache crashes before the write is flushed to the database.
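A minimal sketch of write-back, where a flush persists dirty keys later (in a real system the flush would run from a background thread or queue; names here are illustrative):

```python
class WriteBackCache:
    """Sketch of write-back: writes hit only the cache; a later flush persists them."""
    def __init__(self):
        self.cache = {}
        self.database = {}   # stand-in for a real DB
        self.dirty = set()   # keys written to the cache but not yet to the DB

    def write(self, key, value):
        self.cache[key] = value   # fast: no DB round trip
        self.dirty.add(key)       # data is lost if the cache crashes before flush

    def flush(self):
        for key in self.dirty:    # asynchronous in a real system
            self.database[key] = self.cache[key]
        self.dirty.clear()

c = WriteBackCache()
c.write("user:1", "Alice")   # DB is NOT updated yet
```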

3. Write around

The application writes data directly to the database. Only data that is read gets into the cache.

This is a good strategy for data that is written once and read infrequently, since the cache stores only data that is actually read.

However, there is higher latency when reading recently written data, because the first read always results in a cache miss.
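A minimal sketch of write-around, where writes bypass the cache and only reads populate it (names are illustrative):

```python
class WriteAroundCache:
    """Sketch of write-around: writes go straight to the DB; only reads fill the cache."""
    def __init__(self):
        self.cache = {}
        self.database = {}   # stand-in for a real DB

    def write(self, key, value):
        self.database[key] = value   # bypass the cache entirely
        self.cache.pop(key, None)    # invalidate any stale cached copy

    def read(self, key):
        if key in self.cache:
            return self.cache[key]       # hit
        value = self.database.get(key)   # first read after a write is always a miss
        if value is not None:
            self.cache[key] = value      # populate on read
        return value

c = WriteAroundCache()
c.write("user:1", "Alice")     # cache is still empty here
first = c.read("user:1")       # miss, then the value is cached
```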

The best strategy depends on the specific use case.

For example, a write-back strategy is a good choice if you have a write-heavy workload.

On the other hand, write-around could be a good fit if you have data written once and rarely read (e.g., real-time logs).
Hash Table
Fault Tolerant
 Regular Interval Snapshot
 Log Reconstruction
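The snapshot and log-reconstruction ideas above can be sketched as follows (an in-memory toy; a real system, like Redis with RDB snapshots and the AOF log, persists both to disk — all names here are illustrative):

```python
class RecoverableCache:
    """Sketch of fault tolerance via an append-only log plus periodic snapshots."""
    def __init__(self):
        self.data = {}
        self.log = []   # in a real system this is an append-only file on disk

    def set(self, key, value):
        self.log.append(("set", key, value))   # log first, then apply
        self.data[key] = value

    def snapshot(self):
        state = dict(self.data)   # point-in-time copy of the whole cache
        self.log = []             # entries before the snapshot are no longer needed
        return state

    @classmethod
    def reconstruct(cls, snapshot, log):
        """Rebuild state after a crash: restore the snapshot, then replay the log."""
        cache = cls()
        cache.data = dict(snapshot)
        for op, key, value in log:
            if op == "set":
                cache.data[key] = value
        return cache

c = RecoverableCache()
c.set("a", 1)
snap = c.snapshot()        # regular-interval snapshot
c.set("b", 2)              # written after the snapshot, captured only in the log
restored = RecoverableCache.reconstruct(snap, c.log)
```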
Availability
 Node Replication
 Master-Master
 Master-Slave
LRU Cache
 Size (capacity)
 Doubly Linked List (Double-Ended Queue)
 HashMap -> For Fast Retrieval
 TTL
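The pieces above can be sketched in Python, where `collections.OrderedDict` plays the role of the doubly linked list + HashMap pair (capacity and TTL parameters are illustrative):

```python
import time
from collections import OrderedDict

class LRUCache:
    """LRU cache with optional TTL, built on OrderedDict."""
    def __init__(self, capacity, ttl_seconds=None):
        self.capacity = capacity
        self.ttl = ttl_seconds
        self.store = OrderedDict()   # key -> (value, expires_at)

    def get(self, key):
        entry = self.store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if expires_at is not None and time.time() >= expires_at:
            del self.store[key]             # TTL expired
            return None
        self.store.move_to_end(key)         # mark as most recently used
        return value

    def put(self, key, value):
        expires_at = time.time() + self.ttl if self.ttl else None
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = (value, expires_at)
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict the least recently used entry

cache = LRUCache(capacity=2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")       # touch "a" so "b" becomes least recently used
cache.put("c", 3)    # capacity exceeded: evicts "b"
```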

In-Memory
Google's Guava: https://github.com/google/guava
This was used in CXO apps for in-memory caching.
https://github.com/google/guava/wiki/CachesExplained

Caching in Distributed Systems

Different Caching Solutions

Redis, Memcached, Hazelcast
Check the official pages of AWS and Azure:
https://aws.amazon.com/caching/

https://learn.microsoft.com/en-us/azure/architecture/best-practices/caching
https://aws.amazon.com/caching/aws-caching/

Redis
How can Redis be used? There is more to Redis than just caching.

Redis can be used in a variety of scenarios as shown in the diagram.

🔹Session
We can use Redis to share user session data among different services.
🔹Cache
We can use Redis to cache objects or pages, especially for hotspot data.

🔹Distributed lock
We can use a Redis string to acquire locks among distributed services.

🔹Counter
We can count how many likes or how many reads for articles.

🔹Rate limiter
We can apply a rate limiter for certain user IPs.
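With Redis this is commonly done with INCR plus EXPIRE on a per-IP key. A minimal in-memory sketch of that fixed-window pattern (class name, key format, and limits are illustrative):

```python
import time

class FixedWindowRateLimiter:
    """In-memory sketch of the Redis INCR + EXPIRE fixed-window rate-limiting pattern."""
    def __init__(self, limit, window_seconds):
        self.limit = limit
        self.window = window_seconds
        self.counters = {}   # (client, window) pairs stand in for Redis keys with a TTL

    def allow(self, client_ip, now=None):
        now = time.time() if now is None else now
        window_start = int(now // self.window)
        key = (client_ip, window_start)         # in Redis: e.g. "rate:{ip}:{window}"
        count = self.counters.get(key, 0) + 1   # Redis would do INCR here
        self.counters[key] = count
        return count <= self.limit              # reject once the window's quota is used

limiter = FixedWindowRateLimiter(limit=3, window_seconds=60)
results = [limiter.allow("1.2.3.4", now=0) for _ in range(4)]   # 4th call exceeds limit
```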

🔹Global ID generator
We can use the Redis INCR command on an integer key to generate global IDs.

🔹Shopping cart
We can use Redis Hash to represent key-value pairs in a shopping cart.

🔹Calculate user retention


We can use a Bitmap to record whether each user logged in on a given day and calculate user retention.
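Redis does this with SETBIT/BITCOUNT, setting bit i when a user logs in on day i. A minimal in-memory sketch of the same idea (names are illustrative):

```python
class LoginBitmap:
    """Sketch of the Redis SETBIT/BITCOUNT retention pattern using Python ints as bitmaps."""
    def __init__(self):
        self.bits = {}   # user_id -> int bitmap (bit i set = logged in on day i)

    def record_login(self, user_id, day):
        self.bits[user_id] = self.bits.get(user_id, 0) | (1 << day)   # SETBIT

    def active_days(self, user_id):
        return bin(self.bits.get(user_id, 0)).count("1")              # BITCOUNT

    def retained(self, user_id, day_a, day_b):
        """True if the user logged in on both days (e.g. signup day and day 7)."""
        b = self.bits.get(user_id, 0)
        return bool(b & (1 << day_a)) and bool(b & (1 << day_b))

tracker = LoginBitmap()
tracker.record_login("user:1", day=0)
tracker.record_login("user:1", day=6)
```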

🔹Message queue
We can use List for a message queue.

🔹Ranking
We can use ZSet to sort the articles.
Low Level Design – Design LRU Cache

Memcached
Consistent Hashing
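Memcached clients typically use consistent hashing to spread keys across nodes so that adding or removing a node remaps only a small fraction of keys. A minimal sketch of a hash ring with virtual nodes (node names and the vnode count are illustrative):

```python
import bisect
import hashlib

class ConsistentHashRing:
    """Sketch of consistent hashing: a key maps to the next node clockwise on the ring."""
    def __init__(self, nodes, vnodes=100):
        points = []
        for node in nodes:
            for i in range(vnodes):   # virtual nodes smooth out the key distribution
                points.append((self._hash(f"{node}#{i}"), node))
        points.sort()
        self._hashes = [h for h, _ in points]   # sorted ring positions
        self._nodes = [n for _, n in points]    # node owning each position

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def node_for(self, key):
        # First ring position at or after the key's hash, wrapping around at the end.
        idx = bisect.bisect(self._hashes, self._hash(key)) % len(self._hashes)
        return self._nodes[idx]

ring = ConsistentHashRing(["cache-1", "cache-2", "cache-3"])
owner = ring.node_for("user:42")   # deterministic: same key always maps to the same node
```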

In Memory Cache
Distributed Cache

Reference
https://javachallengers.com/cache-systems-design/
TechWorld with Nana

https://www.youtube.com/watch?v=OqCK95AS-YE
