What are Distributed CACHES and how do they manage DATA CONSISTENCY?
What is a Cache?
Caching is a common technique used to reduce latency and minimize the load on backend resources like databases or external services.
Saves network calls
Avoids repeated computations
Reduces database load
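For example, a cache-aside read path checks the cache before touching the database. Below is a minimal sketch of that pattern; `fetch_user_from_db` is a hypothetical stand-in for the real backend call.

```python
# Minimal cache-aside sketch. `fetch_user_from_db` is a hypothetical
# placeholder for an expensive database or network call.
cache = {}

def fetch_user_from_db(user_id):
    # Stand-in for the real backend lookup.
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id):
    # Serve from the cache when possible, avoiding a DB round trip.
    if user_id in cache:
        return cache[user_id]
    user = fetch_user_from_db(user_id)  # cache miss: hit the database
    cache[user_id] = user               # populate for future reads
    return user
```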
Cache Policy -> LRU -> Least Recently Used -> when the cache is full and a new item must be added, the item that hasn't been accessed for the longest time is the one evicted.
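As a rough illustration, LRU can be sketched with Python's `OrderedDict`, which preserves insertion order and lets us move a key to the "most recently used" end on each access. The class name and capacity here are illustrative, not a production implementation.

```python
from collections import OrderedDict

class LRUCache:
    """Sketch of an LRU cache built on OrderedDict."""

    def __init__(self, capacity=3):
        self.capacity = capacity
        self.items = OrderedDict()

    def get(self, key):
        if key not in self.items:
            return None
        # Accessing a key marks it as most recently used.
        self.items.move_to_end(key)
        return self.items[key]

    def put(self, key, value):
        if key in self.items:
            self.items.move_to_end(key)
        self.items[key] = value
        if len(self.items) > self.capacity:
            # Evict the least recently used entry (the oldest one).
            self.items.popitem(last=False)
```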
Thrashing -> caused by a cache policy that fits the access pattern poorly (or a cache that is too small) -> entries are evicted and re-fetched in rapid succession, leading to poor cache performance and inefficiency
Caching Strategies ->
Write-through cache -> whenever data is written or modified in the cache, the same write is immediately performed on the corresponding location in the backing store (main memory or the database). This ensures the backing store is always up-to-date with the cache.
Write-back cache -> changes are initially made in the cache only. The backing store is updated later, when the cache line holding the modified data is evicted; at that point, the updated (dirty) data is written back. See the sketch below contrasting the two strategies.
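The contrast between the two strategies can be sketched as follows. `Store`, `WriteThroughCache`, and `WriteBackCache` are hypothetical names used purely for illustration.

```python
class Store:
    """Stand-in for the backing store (e.g., a database); illustrative only."""
    def __init__(self):
        self.data = {}

    def write(self, key, value):
        self.data[key] = value


class WriteThroughCache:
    """Every cache write is immediately mirrored to the backing store."""
    def __init__(self, store):
        self.store = store
        self.cache = {}

    def put(self, key, value):
        self.cache[key] = value
        self.store.write(key, value)  # store stays in sync on every write


class WriteBackCache:
    """Writes land in the cache only; dirty entries reach the store on eviction."""
    def __init__(self, store):
        self.store = store
        self.cache = {}
        self.dirty = set()

    def put(self, key, value):
        self.cache[key] = value
        self.dirty.add(key)  # store is now stale for this key

    def evict(self, key):
        if key in self.dirty:
            self.store.write(key, self.cache[key])  # flush dirty data on eviction
            self.dirty.discard(key)
        del self.cache[key]
```

The trade-off: write-through keeps the backing store consistent at the cost of slower writes, while write-back makes writes fast but risks losing dirty data if the cache fails before eviction.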