Srishti Prasad

Unlocking Performance: The Power of Caching with Redis

In this exploration 🚀, we'll venture beyond the surface 🌍 and delve into the fundamental principles and concepts that underpin caching and Redis, whether you're a seasoned enthusiast seeking deeper insights or a newcomer eager to expand your knowledge ⚡.
But before we embark on the practical implementation, it's crucial to lay a solid foundation by understanding the basics 💥

What is caching?

Caching plays a crucial role in addressing real-world problems across various domains, especially in technology and information systems. One of the most common problems it solves is slow web performance.

In day-to-day life, think about your experience when browsing the internet. When you visit a frequently accessed website, such as a news site or social media platform, you expect it to load quickly. However, if every request you make to that website had to retrieve all its data from the server each time, the loading times would be significantly longer, especially during peak usage periods.

This is where caching comes in. Websites often employ caching mechanisms to store frequently accessed data. When you revisit the website or navigate to a different page within it, instead of fetching all the data from the server again, the browser or server can quickly retrieve the cached data, resulting in faster load times.
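
To make this concrete, here is a minimal sketch of that idea (the cache-aside pattern) in plain Python. The fetch_article function, the article IDs, and the 60-second freshness window are hypothetical placeholders for whatever slow backend call and expiry rule your application actually uses.

```python
import time

# Tiny in-process cache: {key: (value, timestamp)}.
_cache = {}
CACHE_TTL_SECONDS = 60  # how long a cached page stays "fresh" (illustrative)

def fetch_article(article_id):
    """Stand-in for an expensive request to the origin server."""
    time.sleep(1)  # simulate network + database latency
    return f"<html>Article {article_id}</html>"

def get_article(article_id):
    entry = _cache.get(article_id)
    if entry is not None:
        value, stored_at = entry
        if time.time() - stored_at < CACHE_TTL_SECONDS:
            return value  # cache hit: no round trip to the server
    # Cache miss (or stale entry): fetch from the server, then cache it.
    value = fetch_article(article_id)
    _cache[article_id] = (value, time.time())
    return value

get_article(42)  # slow: goes all the way to the "server"
get_article(42)  # fast: served straight from the cache
```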

Real life analogy of caching

It's like having your favourite snacks on the kitchen counter instead of buried in the pantry. Just like how you can quickly grab the snacks you love without searching too hard, caching keeps frequently used data easily accessible for faster use, making things run smoothly, whether it's finding your favourite book or loading a webpage.


Where is caching used?

First of all, the question arises: why don't we store all our data in the cache memory?
There are two main reasons:

  • The hardware required for cache memory is much more expensive than that of a database.
  • Finding a particular piece of data becomes time-consuming when a lot of information is stored in the cache, which, in my view, defeats the purpose of using a cache in the first place.

Thus, caching cannot be used to store all our data; it is used selectively, only in the places where it provides the most benefit.

What is Redis?

Redis, which stands for Remote Dictionary Server, is an open-source, in-memory data structure store, used as a database, cache, and message broker. It supports various data structures such as strings, hashes, lists, sets, sorted sets with range queries, bitmaps, hyperloglogs, geospatial indexes, and streams.
Redis is often used as a caching solution for frequently accessed data in applications.
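
To give a feel for this, here is a small sketch using the redis-py client (assuming a Redis server running locally on the default port 6379; the key names and values are made up for illustration):

```python
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# String: cache a rendered page under a single key
r.set("page:home", "<html>...</html>")

# Hash: cache a user profile as field-value pairs
r.hset("user:1001", mapping={"name": "Srishti", "plan": "free"})

# List: keep the most recent activity at the head
r.lpush("user:1001:activity", "logged_in", "viewed_article:42")

# Set: unique tags attached to an article
r.sadd("article:42:tags", "redis", "caching")

# Sorted set: a leaderboard ordered by score
r.zadd("leaderboard", {"alice": 120, "bob": 95})

print(r.get("page:home"))
print(r.hgetall("user:1001"))
print(r.zrange("leaderboard", 0, -1, withscores=True))
```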

In-memory data structure refers to the storage of data primarily in the main memory (RAM) of a computer, rather than on disk or other storage media.
This approach offers several advantages in terms of performance and access speed:

  • Fast Access: Data stored in memory can be accessed much more quickly than data stored on disk or in other types of storage. This results in lower latency and faster response times for applications that rely on this data.

  • High Throughput: In-memory data structures enable high throughput for read and write operations, making them well-suited for applications with demanding performance requirements.

  • Cache Optimization: In-memory data structures are commonly used for caching frequently accessed data, which helps reduce the load on backend databases and improves overall system performance.

In-memory data structures include arrays, linked lists, hash tables, trees, graphs, stacks, and queues.

Redis Key-Value Store Design for Caching

  • Redis uses a key-value store design system for caching. In this system, data is stored as key-value pairs, where each piece of data (value) is associated with a unique identifier (key).

  • Developers can easily store various types of data in Redis, such as strings, lists, sets, sorted sets, and hashes, each identified by a unique key.

  • Additionally, Redis provides various commands and data structures optimized for efficient key-based operations, enabling fast retrieval and manipulation of cached data.

  • In short, the key-value store design employed by Redis offers a straightforward and versatile approach to data storage and retrieval, making it well suited for caching applications where speed, simplicity, and flexibility are essential, as the sketch below illustrates.
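
As a concrete sketch of this key-value approach, the snippet below caches the result of a slow lookup under a unique key and lets the entry expire after five minutes. It assumes a local Redis server and the redis-py client; query_database and the product key format are hypothetical stand-ins for your real data source.

```python
import json

import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def query_database(product_id):
    """Placeholder for a slow SQL query or API call."""
    return {"id": product_id, "name": "Mechanical keyboard", "price": 4999}

def get_product(product_id):
    key = f"product:{product_id}"        # unique key identifying this value
    cached = r.get(key)
    if cached is not None:
        return json.loads(cached)        # fast path: served from Redis
    product = query_database(product_id)
    # Store the serialized value and expire it after 300 seconds (SETEX).
    r.setex(key, 300, json.dumps(product))
    return product

print(get_product(7))  # first call hits the database and fills the cache
print(get_product(7))  # second call is answered from Redis
```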

Eviction policy

The eviction policy is one of the most important concepts we need to know in order to understand caching.

It is a strategy employed by caching systems, including Redis, to manage the removal of items from the cache when the cache reaches its capacity limit. When the cache is full and a new item needs to be added, the eviction policy determines which existing items to remove from the cache to make space for the new item.

Redis supports several eviction policies.

By default (the noeviction policy), Redis does not automatically remove any items from the cache. When the cache reaches its capacity limit and a new item needs to be added, Redis returns an error indicating that the operation cannot be performed. Beyond that, the most common strategies are:

1). Eviction with LRU (Least Recently Used) - Redis removes the least recently used items from the cache when it reaches its capacity limit. This policy ensures that the items accessed least recently are evicted first.


2). Eviction with LFU (Least Frequently Used) - Redis removes the least frequently used items from the cache when it reaches its capacity limit. This policy ensures that the items accessed least frequently are evicted first.

3). Eviction with MRU (Most Recently Used) - the most recently used items are removed first. (MRU is a general caching strategy rather than one of Redis's built-in maxmemory policies.) To understand it, let us take the example of Facebook. Suppose someone sends you a friend request, but you don't know that person, so you decline the request. Facebook also suggests people you may know, and it should not suggest that same person again because you claimed not to know them.
In this situation, the most recently used cache entry is cleared to avoid suggesting the same person.
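
To see how this looks in practice, here is a small sketch that caps Redis's memory and switches the eviction policy to allkeys-lru at runtime with CONFIG SET. It assumes a local Redis server and the redis-py client; in production these settings usually live in redis.conf instead.

```python
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

# Cap the memory Redis may use for its dataset.
r.config_set("maxmemory", "100mb")

# Evict the least recently used keys (across all keys) once the cap is hit.
# Other built-in values include "noeviction" (the default), "allkeys-lfu",
# "volatile-lru", "volatile-ttl" and "allkeys-random".
r.config_set("maxmemory-policy", "allkeys-lru")

print(r.config_get("maxmemory"))
print(r.config_get("maxmemory-policy"))
```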

In this series, I'll be covering various other aspects of Redis, its implementation, and much more.
Please hit like ❤️ if you find this article helpful.
Do reach out to me in the comment section with any queries.
