C++ Program to Implement LRU Cache

Introduction

Efficient data retrieval is a critical aspect of performance in computing, and caching is one of the most widely used techniques for optimizing it. One such mechanism is the Least Recently Used (LRU) Cache. In this article, we will explore how to implement an LRU Cache in C++ with detailed explanations and practical examples. By the end of this article, you will have a clear understanding of how an LRU Cache works and how to implement it in C++.

Prerequisites

Before diving into the implementation of the LRU Cache, it is important to have a basic understanding of the following concepts:

  • Data Structures: Familiarity with linked lists, hash tables, and their operations.
  • C++ Programming: Understanding of classes, templates, and the Standard Template Library (STL).
  • Cache Mechanisms: Basic knowledge of caching and its importance in computing.

Ensure you are comfortable with these concepts to fully grasp the implementation of the LRU Cache.

1. Understanding LRU Cache

1.1 What is LRU Cache?

An LRU Cache is a type of cache that removes the least recently used items first when the cache reaches its capacity. It maintains the order of usage of items to efficiently manage the cache’s contents. This makes it ideal for scenarios where you need to access frequently used data quickly.

1.2 Why Use LRU Cache?

LRU Cache helps in improving the performance of applications by keeping frequently accessed data readily available, thus reducing the time needed for data retrieval. It is particularly useful in systems with limited memory where managing and prioritizing data access is crucial.

1.3 How LRU Cache Works

  1. Insertion: When a new item is inserted, it is added to the front of the cache.
  2. Access: When an item is accessed, it is moved to the front to mark it as recently used.
  3. Eviction: When the cache reaches its capacity, the item at the back (least recently used) is removed.
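All three operations can run in constant time on average by pairing a doubly linked list (which records the recency order) with a hash map (which gives direct access to each key's position in that list). The snippet below is only a structural sketch of that pairing; the class name is illustrative, and the full working implementations follow in the next section.

C++
#include <list>
#include <unordered_map>

// Sketch of the data layout used throughout this article:
// - 'order' holds the keys with the most recently used key at the front.
// - 'table' maps each key to its value plus an iterator into 'order',
//   so moving a key to the front or evicting the back is O(1).
class LRUCacheSkeleton {
    int capacity;
    std::list<int> order;
    std::unordered_map<int, std::pair<int, std::list<int>::iterator>> table;
public:
    explicit LRUCacheSkeleton(int capacity) : capacity(capacity) {}
    // get(key): look up in 'table', splice the key to the front of 'order'.
    // put(key, value): insert at the front; if over capacity, evict order.back().
};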

2. Implementing LRU Cache in C++

Let’s go through three different examples to understand the implementation and usage of the LRU Cache in C++.

2.1 Basic Implementation

In this example, we will implement the basic structure of an LRU Cache using a combination of a doubly linked list and a hash table.

Code

C++
#include <iostream>
#include <unordered_map>
#include <list>
using namespace std;

class LRUCache {
private:
    int capacity;                                            // maximum number of items the cache can hold
    list<int> cache;                                         // keys in usage order: most recently used at the front
    unordered_map<int, pair<int, list<int>::iterator>> map;  // key -> (value, iterator into the list)

public:
    LRUCache(int capacity) : capacity(capacity) {}

    // Return the value for key, or -1 if the key is not present.
    // A successful lookup moves the key to the front of the list.
    int get(int key) {
        if (map.find(key) == map.end()) {
            return -1;
        } else {
            cache.erase(map[key].second);
            cache.push_front(key);
            map[key].second = cache.begin();
            return map[key].first;
        }
    }

    // Insert or update a key-value pair. If the cache is full,
    // the least recently used key (at the back of the list) is evicted.
    void put(int key, int value) {
        if (map.find(key) != map.end()) {
            cache.erase(map[key].second);   // key exists: drop its old position
        } else if (cache.size() >= capacity) {
            int lru = cache.back();         // cache full: evict the least recently used key
            cache.pop_back();
            map.erase(lru);
        }
        cache.push_front(key);              // the key is now the most recently used
        map[key] = {value, cache.begin()};
    }

    // Print the cache contents from most to least recently used.
    void display() {
        for (int key : cache) {
            cout << key << ":" << map[key].first << " ";
        }
        cout << endl;
    }
};

int main() {
    LRUCache cache(3);
    cache.put(1, 1);
    cache.put(2, 2);
    cache.put(3, 3);
    cache.display();
    cache.put(4, 4);
    cache.display();
    cout << cache.get(2) << endl;
    cache.display();
    cache.put(5, 5);
    cache.display();
    return 0;
}

Output

3:3 2:2 1:1 
4:4 3:3 2:2 
2
2:2 4:4 3:3 
5:5 2:2 4:4 

Explanation

In this example, we implement the basic structure of an LRU Cache using a doubly linked list to store keys in order of usage and a hash table to store the key-value pairs and iterators to the linked list. The put method inserts or updates a key-value pair, while the get method retrieves a value and updates the usage order. The display method shows the current state of the cache.
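One behavior worth noting is that calling put on a key that already exists updates its value and also refreshes its position, making it the most recently used entry. The short usage sketch below (with made-up keys and values) illustrates this with the class defined above.

C++
LRUCache cache(2);
cache.put(1, 10);
cache.put(2, 20);
cache.put(1, 11);   // updates key 1 and moves it to the front
cache.put(3, 30);   // cache is full, so key 2 (least recently used) is evicted
cache.display();    // expected output: 3:30 1:11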

2.2 LRU Cache with String Keys and Values

In this example, we will extend the LRU Cache implementation to handle string keys and values, demonstrating the flexibility of the cache.

Code

C++
#include <iostream>
#include <unordered_map>
#include <list>
#include <string>
using namespace std;

class LRUCache {
private:
    int capacity;
    list<string> cache;
    unordered_map<string, pair<string, list<string>::iterator>> map;

public:
    LRUCache(int capacity) : capacity(capacity) {}

    string get(string key) {
        if (map.find(key) == map.end()) {
            return "Not Found";
        } else {
            cache.erase(map[key].second);
            cache.push_front(key);
            map[key].second = cache.begin();
            return map[key].first;
        }
    }

    void put(string key, string value) {
        if (map.find(key) != map.end()) {
            cache.erase(map[key].second);
        } else if (cache.size() >= capacity) {
            string lru = cache.back();
            cache.pop_back();
            map.erase(lru);
        }
        cache.push_front(key);
        map[key] = {value, cache.begin()};
    }

    void display() {
        for (string key : cache) {
            cout << key << ":" << map[key].first << " ";
        }
        cout << endl;
    }
};

int main() {
    LRUCache cache(2);
    cache.put("apple", "red");
    cache.put("banana", "yellow");
    cache.display();
    cout << cache.get("apple") << endl;
    cache.display();
    cache.put("cherry", "red");
    cache.display();
    cout << cache.get("banana") << endl;
    cache.display();
    return 0;
}

Output

banana:yellow apple:red 
red
apple:red banana:yellow 
cherry:red apple:red 
Not Found
cherry:red apple:red 

Explanation

In this example, we extend the LRU Cache implementation to handle string keys and values. The put method inserts or updates string key-value pairs, while the get method retrieves a value and updates the usage order. The display method shows the current state of the cache. This demonstrates the flexibility of the LRU Cache to handle different data types.
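Because the integer and string versions differ only in their key and value types, the same idea can also be expressed once as a class template. The following is a minimal sketch of such a generic cache, assuming C++11 or later; the class name, the bool-plus-output-parameter form of get, and the use of size_t for the capacity are illustrative choices rather than part of the original examples.

C++
#include <list>
#include <unordered_map>
#include <utility>

// Generic LRU cache sketch: Key must be hashable, Value must be copyable.
template <typename Key, typename Value>
class GenericLRUCache {
    size_t capacity;
    std::list<Key> order;   // most recently used key at the front
    std::unordered_map<Key, std::pair<Value, typename std::list<Key>::iterator>> table;

public:
    explicit GenericLRUCache(size_t capacity) : capacity(capacity) {}

    // Returns true and writes the value into 'out' if the key is present,
    // refreshing its position; returns false on a miss.
    bool get(const Key& key, Value& out) {
        auto it = table.find(key);
        if (it == table.end()) return false;
        order.erase(it->second.second);
        order.push_front(key);
        it->second.second = order.begin();
        out = it->second.first;
        return true;
    }

    // Inserts or updates a key-value pair, evicting the least recently
    // used key when the cache is full.
    void put(const Key& key, const Value& value) {
        auto it = table.find(key);
        if (it != table.end()) {
            order.erase(it->second.second);
            table.erase(it);
        } else if (order.size() >= capacity) {
            table.erase(order.back());
            order.pop_back();
        }
        order.push_front(key);
        table.emplace(key, std::make_pair(value, order.begin()));
    }
};

With this formulation, the caches from the two examples above become GenericLRUCache<int, int> and GenericLRUCache<string, string>, respectively.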

2.3 LRU Cache with Time-Based Eviction

In this example, we will extend the LRU Cache to include time-based eviction, where items are removed if they haven’t been accessed within a certain time frame.

Code

C++
#include <iostream>
#include <unordered_map>
#include <list>
#include <ctime>
#include <thread>   // for this_thread::sleep_for
#include <chrono>   // for chrono::seconds
using namespace std;

class LRUCache {
private:
    int capacity;                           // maximum number of items the cache can hold
    double timeLimit;                       // maximum age (in seconds) before an entry expires
    list<pair<int, time_t>> cache;          // (key, last-access timestamp), most recent at the front
    unordered_map<int, pair<int, list<pair<int, time_t>>::iterator>> map; // key -> (value, iterator)

public:
    LRUCache(int capacity, double timeLimit) : capacity(capacity), timeLimit(timeLimit) {}

    int get(int key) {
        time_t currentTime = time(nullptr);
        if (map.find(key) == map.end()) {
            return -1;
        } else if (difftime(currentTime, map[key].second->second) > timeLimit) {
            // The entry is older than the time limit: evict it and report a miss.
            cache.erase(map[key].second);
            map.erase(key);
            return -1;
        } else {
            cache.erase(map[key].second);
            cache.push_front({key, currentTime});
            map[key].second = cache.begin();
            return map[key].first;
        }
    }

    void put(int key, int value) {
        time_t currentTime = time(nullptr);
        if (map.find(key) != map.end()) {
            cache.erase(map[key].second);
        } else if (cache.size() >= capacity) {
            int lru = cache.back().first;
            cache.pop_back();
            map.erase(lru);
        }
        cache.push_front({key, currentTime});
        map[key] = {value, cache.begin()};
    }

    void display() {
        for (auto& item : cache) {
            cout << item.first << ":" << map[item.first].first << " ";
        }
        cout << endl;
    }
};

int main() {
    LRUCache cache(3, 10.0); // 10 seconds time limit
    cache.put(1, 1);
    cache.put(2, 2);
    cache.put(3, 3);
    cache.display();
    this_thread::sleep_for(chrono::seconds(5)); // wait 5 seconds so the timestamps age
    cache.put(4, 4);
    cache.display();
    cout << cache.get(2) << endl;
    cache.display();
    this_thread::sleep_for(chrono::seconds(6)); // key 3 is now older than the 10-second limit
    cout << cache.get(3) << endl;
    cache.display();
    return 0;
}

Output

3:3 2:2 1:1 
4:4 3:3 2:2 
2
2:2 4:4 3:3 
-1
2:2 4:4 

Explanation

In this example, we extend the LRU Cache to include time-based eviction. Items are removed if they haven’t been accessed within a specified time limit. The put method inserts or updates key-value pairs with the current timestamp, and the get method retrieves a value and updates the usage order while checking the time limit. The display method shows the current state of the cache, demonstrating the time-based eviction functionality.
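One design note: time(nullptr) has one-second resolution and follows the system clock, which can be adjusted while the program runs. For finer-grained or jump-proof expiry, the timestamps could instead come from std::chrono::steady_clock. The sketch below shows that substitution in isolation; the Clock alias and the expired helper are illustrative names, not part of the example above.

C++
#include <chrono>

using Clock = std::chrono::steady_clock;
using TimePoint = Clock::time_point;

// True if 'stamp' was taken more than 'limitSeconds' seconds ago.
bool expired(TimePoint stamp, double limitSeconds) {
    std::chrono::duration<double> age = Clock::now() - stamp;
    return age.count() > limitSeconds;
}

In the class above, the list would then store pair<int, TimePoint> instead of pair<int, time_t>, timestamps would be captured with Clock::now(), and the difftime comparison in get would become a call to expired.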

Conclusion

The LRU Cache is an essential data structure for optimizing data retrieval by keeping frequently accessed data readily available. Understanding and implementing an LRU Cache in C++ can significantly enhance the performance of applications that rely on quick data access. By following the examples provided in this article, you should now have a solid foundation for implementing and using an LRU Cache in your own projects.