DDSA Solutions

146. LRU Cache


Intuition

An LRU cache needs O(1) lookup and O(1) reordering to track recency. A hash map gives O(1) lookup. A doubly-linked list gives O(1) insertion and deletion at any position. Combining both: the map stores key -> node, and the list maintains order from most-recent (head) to least-recent (tail).
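This combination can be sketched directly with the BCL's built-in `LinkedList<T>` (a doubly-linked list) plus a `Dictionary`. The class name `LruSketch` and the value-tuple payload are illustrative choices for this sketch, not part of the solution below, which hand-rolls its own node type instead:

```csharp
using System;
using System.Collections.Generic;

// Minimal sketch: Dictionary gives O(1) key -> node lookup, LinkedList<T>
// gives O(1) unlink/relink. Front of the list = most recently used.
public class LruSketch
{
    private readonly int capacity;
    private readonly LinkedList<(int Key, int Value)> order = new();
    private readonly Dictionary<int, LinkedListNode<(int Key, int Value)>> map = new();

    public LruSketch(int capacity) => this.capacity = capacity;

    public int Get(int key)
    {
        if (!map.TryGetValue(key, out var node)) return -1;
        order.Remove(node);      // O(1): unlink the node
        order.AddFirst(node);    // O(1): reattach at the front
        return node.Value.Value;
    }

    public void Put(int key, int value)
    {
        if (map.TryGetValue(key, out var node))
        {
            order.Remove(node);  // drop the stale node before re-inserting
            map.Remove(key);
        }
        else if (map.Count == capacity)
        {
            var lru = order.Last;        // least recently used sits at the tail
            map.Remove(lru.Value.Key);
            order.RemoveLast();
        }
        map[key] = order.AddFirst((key, value));
    }
}
```

`LinkedList<T>.Remove(LinkedListNode<T>)` and `AddFirst` are both O(1), so this sketch has the same asymptotics as the hand-rolled list in the full solution.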

Algorithm

  1. Create dummy head and tail nodes. Connect them. Map starts empty.
  2. get(key): if not in map, return -1. Otherwise move the node to just after the dummy head and return its value.
  3. put(key, value): if key exists, update value and move to front. If new, insert a new node at front.
  4. After inserting a new node: if size > capacity, remove the node just before the dummy tail (the LRU) and delete its key from the map.
  5. Helper Remove(node) unlinks a node in O(1). Helper InsertFront(node) inserts after head in O(1).
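The two helpers from step 5 can be isolated as a sketch. `Node` here mirrors the node class in the full solution; the static `ListOps` wrapper is just for illustration:

```csharp
using System;

// Focused sketch of the two O(1) list helpers. Sentinel (dummy) head and
// tail nodes guarantee every real node has both neighbors, so neither
// helper needs a null check.
public class Node
{
    public int Key, Value;
    public Node Prev, Next;
    public Node(int key, int value) { Key = key; Value = value; }
}

public static class ListOps
{
    // Unlink node from wherever it sits in the list.
    public static void Remove(Node node)
    {
        node.Prev.Next = node.Next;
        node.Next.Prev = node.Prev;
    }

    // Splice node in just after the dummy head (the most-recent position).
    public static void InsertFront(Node head, Node node)
    {
        node.Next = head.Next;
        node.Next.Prev = node;
        head.Next = node;
        node.Prev = head;
    }
}
```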

Example Walkthrough

Input: capacity=2, operations: put(1,1), put(2,2), get(1), put(3,3), get(2)

  1. put(1,1): list = [1]. put(2,2): list = [2,1] (2 is most recent).
  2. get(1): move 1 to front -> list = [1,2]. Return 1.
  3. put(3,3): list full. Evict LRU (tail = 2). list = [3,1].
  4. get(2): key 2 was evicted -> return -1.

Output: get(1)=1, get(2)=-1
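The walkthrough can be re-enacted with a deliberately naive sketch that tracks recency in a plain `List<int>` of keys (index 0 = most recent). This is O(n) per operation and only meant to make the eviction order tangible; the class name `NaiveLru` is an illustrative choice, and the real solution below achieves the same behavior in O(1):

```csharp
using System;
using System.Collections.Generic;

// Naive O(n) LRU cache: a list of keys ordered by recency plus a
// plain value map. Useful only to trace the example by hand.
public class NaiveLru
{
    private readonly int capacity;
    private readonly List<int> recency = new();          // keys, most recent first
    private readonly Dictionary<int, int> values = new();

    public NaiveLru(int capacity) => this.capacity = capacity;

    public void Put(int key, int value)
    {
        if (values.ContainsKey(key))
            recency.Remove(key);                         // will re-insert at front
        else if (recency.Count == capacity)
        {
            int lru = recency[recency.Count - 1];        // last key = LRU
            recency.RemoveAt(recency.Count - 1);
            values.Remove(lru);                          // evict from both structures
        }
        recency.Insert(0, key);
        values[key] = value;
    }

    public int Get(int key)
    {
        if (!values.ContainsKey(key)) return -1;
        recency.Remove(key);
        recency.Insert(0, key);                          // mark as most recent
        return values[key];
    }
}
```

Running put(1,1), put(2,2), get(1), put(3,3), get(2) against this sketch reproduces the outputs above: get(1) returns 1, and get(2) returns -1 because put(3,3) evicted key 2.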

Common Pitfalls

  • Use sentinel (dummy) head and tail nodes: they guarantee every real node has both neighbors, which eliminates all null checks in Remove/InsertFront.
  • When updating an existing key, remove the old node (and its map entry) before inserting the new one; otherwise the map ends up with two entries for the same key, or the insert throws.
  • On eviction, remove the node from BOTH the list and the map, or subsequent gets will return stale data.
146.cs
C#
// Approach: Combine a doubly-linked list with a Dictionary<key, node> for O(1) get and put.
// The list maintains LRU order: most-recently used node sits at the head, least-recently at the tail.
// On get: if the key exists, detach the node, move it to the head, and return its value.
// On put: if the key exists, update and move to head; if new, insert a new node at the head.
// If inserting exceeds capacity, remove the node at the tail (least recently used) and its map entry.
// Sentinel head and tail dummy nodes eliminate null-pointer edge cases in all list operations.
// Time: O(1) per get/put. Space: O(capacity) for the list and map combined.

public class LRUCache
{
    private int capacity;
    private Node head = new Node(0, 0), tail = new Node(0, 0);
    private Dictionary<int, Node> map = new Dictionary<int, Node>();

    public LRUCache(int capacity)
    {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public int Get(int key)
    {
        if (!map.TryGetValue(key, out Node node))
            return -1;

        Remove(node);
        InsertFront(node);
        return node.Value;
    }

    public void Put(int key, int value)
    {
        if (map.ContainsKey(key))
            Remove(map[key]);

        if (map.Count == capacity)
            Remove(tail.prev);   // evict the least recently used node

        InsertFront(new Node(key, value));
    }

    // Unlinks node from the list and drops its map entry. The sentinels
    // guarantee node has both neighbors, so no null checks are needed.
    private void Remove(Node node)
    {
        map.Remove(node.Key);
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    // Splices node in just after the dummy head (the most-recent position)
    // and records it in the map.
    private void InsertFront(Node node)
    {
        map.Add(node.Key, node);
        node.next = head.next;
        node.next.prev = node;
        head.next = node;
        node.prev = head;
    }
}

class Node
{
    public int Key;
    public int Value;
    public Node prev, next;
    public Node(int key, int value)
    {
        this.Key = key;
        this.Value = value;
    }
}

/**
 * Your LRUCache object will be instantiated and called as such:
 * LRUCache obj = new LRUCache(capacity);
 * int param_1 = obj.Get(key);
 * obj.Put(key,value);
 */