LRU Cache

| Space | Access | Lookup | Insertion | Deletion |
| ----- | ------ | ------ | --------- | -------- |
| O(n)  | O(1)   | O(1)   | O(1)      | O(1)     |

Definition

An LRU (Least Recently Used) cache is a fixed-capacity cache data structure: when it is full and a new item arrives, it evicts the least recently used item.

Simplified

An LRU cache is like a clothes rack: it holds a limited number of items. When new data arrives and the cache is full, the least recently used data is removed, much like taking the least-worn garment off a full rack to make room for a new one. This keeps frequently used data quickly accessible.

Practice

LRU Cache on Map

```
get(key):
    if key not in items:
        return ø
    value = items.get(key)
    items.delete(key)
    items.set(key, value)
    return value

set(key, value):
    items.delete(key)
    items.set(key, value)
    if items.size > capacity:
        for each head_key in items.keys():
            items.delete(head_key)
            break
```
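
For readers who want something runnable, here is a minimal TypeScript sketch of the Map-based approach (the class name `LRUCache` and the generic signature are illustrative additions, not part of the pseudocode). It relies on JavaScript's `Map` preserving insertion order, so the first key yielded by `keys()` is always the least recently used entry:

```typescript
class LRUCache<K, V> {
  private items = new Map<K, V>();

  constructor(private capacity: number) {}

  get(key: K): V | undefined {
    if (!this.items.has(key)) return undefined;
    // Delete and re-insert so the key moves to the back of the
    // iteration order, marking it as most recently used.
    const value = this.items.get(key)!;
    this.items.delete(key);
    this.items.set(key, value);
    return value;
  }

  set(key: K, value: V): void {
    this.items.delete(key);
    this.items.set(key, value);
    if (this.items.size > this.capacity) {
      // The first key in iteration order is the least recently used.
      for (const headKey of this.items.keys()) {
        this.items.delete(headKey);
        break;
      }
    }
  }
}

// A capacity-2 cache: reading "a" protects it, so "b" is evicted.
const cache = new LRUCache<string, number>(2);
cache.set("a", 1);
cache.set("b", 2);
cache.get("a");               // promotes "a"
cache.set("c", 3);            // evicts "b", the least recently used
console.log(cache.get("b"));  // undefined
```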
LRU Cache on Linked List

```
get(key):
    if nodes_map[key] == ø:
        return ø
    node = nodes_map[key]
    promote(node)
    return node.value

set(key, value):
    if nodes_map[key] == ø:
        node = LinkedListNode(key, value)
        append(node)
    else:
        node = nodes_map[key]
        node.value = value
        promote(node)

promote(node):
    evict(node)
    append(node)

append(node):
    nodes_map[node.key] = node
    if head.next == ø:           # list is empty: link node between the sentinels
        head.next = node
        tail.prev = node
        node.prev = head
        node.next = tail
    else:                        # list is non-empty: insert before the tail sentinel
        old_tail = tail.prev
        old_tail.next = node
        node.prev = old_tail
        node.next = tail
        tail.prev = node
    size += 1
    if size > capacity:
        evict(head.next)         # head.next is the least recently used node

evict(node):
    delete nodes_map[node.key]
    size -= 1
    prev_node = node.prev
    next_node = node.next
    if prev_node == head and next_node == tail:   # node was the only element
        head.next = ø
        tail.prev = ø
        size = 0
    else if prev_node == head:                    # node was the LRU end
        next_node.prev = head
        head.next = next_node
    else if next_node == tail:                    # node was the MRU end
        prev_node.next = tail
        tail.prev = prev_node
    else:                                         # node was in the middle
        prev_node.next = next_node
        next_node.prev = prev_node
```
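
A runnable TypeScript version of the same design might look like the sketch below. One simplification is my own, not the pseudocode's: the head and tail sentinels always point at each other when the list is empty, which removes the empty-list special cases from append and evict, and promote relinks the node directly instead of going through evict and append, avoiding redundant map operations. The other names mirror the pseudocode:

```typescript
// Doubly linked list node; the sentinels use placeholder key/value.
class ListNode {
  prev: ListNode | null = null;
  next: ListNode | null = null;
  constructor(public key: string, public value: number) {}
}

class LRUCache {
  private nodesMap = new Map<string, ListNode>();
  private head = new ListNode("", 0); // sentinel: head.next is the LRU entry
  private tail = new ListNode("", 0); // sentinel: tail.prev is the MRU entry

  constructor(private capacity: number) {
    this.head.next = this.tail;
    this.tail.prev = this.head;
  }

  get(key: string): number | undefined {
    const node = this.nodesMap.get(key);
    if (!node) return undefined;
    this.promote(node);
    return node.value;
  }

  set(key: string, value: number): void {
    const existing = this.nodesMap.get(key);
    if (existing) {
      existing.value = value;
      this.promote(existing);
    } else {
      this.append(new ListNode(key, value));
      if (this.nodesMap.size > this.capacity) {
        this.evict(this.head.next!); // drop the least recently used node
      }
    }
  }

  // Move a node to the most-recently-used end of the list.
  private promote(node: ListNode): void {
    this.unlink(node);
    this.linkBeforeTail(node);
  }

  // Register a new node and place it at the most-recently-used end.
  private append(node: ListNode): void {
    this.nodesMap.set(node.key, node);
    this.linkBeforeTail(node);
  }

  // Remove a node from both the map and the list.
  private evict(node: ListNode): void {
    this.nodesMap.delete(node.key);
    this.unlink(node);
  }

  private linkBeforeTail(node: ListNode): void {
    const oldTail = this.tail.prev!;
    oldTail.next = node;
    node.prev = oldTail;
    node.next = this.tail;
    this.tail.prev = node;
  }

  private unlink(node: ListNode): void {
    node.prev!.next = node.next;
    node.next!.prev = node.prev;
  }
}

// Usage: same behaviour as the Map-based version.
const cache = new LRUCache(2);
cache.set("a", 1);
cache.set("b", 2);
cache.get("a");               // promotes "a" over "b"
cache.set("c", 3);            // evicts "b"
console.log(cache.get("b"));  // undefined
console.log(cache.get("a"));  // 1
```

Both variants give O(1) get and set: the linked-list construction is the classic textbook approach, while the Map version leans on insertion-ordered maps to get the recency list for free.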