Algorithm - LRU
In computer science, there are many cache algorithms for managing the information a computer keeps in its cache.
Since the cache is normally allocated a limited capacity, the computer has to update or evict entries once the cache is full. Common cache eviction algorithms include FIFO (first in, first out), LFU (least frequently used), and LRU (least recently used).
Today we are focusing on implementing the LRU (Least recently used) cache algorithm.
I will be using problem 146. LRU Cache from LeetCode as the guideline for the implementation. Basically, we need to implement the following class:
```python
class LRUCache:
```
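The snippet above is truncated in this copy; the full interface from the LeetCode problem looks like this (method bodies left as placeholders):

```python
class LRUCache:
    def __init__(self, capacity: int):
        # initialize the cache with a positive capacity
        ...

    def get(self, key: int) -> int:
        # return the value for key if present, otherwise -1
        ...

    def put(self, key: int, value: int) -> None:
        # insert or update key; evict the least recently
        # used key if the cache exceeds its capacity
        ...
```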
Analysis
A rough thought is that we need to maintain a list-like data structure to store the cached keys sequentially, and update the list accordingly after each `get` or `put` operation.
The data towards the end of the list was visited earlier, and the data at the beginning is the most recently visited.
Here is a list representing the cache. It has a capacity of 6, and 5 of the slots are filled with data.
If 3 gets visited, it is moved to the beginning of the list.
If we then put 6 and 7 into the cache, 5 is moved out of the cache, since the cache has reached its capacity.
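The sequence above can be sketched with a plain Python list (a toy illustration only, not a full implementation):

```python
capacity = 6
cache = [1, 2, 3, 4, 5]          # most recently used at the front

# visiting 3 moves it to the beginning of the list
cache.remove(3)
cache.insert(0, 3)
print(cache)                     # [3, 1, 2, 4, 5]

# putting 6 and then 7: once the capacity is exceeded,
# the least recently used key (5) falls off the end
for key in (6, 7):
    cache.insert(0, key)
    if len(cache) > capacity:
        cache.pop()
print(cache)                     # [7, 6, 3, 1, 2, 4]
```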
Using Array
Using an array is pretty straightforward. We define a few properties on the class:

- `cacheData`: a dictionary to store the keys and values
- `cacheKey`: an array to store only the keys, in a specific order
```python
class LRUCache_Array:
```
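The block above is cut off in this copy. Here is one way the array-based version might be fleshed out — a sketch following the LeetCode conventions (return `-1` on a miss), though the exact code may differ from the author's original:

```python
class LRUCache_Array:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cacheData = {}   # key -> value
        self.cacheKey = []    # keys, most recently used first

    def get(self, key: int) -> int:
        if key not in self.cacheData:
            return -1
        # move the key to the front of the array: O(n)
        self.cacheKey.remove(key)
        self.cacheKey.insert(0, key)
        return self.cacheData[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cacheData:
            self.cacheKey.remove(key)
        elif len(self.cacheKey) == self.capacity:
            # evict the least recently used key from the end
            evicted = self.cacheKey.pop()
            del self.cacheData[evicted]
        self.cacheKey.insert(0, key)
        self.cacheData[key] = value
```

Both `remove` and `insert(0, …)` shift up to n elements in the underlying array, which is exactly where the linear cost of this approach comes from.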
Since we are manipulating the keys in an array, operations such as `remove` or concatenation take O(n) time, while the lookup time on a dictionary in Python is O(1). So the time complexity of each operation (`get` and `put`) in this implementation is O(n).
Using Doubly Linked List
For the doubly linked list, we will need to define an extra `ListNode` class, which has `prev` and `next` pointers referring to its previous node and next node respectively.
```python
# define the doubly linked-list node
```
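The definition above is truncated. A minimal `ListNode` consistent with the description might look like the following; storing the key on the node is an assumption on my part (it lets eviction delete the matching dictionary entry in O(1)):

```python
# doubly linked-list node for the LRU cache
class ListNode:
    def __init__(self, key=None, value=None):
        self.key = key        # cache key, needed during eviction
        self.value = value    # cached value
        self.prev = None      # previous (more recently used) node
        self.next = None      # next (less recently used) node
```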
Then we will need to define a few properties in our implementation:

- `cacheData`: a dictionary to store the keys and values
- `head`: a head node pointing to the first node of the linked list, initialized as `None`
- `tail`: a tail node pointing to the last node of the linked list, initialized as `None`
```python
# Used doubly linked-list to maintain the nodes. O(1)
```
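The implementation block above is also truncated. Below is one possible sketch, assuming `cacheData` maps each key to its `ListNode` (so the node itself, not just the value, can be found in O(1)); the class name `LRUCache_LinkedList` and the helpers `_remove` / `_push_front` are my own names, not necessarily the author's:

```python
class ListNode:
    # doubly linked-list node, as defined in the previous section
    def __init__(self, key=None, value=None):
        self.key, self.value = key, value
        self.prev = self.next = None


class LRUCache_LinkedList:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.cacheData = {}   # key -> ListNode
        self.head = None      # most recently used node
        self.tail = None      # least recently used node

    def _remove(self, node):
        # unlink the node from the list: O(1)
        if node.prev:
            node.prev.next = node.next
        else:
            self.head = node.next
        if node.next:
            node.next.prev = node.prev
        else:
            self.tail = node.prev
        node.prev = node.next = None

    def _push_front(self, node):
        # insert the node at the head of the list: O(1)
        node.next = self.head
        if self.head:
            self.head.prev = node
        self.head = node
        if self.tail is None:
            self.tail = node

    def get(self, key: int) -> int:
        if key not in self.cacheData:
            return -1
        node = self.cacheData[key]
        self._remove(node)        # mark as most recently used
        self._push_front(node)
        return node.value

    def put(self, key: int, value: int) -> None:
        if key in self.cacheData:
            node = self.cacheData[key]
            node.value = value
            self._remove(node)
            self._push_front(node)
            return
        if len(self.cacheData) == self.capacity:
            lru = self.tail       # evict the least recently used node
            self._remove(lru)
            del self.cacheData[lru.key]
        node = ListNode(key, value)
        self.cacheData[key] = node
        self._push_front(node)
```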
Since the time complexity of an insertion or deletion on a doubly linked list is O(1), and the lookup time on a dictionary in Python is O(1), the time complexity of each operation (`get` and `put`) in this implementation is O(1).