LRU Cache Implementation in Java With Examples

What is LRU Cache?

LRU Cache stands for Least Recently Used Cache. The cache has a fixed size and supports get() and put() operations. When the cache is full, the put() operation evicts the least recently used entry to make room for the new one.

How to implement LRU Cache in Java?

The LRU cache can be implemented in Java using two data structures – a HashMap for constant-time lookup and a doubly linked list to keep the entries in usage order.

The idea is to always keep the elements in the following order.

Least Recently Used (LRU) -> A <-> B <-> C <-> D <-> E <- Most Recently Used (MRU)

Here is the LRUCache implementation in Java.
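The original listing is not reproduced below, so the following class is a minimal sketch that matches the points listed after it: a HashMap for constant-time lookup plus a doubly linked list with sentinel head and tail nodes, where the leftmost node is the LRU element and the rightmost node is the MRU element. All names other than get() and put() are illustrative.

import java.util.HashMap;
import java.util.Map;

public class LRUCache<K, V> {

    // Node of the doubly linked list, holds one key/value pair
    private class Node {
        K key;
        V value;
        Node prev;
        Node next;

        Node(K key, V value) {
            this.key = key;
            this.value = value;
        }
    }

    private final int capacity;
    private final Map<K, Node> map = new HashMap<>();
    // Sentinel nodes: head.next is the LRU element, tail.prev is the MRU element
    private final Node head = new Node(null, null);
    private final Node tail = new Node(null, null);

    public LRUCache(int capacity) {
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    // Returns the value if present and marks the entry as most recently used
    public V get(K key) {
        Node node = map.get(key);
        if (node == null) {
            return null;
        }
        moveToTail(node);
        return node.value;
    }

    // Inserts or updates an entry; evicts the least recently used entry when full
    public void put(K key, V value) {
        Node node = map.get(key);
        if (node != null) {
            node.value = value;
            moveToTail(node);
            return;
        }
        if (map.size() == capacity) {
            Node lru = head.next;   // leftmost node is the LRU entry
            unlink(lru);
            map.remove(lru.key);
        }
        Node fresh = new Node(key, value);
        map.put(key, fresh);
        addToTail(fresh);
    }

    private void moveToTail(Node node) {
        unlink(node);
        addToTail(node);
    }

    private void unlink(Node node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }

    private void addToTail(Node node) {
        node.prev = tail.prev;
        node.next = tail;
        tail.prev.next = node;
        tail.prev = node;
    }
}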

Some important points about the implementation.

  • The Node class is used to implement the doubly linked list structure. It holds references to the previous and next nodes.
  • When we read a value using the get() method, the element is moved to the rightmost position because it becomes the MRU element.
  • When we put an element in the cache, it is added on the rightmost side. If the cache is full, the leftmost (LRU) element is removed first (see the short example after this list).
  • We are using generics so that the implementation works with any type of data.
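
To make the ordering concrete, here is a small usage sketch against the class above; the capacity, keys, and values are arbitrary.

LRUCache<Integer, String> cache = new LRUCache<>(3);
cache.put(1, "A");
cache.put(2, "B");
cache.put(3, "C");                 // order (LRU -> MRU): 1, 2, 3

cache.get(1);                      // 1 becomes the MRU element; order: 2, 3, 1
cache.put(4, "D");                 // cache is full, so 2 (the LRU element) is evicted; order: 3, 1, 4

System.out.println(cache.get(2));  // null, because 2 was evicted
System.out.println(cache.get(4));  // D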

LRUCache Test Program

I am using JUnit test cases to run some scenarios from the LeetCode LRU Cache problem.
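
The original test class is not shown here, so the sketch below runs the example scenario from the LeetCode LRU Cache problem (capacity 2), assuming JUnit 4 on the classpath and the LRUCache class sketched above.

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertNull;

import org.junit.Test;

public class LRUCacheTest {

    @Test
    public void testLeetCodeScenario() {
        LRUCache<Integer, Integer> cache = new LRUCache<>(2);

        cache.put(1, 1);
        cache.put(2, 2);
        assertEquals(Integer.valueOf(1), cache.get(1)); // returns 1

        cache.put(3, 3);                                // evicts key 2
        assertNull(cache.get(2));                       // key 2 is no longer in the cache

        cache.put(4, 4);                                // evicts key 1
        assertNull(cache.get(1));                       // key 1 is no longer in the cache
        assertEquals(Integer.valueOf(3), cache.get(3));
        assertEquals(Integer.valueOf(4), cache.get(4));
    }
}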

 


Conclusion

There are many ways to implement an LRU Cache. But if we keep the cache elements in the order of their usage, using a HashMap together with a doubly linked list, both the get() and put() operations run in constant time.
