Which Java Collection Class to Use to Implement an LRU Cache?


A cache temporarily stores data so that it can be accessed faster, improving software performance. There are different types of caches, but in general we use an LRU cache (the least recently used entry is evicted first when the cache is full) or a cache with TTL (time to live: entries are evicted automatically after a fixed time elapses).

Nowadays, for large and scalable system designs, many open-source distributed cache systems are available, and for in-process caching there are also many good libraries. You can check my post on in-memory caching in Java.

Any cache should provide an interface to put and get entries as key-value pairs. So first we need a Map, and to maintain recency order we can use a doubly linked list (or a queue backed by one). Using a HashMap and a doubly linked list we can implement an LRU cache ourselves. But wait: Java already has a collection, LinkedHashMap, that combines exactly these two structures.
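
To make that concrete, here is a minimal, hypothetical sketch of the hand-rolled approach: a HashMap that indexes nodes of a doubly linked list, with the most recently used entry kept just behind a head sentinel. The class and field names (ManualLRUCache, index, and so on) are my own and purely illustrative; the LinkedHashMap-based version further down is the one this post recommends.

import java.util.HashMap;
import java.util.Map;

public class ManualLRUCache<K, V> {

    // Doubly linked list node holding one cache entry
    private static final class Node<K, V> {
        final K key;
        V value;
        Node<K, V> prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<K, Node<K, V>> index = new HashMap<>();
    // Sentinel head/tail nodes simplify insertion and removal
    private final Node<K, V> head = new Node<>(null, null);
    private final Node<K, V> tail = new Node<>(null, null);

    public ManualLRUCache(int capacity) {
        // Assumes capacity >= 1
        this.capacity = capacity;
        head.next = tail;
        tail.prev = head;
    }

    public V get(K key) {
        Node<K, V> node = index.get(key);
        if (node == null) {
            return null;
        }
        moveToFront(node);              // mark as most recently used
        return node.value;
    }

    public void put(K key, V value) {
        Node<K, V> node = index.get(key);
        if (node != null) {             // existing key: update value, refresh recency
            node.value = value;
            moveToFront(node);
            return;
        }
        node = new Node<>(key, value);
        index.put(key, node);
        addToFront(node);
        if (index.size() > capacity) {
            Node<K, V> eldest = tail.prev;   // least recently used entry
            unlink(eldest);
            index.remove(eldest.key);
        }
    }

    private void moveToFront(Node<K, V> node) {
        unlink(node);
        addToFront(node);
    }

    private void addToFront(Node<K, V> node) {
        node.next = head.next;
        node.prev = head;
        head.next.prev = node;
        head.next = node;
    }

    private void unlink(Node<K, V> node) {
        node.prev.next = node.next;
        node.next.prev = node.prev;
    }
}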

LinkedHashMap maintains the order in which elements are inserted, and it can optionally maintain the order in which they are accessed if it is constructed using the constructor whose accessOrder parameter is set to true.

LinkedHashMap will also update the value of an existing key on put, and in access-order mode that put counts as an access, moving the entry to the most recently used position.


    /**
     * Constructs an empty LinkedHashMap instance with the
     * specified initial capacity, load factor and ordering mode.
     *
     * @param  initialCapacity the initial capacity
     * @param  loadFactor      the load factor
     * @param  accessOrder     the ordering mode - true for
     *         access-order, false for insertion-order
     * @throws IllegalArgumentException if the initial capacity is negative
     *         or the load factor is nonpositive
     */
    public LinkedHashMap(int initialCapacity,
                         float loadFactor,
                         boolean accessOrder)
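
As a quick illustration (a small sketch I added, with an arbitrary class name AccessOrderDemo), constructing the map with accessOrder set to true reorders entries on every get and on every put of an existing key, so iteration goes from least recently accessed to most recently accessed:

import java.util.LinkedHashMap;
import java.util.Map;

public class AccessOrderDemo {
    public static void main(String[] args) {
        // accessOrder = true: iteration order is least recently accessed first
        Map<String, Integer> map = new LinkedHashMap<>(16, 0.75f, true);
        map.put("a", 1);
        map.put("b", 2);
        map.put("c", 3);

        map.get("a");        // accessing "a" moves it to the most recently used end
        map.put("b", 20);    // re-putting an existing key also counts as an access

        // Prints {c=3, a=1, b=20}: "c" is now the eldest (least recently used) entry
        System.out.println(map);
    }
}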

To evict the eldest entry when the cache grows beyond its capacity, we override the removeEldestEntry method and check the cache size.

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Remove the eldest element whenever size of cache exceeds the capacity
        return (size() > this.capacity);
    }

Complete working example of an LRU cache using LinkedHashMap

import java.util.LinkedHashMap;
import java.util.Map;

public class LRUCache<K, V> extends LinkedHashMap<K, V> {

    private final int capacity;

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Remove the eldest element whenever size of cache exceeds the capacity
        return (size() > this.capacity);
    }

    public LRUCache(int capacity) {
        // LinkedHashMap is a core Java class; constructing it with accessOrder set
        // to true makes it maintain entries in the order they were accessed
        super(capacity + 1, 1.0f, true);
        this.capacity = capacity;
    }

    @Override
    public V get(Object key) {
        // Delegates to LinkedHashMap; with accessOrder true, a get moves the
        // entry to the most recently used position
        return super.get(key);
    }

    @Override
    public V put(K key, V value) {
        // Delegates to LinkedHashMap; removeEldestEntry handles eviction
        return super.put(key, value);
    }

    public static void main(String[] args) {
        // Create the cache with capacity 2
        LRUCache<Integer, String> cache = new LRUCache<>(2);

        cache.put(2, "One");
        cache.put(3, "Two");
        cache.put(4, "Three");

        // The entry with key 2 was evicted when key 4 was added, so this prints null
        System.out.println(cache.get(2));

        // Prints "Two" and moves the entry with key 3 to the most recently used
        // position. After this point, the entry with key 4 is the least recently used
        System.out.println(cache.get(3));

        // Adds an entry with key 5 and value "Four". Since the cache can hold only
        // two entries, the entry with key 4 is evicted as the least recently used
        cache.put(5, "Four");

        // The entry with key 4 was evicted, so this prints null
        System.out.println(cache.get(4));
    }
}
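
Running the main method above prints null, Two, and null, confirming that key 2 and then key 4 were evicted. One caveat worth adding (my note, not from the original post): LinkedHashMap is not thread-safe, so if the cache is shared between threads it should be wrapped, for example with Collections.synchronizedMap. The demo class name below is arbitrary and assumes the LRUCache class from the example above is on the classpath.

import java.util.Collections;
import java.util.Map;

public class SynchronizedLRUCacheDemo {
    public static void main(String[] args) {
        // Wrap the cache so that individual get/put calls are synchronized.
        // Iteration still requires manual synchronization on the wrapper.
        Map<Integer, String> cache =
                Collections.synchronizedMap(new LRUCache<Integer, String>(2));

        cache.put(1, "One");
        cache.put(2, "Two");
        System.out.println(cache.get(1)); // prints "One"
    }
}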
