
A cache eviction algorithm that discards the least recently used items first to make space for new items.

"An LRU cache is useful for efficiently managing memory in applications where only the most recently accessed data is likely to be reused."

@openai