feat: add prompt lru cache #65

Open
wants to merge 1 commit into main

Conversation

@immorez commented on Jan 28, 2025

Description

This PR introduces a generic LRU (Least Recently Used) cache implementation with utility functions to handle caching efficiently. The changes include:

  1. LRU Cache Class:

    • A generic LRUCache class that supports key-value pairs with a configurable maximum size (a minimal sketch follows this list).
    • Implements an LRU eviction policy that removes the least recently used entry once the cache exceeds the maximum size.
  2. Utility Functions:

    • createLRUCache: A factory function to create LRU cache instances.
    • Integration with the existing normalizeText and extractPromptCacheKey functions for the prompt-caching use case (see the wiring sketch after the Changes list).
  3. Fixes:

    • Resolved a type safety issue in the cache eviction logic by properly handling iterator results.
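
For reference, here is a minimal sketch of the kind of Map-backed LRUCache described above; it illustrates the approach, not the exact code in this PR. A JavaScript Map iterates in insertion order, so re-inserting an entry on get marks it as most recently used, and the first key returned by keys() is always the eviction candidate.

```typescript
// Sketch only: a Map-backed LRU cache with the methods listed above.
class LRUCache<K, V> {
  private cache = new Map<K, V>();

  constructor(private maxSize: number) {}

  get(key: K): V | undefined {
    if (!this.cache.has(key)) return undefined;
    // Re-insert the entry so it becomes the most recently used.
    const value = this.cache.get(key)!;
    this.cache.delete(key);
    this.cache.set(key, value);
    return value;
  }

  set(key: K, value: V): void {
    if (this.cache.has(key)) {
      this.cache.delete(key);
    } else if (this.cache.size >= this.maxSize) {
      // The first key in the Map is the least recently used entry.
      // keys().next() yields an IteratorResult, so check `done` before
      // using `value`; this is the kind of iterator-result handling the
      // type-safety fix refers to.
      const oldest = this.cache.keys().next();
      if (!oldest.done) {
        this.cache.delete(oldest.value);
      }
    }
    this.cache.set(key, value);
  }

  has(key: K): boolean {
    return this.cache.has(key);
  }

  clear(): void {
    this.cache.clear();
  }
}
```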

Changes

  • Added LRUCache class with get, set, clear, and has methods.
  • Added factory function createLRUCache for easy cache creation.
  • Updated the prompt cache implementation to use the new LRU cache (shown below).
  • Fixed type safety in the cache eviction logic.
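
As a rough illustration of how the pieces fit together: the factory plus the prompt-cache wiring. The signatures of the existing normalizeText and extractPromptCacheKey helpers are assumed here, and the cache size of 100 is arbitrary.

```typescript
// Hypothetical wiring for the prompt cache. normalizeText and
// extractPromptCacheKey exist in the codebase; their signatures are
// assumed here for illustration only.
declare function normalizeText(text: string): string;
declare function extractPromptCacheKey(prompt: string): string;

// Factory function for creating cache instances.
function createLRUCache<K, V>(maxSize: number): LRUCache<K, V> {
  return new LRUCache<K, V>(maxSize);
}

// Example usage: cache prompt completions keyed by a normalized prompt.
const promptCache = createLRUCache<string, string>(100);

function cacheCompletion(prompt: string, completion: string): void {
  promptCache.set(extractPromptCacheKey(normalizeText(prompt)), completion);
}

function getCachedCompletion(prompt: string): string | undefined {
  return promptCache.get(extractPromptCacheKey(normalizeText(prompt)));
}
```

Assuming normalizeText collapses incidental differences such as whitespace or casing, equivalent prompts map to the same cache entry, which is what makes the LRU cache effective for the prompt-caching use case.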
