This repository was archived by the owner on Jul 16, 2025. It is now read-only.
Merged
83 changes: 79 additions & 4 deletions README.md
@@ -53,7 +53,7 @@ use PhpLlm\LlmChain\Platform\Bridge\OpenAI\PlatformFactory;
$platform = PlatformFactory::create($_ENV['OPENAI_API_KEY']);

// Language Model: GPT (OpenAI)
$llm = new GPT(GPT::GPT_4O_MINI);
$llm = new GPT(GPT::GPT_4O_MINI);

// Embeddings Model: Embeddings (OpenAI)
$embeddings = new Embeddings();
@@ -268,7 +268,7 @@ final class MyTool
public function __invoke(
#[With(pattern: '/([a-z0-1]){5}/')]
string $name,
#[With(minimum: 0, maximum: 10)]
#[With(minimum: 0, maximum: 10)]
int $number,
): string {
// ...
@@ -501,7 +501,7 @@ $response = $chain->call($messages);
* [MongoDB Atlas Search](https://mongodb.com/products/platform/atlas-vector-search) (requires `mongodb/mongodb` as additional dependency)
* [Pinecone](https://pinecone.io) (requires `probots-io/pinecone-php` as additional dependency)

See [issue #28](https://github.com/php-llm/llm-chain/issues/28) for planned support of other models and platforms.
See [issue #28](https://github.com/php-llm/llm-chain/issues/28) for planned support of other models and platforms.

## Advanced Usage & Features

@@ -749,7 +749,7 @@ final class MyProcessor implements InputProcessorInterface
$options = $input->getOptions();
$options['foo'] = 'bar';
$input->setOptions($options);

// mutate MessageBag
$input->messages->append(new AssistantMessage(sprintf('Please answer using the locale %s', $this->locale)));
}
@@ -801,6 +801,81 @@ final class MyProcessor implements OutputProcessorInterface, ChainAwareInterface
}
```


## Memory

LLM Chain supports adding contextual memory to your conversations, which allows the model to recall past interactions or relevant information from different sources. Memory providers inject information into the system prompt, providing the model with context without changing your application logic.

### Using Memory

Memory integration is handled through the `MemoryInputProcessor` and one or more `MemoryProviderInterface` implementations. Here's how to set it up:

```php
<?php
use PhpLlm\LlmChain\Chain\Chain;
use PhpLlm\LlmChain\Chain\Memory\MemoryInputProcessor;
use PhpLlm\LlmChain\Chain\Memory\StaticMemoryProvider;

// Platform & LLM instantiation

$personalFacts = new StaticMemoryProvider(
    'My name is Wilhelm Tell',
    'I wish to be a swiss national hero',
    'I am struggling with hitting apples but want to be professional with the bow and arrow',
);
$memoryProcessor = new MemoryInputProcessor($personalFacts);

$chain = new Chain($platform, $model, [$memoryProcessor]);
$messages = new MessageBag(Message::ofUser('What do we do today?'));
$response = $chain->call($messages);
```
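Besides the built-in providers described below, custom memory sources can be attached by implementing `MemoryProviderInterface`, which exposes a single `loadMemory()` method returning a list of `Memory` value objects. A minimal sketch (the provider name and memory content are illustrative, not part of the library):

```php
<?php
use PhpLlm\LlmChain\Chain\Input;
use PhpLlm\LlmChain\Chain\Memory\Memory;
use PhpLlm\LlmChain\Chain\Memory\MemoryProviderInterface;

// Hypothetical provider that always reminds the model of the current date
final readonly class CurrentDateMemoryProvider implements MemoryProviderInterface
{
    public function loadMemory(Input $input): array
    {
        return [new Memory('## Current date'.\PHP_EOL.date('Y-m-d'))];
    }
}
```

The `MemoryInputProcessor` concatenates the content of all returned `Memory` objects and appends it to the system prompt.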

### Memory Providers

The library includes some implementations that are usable out of the box.

#### Static Memory

The static memory provider can be used to supply static information, for example user settings, basic knowledge about your application,
or anything else that should always be remembered, without having to add it to the system prompt yourself.

```php
use PhpLlm\LlmChain\Chain\Memory\StaticMemoryProvider;

$staticMemory = new StaticMemoryProvider(
    'The user is allergic to nuts',
    'The user prefers brief explanations',
);
```

#### Embedding Provider

This provider vectorizes the current user message and uses it to inject matching knowledge from an embedding store. That can be general knowledge that was stored there and fits the user's input without the need for tools, or past conversation pieces that should be recalled for the current message bag.

```php
use PhpLlm\LlmChain\Chain\Memory\EmbeddingProvider;

$embeddingsMemory = new EmbeddingProvider(
    $platform,
    $embeddings, // your embeddings model used to vectorize the user message
    $store, // your vector store to query for fitting context
);
```
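Since `MemoryInputProcessor` accepts a variadic list of providers, static and embedding-based memory can be combined; each provider's content is appended to the system prompt. A sketch, assuming the `$staticMemory` and `$embeddingsMemory` instances from the snippets above and the `$platform`/`$model` setup from the installation section:

```php
use PhpLlm\LlmChain\Chain\Chain;
use PhpLlm\LlmChain\Chain\Memory\MemoryInputProcessor;

// Combine both providers; providers that return no memory are simply skipped
$memoryProcessor = new MemoryInputProcessor($staticMemory, $embeddingsMemory);
$chain = new Chain($platform, $model, [$memoryProcessor]);
```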

### Dynamic Memory Usage

The memory configuration is set globally for the chain. Sometimes you need to explicitly disable memory, for example when a call does not need it or falls outside the intended context. For this, the `use_memory` option is enabled by default but can be disabled per call.

```php
$response = $chain->call($messages, [
    'use_memory' => false,
]);
```


## HuggingFace

LLM Chain comes out of the box with an integration for [HuggingFace](https://huggingface.co/) which is a platform for
37 changes: 37 additions & 0 deletions examples/chat-with-memory.php
@@ -0,0 +1,37 @@
<?php

use PhpLlm\LlmChain\Chain\Chain;
use PhpLlm\LlmChain\Chain\InputProcessor\SystemPromptInputProcessor;
use PhpLlm\LlmChain\Chain\Memory\MemoryInputProcessor;
use PhpLlm\LlmChain\Chain\Memory\StaticMemoryProvider;
use PhpLlm\LlmChain\Platform\Bridge\OpenAI\GPT;
use PhpLlm\LlmChain\Platform\Bridge\OpenAI\PlatformFactory;
use PhpLlm\LlmChain\Platform\Message\Message;
use PhpLlm\LlmChain\Platform\Message\MessageBag;
use Symfony\Component\Dotenv\Dotenv;

require_once dirname(__DIR__).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__).'/.env');

if (!$_ENV['OPENAI_API_KEY']) {
    echo 'Please set the OPENAI_API_KEY environment variable.'.\PHP_EOL;
    exit(1);
}

$platform = PlatformFactory::create($_ENV['OPENAI_API_KEY']);
$model = new GPT(GPT::GPT_4O_MINI);

$systemPromptProcessor = new SystemPromptInputProcessor('You are a professional trainer who gives short, personalized advice and a motivating claim.');

$personalFacts = new StaticMemoryProvider(
    'My name is Wilhelm Tell',
    'I wish to be a swiss national hero',
    'I am struggling with hitting apples but want to be professional with the bow and arrow',
);
$memoryProcessor = new MemoryInputProcessor($personalFacts);

$chain = new Chain($platform, $model, [$systemPromptProcessor, $memoryProcessor]);
$messages = new MessageBag(Message::ofUser('What do we do today?'));
$response = $chain->call($messages);

echo $response->getContent().\PHP_EOL;
71 changes: 71 additions & 0 deletions examples/store/mariadb-chat-memory.php
@@ -0,0 +1,71 @@
<?php

use Doctrine\DBAL\DriverManager;
use Doctrine\DBAL\Tools\DsnParser;
use PhpLlm\LlmChain\Chain\Chain;
use PhpLlm\LlmChain\Chain\Memory\EmbeddingProvider;
use PhpLlm\LlmChain\Chain\Memory\MemoryInputProcessor;
use PhpLlm\LlmChain\Platform\Bridge\OpenAI\Embeddings;
use PhpLlm\LlmChain\Platform\Bridge\OpenAI\GPT;
use PhpLlm\LlmChain\Platform\Bridge\OpenAI\PlatformFactory;
use PhpLlm\LlmChain\Platform\Message\Message;
use PhpLlm\LlmChain\Platform\Message\MessageBag;
use PhpLlm\LlmChain\Store\Bridge\MariaDB\Store;
use PhpLlm\LlmChain\Store\Document\Metadata;
use PhpLlm\LlmChain\Store\Document\TextDocument;
use PhpLlm\LlmChain\Store\Document\Vectorizer;
use PhpLlm\LlmChain\Store\Indexer;
use Symfony\Component\Dotenv\Dotenv;
use Symfony\Component\Uid\Uuid;

require_once dirname(__DIR__, 2).'/vendor/autoload.php';
(new Dotenv())->loadEnv(dirname(__DIR__, 2).'/.env');

if (!$_ENV['OPENAI_API_KEY'] || !$_ENV['MARIADB_URI']) {
    echo 'Please set OPENAI_API_KEY and MARIADB_URI environment variables.'.\PHP_EOL;
    exit(1);
}

// initialize the store
$store = Store::fromDbal(
    connection: DriverManager::getConnection((new DsnParser())->parse($_ENV['MARIADB_URI'])),
    tableName: 'my_table',
    indexName: 'my_index',
    vectorFieldName: 'embedding',
);

// our data
$pastConversationPieces = [
    ['role' => 'user', 'timestamp' => '2024-12-14 12:00:00', 'content' => 'My friends John and Emma are friends, too, are there hints why?'],
    ['role' => 'assistant', 'timestamp' => '2024-12-14 12:00:01', 'content' => 'Based on the found documents I would expect they are friends since childhood, this can give a deep bond!'],
    ['role' => 'user', 'timestamp' => '2024-12-14 12:02:02', 'content' => 'Yeah but how did this bond form? I know John was once there with a wound dressing as Emma fell, could this be a hint?'],
    ['role' => 'assistant', 'timestamp' => '2024-12-14 12:02:03', 'content' => 'Yes, this could be a hint that they have been through difficult times together, which can strengthen their bond.'],
];

// create embeddings and documents
$documents = [];
foreach ($pastConversationPieces as $message) {
    $documents[] = new TextDocument(
        id: Uuid::v4(),
        content: 'Role: '.$message['role'].\PHP_EOL.'Timestamp: '.$message['timestamp'].\PHP_EOL.'Message: '.$message['content'],
        metadata: new Metadata($message),
    );
}

// initialize the table
$store->initialize();

// create embeddings for documents as preparation of the chain memory
$platform = PlatformFactory::create($_ENV['OPENAI_API_KEY']);
$vectorizer = new Vectorizer($platform, $embeddings = new Embeddings());
$indexer = new Indexer($vectorizer, $store);
$indexer->index($documents);

// Execute a chat call that is utilizing the memory
$embeddingsMemory = new EmbeddingProvider($platform, $embeddings, $store);
$memoryProcessor = new MemoryInputProcessor($embeddingsMemory);

$chain = new Chain($platform, new GPT(GPT::GPT_4O_MINI), [$memoryProcessor]);
$messages = new MessageBag(Message::ofUser('Have we discussed about my friend John in the past? If yes, what did we talk about?'));
$response = $chain->call($messages);

echo $response->getContent().\PHP_EOL;
63 changes: 63 additions & 0 deletions src/Chain/Memory/EmbeddingProvider.php
@@ -0,0 +1,63 @@
<?php

declare(strict_types=1);

namespace PhpLlm\LlmChain\Chain\Memory;

use PhpLlm\LlmChain\Chain\Input;
use PhpLlm\LlmChain\Platform\Message\Content\ContentInterface;
use PhpLlm\LlmChain\Platform\Message\Content\Text;
use PhpLlm\LlmChain\Platform\Message\MessageInterface;
use PhpLlm\LlmChain\Platform\Message\UserMessage;
use PhpLlm\LlmChain\Platform\Model;
use PhpLlm\LlmChain\Platform\PlatformInterface;
use PhpLlm\LlmChain\Store\VectorStoreInterface;

/**
* @author Denis Zunke <[email protected]>
*/
final readonly class EmbeddingProvider implements MemoryProviderInterface
{
    public function __construct(
        private PlatformInterface $platform,
        private Model $model,
        private VectorStoreInterface $vectorStore,
    ) {
    }

    public function loadMemory(Input $input): array
    {
        $messages = $input->messages->getMessages();
        /** @var MessageInterface|null $userMessage */
        $userMessage = $messages[array_key_last($messages)] ?? null;

        if (!$userMessage instanceof UserMessage) {
            return [];
        }

        $userMessageTextContent = array_filter(
            $userMessage->content,
            static fn (ContentInterface $content): bool => $content instanceof Text,
        );

        if (0 === \count($userMessageTextContent)) {
            return [];
        }

        $userMessageTextContent = array_shift($userMessageTextContent);
        \assert($userMessageTextContent instanceof Text);

        $vectors = $this->platform->request($this->model, $userMessageTextContent->text)->asVectors();
        $foundEmbeddingContent = $this->vectorStore->query($vectors[0]);
        if (0 === \count($foundEmbeddingContent)) {
            return [];
        }

        $content = '## Dynamic memories fitting user message'.\PHP_EOL.\PHP_EOL;
        foreach ($foundEmbeddingContent as $document) {
            $content .= json_encode($document->metadata);
        }

        return [new Memory($content)];
    }
}
15 changes: 15 additions & 0 deletions src/Chain/Memory/Memory.php
@@ -0,0 +1,15 @@
<?php

declare(strict_types=1);

namespace PhpLlm\LlmChain\Chain\Memory;

/**
* @author Denis Zunke <[email protected]>
*/
final readonly class Memory
{
    public function __construct(public string $content)
    {
    }
}
76 changes: 76 additions & 0 deletions src/Chain/Memory/MemoryInputProcessor.php
@@ -0,0 +1,76 @@
<?php

declare(strict_types=1);

namespace PhpLlm\LlmChain\Chain\Memory;

use PhpLlm\LlmChain\Chain\Input;
use PhpLlm\LlmChain\Chain\InputProcessorInterface;
use PhpLlm\LlmChain\Platform\Message\Message;

/**
* @author Denis Zunke <[email protected]>
*/
final readonly class MemoryInputProcessor implements InputProcessorInterface
{
    private const MEMORY_PROMPT_MESSAGE = <<<MARKDOWN
# Conversation Memory
This is the memory I have found for this conversation. The memory has more weight to answer user input,
so try to answer utilizing the memory as much as possible. Your answer must be changed to fit the given
memory. If the memory is irrelevant, ignore it. Do not reply to this section of the prompt and do not
reference it as this is just for your reference.
MARKDOWN;

    /**
     * @var MemoryProviderInterface[]
     */
    private array $memoryProviders;

    public function __construct(
        MemoryProviderInterface ...$memoryProviders,
    ) {
        $this->memoryProviders = $memoryProviders;
    }

    public function processInput(Input $input): void
    {
        $options = $input->getOptions();
        $useMemory = $options['use_memory'] ?? true;
        unset($options['use_memory']);
        $input->setOptions($options);

        if (false === $useMemory || 0 === \count($this->memoryProviders)) {
            return;
        }

        $memory = '';
        foreach ($this->memoryProviders as $provider) {
            $memoryMessages = $provider->loadMemory($input);

            if (0 === \count($memoryMessages)) {
                continue;
            }

            $memory .= \PHP_EOL.\PHP_EOL;
            $memory .= implode(
                \PHP_EOL,
                array_map(static fn (Memory $memory): string => $memory->content, $memoryMessages),
            );
        }

        if ('' === $memory) {
            return;
        }

        $systemMessage = $input->messages->getSystemMessage()->content ?? '';
        if ('' !== $systemMessage) {
            $systemMessage .= \PHP_EOL.\PHP_EOL;
        }

        $messages = $input->messages
            ->withoutSystemMessage()
            ->prepend(Message::forSystem($systemMessage.self::MEMORY_PROMPT_MESSAGE.$memory));

        $input->messages = $messages;
    }
}