Description
Summary
Messages generated by Ollama are appended to the wrong conversation when the user switches conversations while a response is still being generated.
Environment
- Application: Enchanted
- Platform: macOS
- Component: ConversationStore.swift
- Class: ConversationStore
Description
When a user initiates a conversation with Ollama and switches to a different conversation while the AI is still generating a response, the remaining generated content is appended to the newly selected conversation instead of the original one. This pollutes messages across conversations and corrupts the chat history.
Steps to Reproduce
- Open Enchanted application
- Start a new conversation (Conversation A)
- Send a prompt that will generate a long response (e.g., "Write a detailed essay about...")
- While the response is still being generated, switch to a different conversation (Conversation B)
- Observe that the ongoing response from Conversation A is appended to the last message in Conversation B
Expected Behavior
- The generated response should only be appended to the conversation where it was initiated
- Switching conversations during generation should either:
  - Stop the current generation, or
  - Continue generating in the background and update only the original conversation
Actual Behavior
- Generated response content is appended to whatever conversation is currently selected
- Message history becomes corrupted with responses appearing in the wrong conversations
- No warning or indication to the user that messages are being mixed
Technical Analysis
The issue stems from the ConversationStore class not tracking which conversation initiated the generation. The handleReceive method blindly appends content to the last message in the current messages array, and that array is replaced whenever the user switches conversations, so the append lands in whichever conversation is currently selected.
Relevant Code
@MainActor
private func handleReceive(_ response: OKChatResponse) {
    if messages.isEmpty { return }
    if let responseContent = response.message?.content {
        currentMessageBuffer = currentMessageBuffer + responseContent
        throttler.throttle { [weak self] in
            guard let self = self else { return }
            let lastIndex = self.messages.count - 1
            self.messages[lastIndex].content.append(currentMessageBuffer) // Issue: No conversation check
            currentMessageBuffer = ""
        }
    }
}
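For context, the sketch below illustrates why the append target changes mid-stream. The selectConversation handler and the way messages is reloaded are assumptions for illustration, not the actual Enchanted code.

@MainActor
final class ConversationSwitchSketch {
    // Backs the visible chat; handleReceive appends streamed chunks to its last element.
    var messages: [String] = ["partial response for Conversation A"]

    // Assumed shape of a conversation-switch handler.
    func selectConversation(loading newMessages: [String]) {
        // The whole array is swapped out when another conversation is selected, so
        // messages.count - 1 now indexes Conversation B's last message while the
        // stream started in Conversation A is still delivering chunks.
        messages = newMessages
    }
}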
Impact
- Data Integrity: Chat histories become mixed and unreliable
- User Experience: Users may not notice their conversations are being mixed
- Data Persistence: Corrupted conversation histories are saved to the database
Proposed Fix
Add conversation tracking (a sketch follows the list below):
- Implement an activeConversationId property to track the generating conversation
- Add conversation ID verification before appending messages
- Implement proper cleanup when switching conversations
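A minimal sketch of this approach is below, assuming messages always mirrors the currently selected conversation and that conversations carry a UUID id. The Conversation and Message stubs, and every name except handleReceive, are illustrative assumptions rather than the existing Enchanted API.

import Foundation

struct Conversation { let id: UUID }
struct Message { var content: String }

@MainActor
final class ConversationStoreSketch {
    var messages: [Message] = []                 // messages of the currently selected conversation
    var selectedConversation: Conversation?
    private var generatingConversationID: UUID?  // proposed: which conversation started the stream

    func startGeneration(in conversation: Conversation) {
        // Remember where the streamed response belongs before any chunks arrive.
        generatingConversationID = conversation.id
    }

    func selectConversation(_ conversation: Conversation) {
        selectedConversation = conversation
        messages = []                            // would reload this conversation's messages; omitted here
        // generatingConversationID is intentionally left set, so the guard below keeps
        // rejecting chunks that belong to the previously selected conversation.
    }

    func handleReceive(chunk: String) {
        // Verify the stream still targets the on-screen conversation before appending;
        // otherwise drop the chunk (or route it to the original conversation's stored
        // messages to keep generating in the background).
        guard let generatingID = generatingConversationID,
              generatingID == selectedConversation?.id,
              let lastIndex = messages.indices.last else { return }
        messages[lastIndex].content.append(chunk)
    }

    func finishGeneration() {
        generatingConversationID = nil           // cleanup once the stream completes
    }
}

With this guard in place, switching to Conversation B simply stops updates to the visible message list; to satisfy the second option under Expected Behavior, the dropped chunks could instead be written to the originating conversation's persisted messages.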