
Conversation

ashwinb (Contributor) commented Nov 10, 2025

Fixes issues in the storage system by guaranteeing immediate durability for responses and ensuring background writers stay alive. Three related fixes:

  • Responses to the OpenAI-compatible API now write directly to Postgres/SQLite inside the request instead of detouring through an async queue that might never drain; this restores the expected read-after-write behavior and removes the "response not found" races reported by users (see the first sketch after this list).

  • The access-control shim was stamping owner_principal/access_attributes as SQL NULL, which Postgres interprets as non-public rows; fixing it to use the empty-string/JSON-null pattern means conversations and responses stored without an authenticated user stay queryable, matching SQLite (see the second sketch below).

  • The inference-store queue remains for batching, but its worker tasks now start lazily on the live event loop so server startup doesn't cancel them; writes keep flowing even when the stack is launched via llama stack run (see the third sketch below).
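
A minimal sketch of the first change, with illustrative names (InMemoryResponsesStore and create_response are stand-ins, not the project's actual API): the response row is persisted inside the request handler itself, so a read issued immediately afterward always finds it.

```python
# Hedged sketch (illustrative names, not the actual responses-store API):
# persist the response inside the request handler so a follow-up read
# sees it immediately.
import asyncio


class InMemoryResponsesStore:
    """Stand-in for the Postgres/SQLite-backed responses store."""

    def __init__(self) -> None:
        self._rows: dict[str, dict] = {}

    async def insert(self, response: dict) -> None:
        self._rows[response["id"]] = response

    async def get(self, response_id: str) -> dict | None:
        return self._rows.get(response_id)


async def create_response(request: dict, store: InMemoryResponsesStore) -> dict:
    response = {"id": request["id"], "output": "..."}
    # Write synchronously within the request instead of enqueueing it;
    # the row is durable before the handler returns.
    await store.insert(response)
    return response


async def main() -> None:
    store = InMemoryResponsesStore()
    await create_response({"id": "resp_1"}, store)
    # Read-after-write: the response is retrievable as soon as the call returns.
    assert await store.get("resp_1") is not None


asyncio.run(main())
```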

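The second bullet comes down to SQL NULL semantics: NULL never satisfies an equality predicate, so a row stamped with a NULL owner_principal falls out of any "public or owned by me" filter. A toy illustration (not the real schema) of why the empty-string sentinel stays visible:

```python
# Toy illustration (not the project's actual tables): NULL never matches an
# equality predicate, so a NULL-stamped row is invisible to a public-access
# filter, while the empty-string sentinel stays queryable.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE responses (id TEXT, owner_principal TEXT)")
conn.execute("INSERT INTO responses VALUES ('resp-null', NULL)")  # old behavior
conn.execute("INSERT INTO responses VALUES ('resp-empty', '')")   # fixed behavior

rows = conn.execute(
    "SELECT id FROM responses"
    " WHERE owner_principal = '' OR owner_principal = ?",
    ("alice",),
).fetchall()
print(rows)  # [('resp-empty',)] -- the NULL row never satisfies the filter
```
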
Closes #4115
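
For the third bullet, a hedged sketch of the lazy-start pattern; BatchingWriter and its methods are illustrative names, not the actual inference-store code. The drain task is created from inside the loop that is serving requests, so loop churn during server startup cannot cancel or orphan it.

```python
# Hedged sketch of starting the background writer lazily on the live event
# loop; names are illustrative, not the actual inference-store API.
import asyncio


class BatchingWriter:
    def __init__(self) -> None:
        self._queue: asyncio.Queue = asyncio.Queue()
        self._worker: asyncio.Task | None = None

    async def enqueue(self, row: dict) -> None:
        # Creating the task here binds it to the loop that is actually serving
        # requests, so startup-time loop churn cannot cancel or orphan it.
        if self._worker is None or self._worker.done():
            self._worker = asyncio.create_task(self._drain())
        await self._queue.put(row)

    async def _drain(self) -> None:
        while True:
            row = await self._queue.get()
            # ... flush the batched rows to the store here ...
            self._queue.task_done()
```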

Test Plan

Added a matrix entry to test our "base" suite against Postgres as the store.

meta-cla bot added the CLA Signed label (managed by the Meta Open Source bot) Nov 10, 2025
ashwinb (Contributor, Author) commented on this excerpt of the distribution config:

responses_store=postgres_config,
),
config=MetaReferenceAgentsImplConfig(
persistence=AgentPersistenceConfig(

Important bug fix, unrelated to this one, but the Postgres definition here was all kinds of wrong.

ashwinb merged commit 492f79c into main Nov 12, 2025
58 checks passed
ashwinb deleted the storage_fix branch November 12, 2025 18:35
ashwinb (Contributor, Author) commented Nov 12, 2025

@Mergifyio backport release-0.3.x

mergify bot commented Nov 12, 2025

backport release-0.3.x

✅ Backports have been created

mergify bot pushed a commit that referenced this pull request Nov 12, 2025
Fixes issues in the storage system by guaranteeing immediate durability
for responses and ensuring background writers stay alive. Three related
fixes:

* Responses to the OpenAI-compatible API now write directly to
Postgres/SQLite inside the request instead of detouring through an async
queue that might never drain; this restores the expected
read-after-write behavior and removes the "response not found" races
reported by users.

* The access-control shim was stamping owner_principal/access_attributes
as SQL NULL, which Postgres interprets as non-public rows; fixing it to
use the empty-string/JSON-null pattern means conversations and responses
stored without an authenticated user stay queryable (matching SQLite).

* The inference-store queue remains for batching, but its worker tasks
now start lazily on the live event loop so server startup doesn't cancel
them—writes keep flowing even when the stack is launched via llama stack
run.

Closes #4115

### Test Plan

Added a matrix entry to test our "base" suite against Postgres as the
store.

(cherry picked from commit 492f79c)

# Conflicts:
#	.github/workflows/integration-tests.yml
#	llama_stack/distributions/ci-tests/run-with-postgres-store.yaml
#	llama_stack/distributions/starter-gpu/run.yaml
#	llama_stack/distributions/starter/run.yaml
#	llama_stack/distributions/starter/starter.py
#	llama_stack/providers/utils/inference/inference_store.py
#	llama_stack/providers/utils/responses/responses_store.py
#	tests/integration/ci_matrix.json
Linked issue: OpenAI Responses and Conversations not persisting with Postgres SQL Stores