
First release #1

Merged
merged 10 commits into main from peter/init
Jun 7, 2024
Conversation

@kongzii (Collaborator) commented Jun 4, 2024

An interactive docs page is available at https://labs-api.ai.gnosisdev.com/docs.

Results are cached in our Postgres for 3 days.

[Screenshot: Dropbox Capture]

Comment on lines +17 to +18
```python
if (cached := cache.find(market_id)) is not None:
    return cached
```
Contributor commented:

I was wondering if the fastapi-cache package would make sense here?
You could have logic for caching markets for 3 days (otherwise you fetch a new one, save it in the DB, and serve that).
This has the advantage of preventing DB lookups for every request.

Collaborator (author) commented:

Theoretically, but they say it works "with backends supporting Redis, Memcached, and Amazon DynamoDB", and we don't run any of those. I was thinking about using Redis, but then I didn't want to complicate it.

> This has the advantage of preventing DB lookups for every request.

I don't follow: how do you prevent lookups for new requests? For each one we need to check whether we have a recent summary or not.

Contributor commented:

> Theoretically, but they say with backends supporting Redis, Memcached, and Amazon DynamoDB, we don't have either of them. I was thinking about using Redis, but then I didn't want to complicate it.

They also support in-memory, which should be good enough (link).

> I don't follow, how do you prevent lookups for new requests? For each one we need to check if we have a recent summary or not.

What I meant was: assuming you have an in-memory cache, checking an entry by key (e.g. market_id) is much cheaper than the flight time app <> DB, which should only happen if the cache entry is invalid (for example, after 3 days, when we should update the market in the DB). Does that make sense?

Collaborator (author) commented:

I see; for in-memory, you don't even need fastapi-cache, there is a built-in decorator that works nicely: https://docs.python.org/3/library/functools.html#functools.cache

However, the time spent checking the DB for a hit is negligible compared to calling Tavily/LLM, and an in-memory cache at the app level isn't a good idea.

One problem is growing memory; the second is that a miss in the in-memory cache doesn't mean there won't be a hit in the DB cache (for example, due to a pod randomly restarting, a new-release restart, or because we could theoretically run multiple pods to handle traffic in the future), so checking the DB would be required anyway.
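As a side note on the built-in decorator mentioned above: `functools.lru_cache` with a `maxsize` caps the memory-growth concern, though cached entries are still lost on every restart, which supports the point that the DB check is needed either way. A minimal sketch (the `summarize` function is a hypothetical stand-in for the Tavily/LLM call):

```python
# functools provides a bounded in-process cache out of the box.
from functools import lru_cache

@lru_cache(maxsize=1024)  # evicts least-recently-used entries beyond 1024
def summarize(market_id: str) -> str:
    # stand-in for the expensive Tavily/LLM call
    return f"summary for {market_id}"

summarize("0x1")
summarize("0x1")
print(summarize.cache_info().hits)  # prints 1: the second call hit the cache
```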

Comment on lines +15 to +21
```python
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
```
Contributor commented:

Pretty sure you acknowledged this somewhere: do we have an issue for adding auth? Otherwise our Tavily credits will suffer.

Collaborator (author) commented:

It's cached, and the only way to call the endpoint is with a market_id from Omen; anything else will fail before Tavily is used. It seemed to me there wasn't a way for someone to abuse this API. Do you see a use case?

Contributor commented:

Scripting a way to fetch all Omen markets and pinging the endpoint for each is the only abuse I can think of.
But yes, it's not very critical.

Collaborator (author) commented:

I guess that's the worst-case scenario, where there are many, many people on Omen and they all look at all markets.

@kongzii kongzii merged commit 453aa28 into main Jun 7, 2024
6 checks passed
@kongzii kongzii deleted the peter/init branch June 7, 2024 06:33