First release #1
Conversation
```python
if (cached := cache.find(market_id)) is not None:
    return cached
```
I was wondering whether the fastapi-cache package would make sense here?
You could add logic to cache markets for 3 days (otherwise fetch a fresh one, save it in the DB, and serve that).
This has the advantage of preventing DB lookups for every request.
Theoretically, but they say it works "with backends supporting Redis, Memcached, and Amazon DynamoDB", and we don't have any of those. I was thinking about using Redis, but I didn't want to complicate things.

> This has the advantage of preventing DB lookups for every request.

I don't follow: how do you prevent lookups for new requests? For each one we need to check whether we have a recent summary or not.
> Theoretically, but they say it works "with backends supporting Redis, Memcached, and Amazon DynamoDB", and we don't have any of those. I was thinking about using Redis, but I didn't want to complicate things.

They also support an in-memory backend, which should be good enough (link).

> I don't follow: how do you prevent lookups for new requests? For each one we need to check whether we have a recent summary or not.

What I meant was: assuming you have an in-memory cache, checking an entry by key (e.g. market_id) is much cheaper than the app <> DB round trip, which should only happen when the cache entry is invalid (for example, after 3 days we should update the market in the DB). Does that make sense?
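The scheme described in this comment can be sketched with a plain dict and timestamps. This is a minimal illustration, not the project's actual cache; the `TTLCache` class and the 3-day TTL wiring are assumptions based on the discussion:

```python
import time


class TTLCache:
    """Minimal in-memory cache with per-entry expiry, as described above."""

    def __init__(self, ttl_seconds: float) -> None:
        self.ttl = ttl_seconds
        self._entries: dict[str, tuple[float, object]] = {}

    def get(self, key: str):
        """Return the cached value, or None if absent or older than the TTL."""
        entry = self._entries.get(key)
        if entry is None:
            return None
        stored_at, value = entry
        if time.monotonic() - stored_at > self.ttl:
            del self._entries[key]  # stale entry: caller should refetch from the DB
            return None
        return value

    def put(self, key: str, value) -> None:
        self._entries[key] = (time.monotonic(), value)


# Checking the cache by key is a dict lookup, far cheaper than an
# app <> DB round trip; the DB is only hit when get() returns None.
cache = TTLCache(ttl_seconds=3 * 24 * 3600)  # 3 days, as suggested above
```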
I see; for in-memory caching you don't even need fastapi-cache, there is a built-in decorator that works nicely: https://docs.python.org/3/library/functools.html#functools.cache
However, the time spent checking the DB for a hit is negligible compared to calling Tavily/LLM, and an in-memory cache at the app level isn't a good idea here.
One problem is growing memory; the other is that a miss in the in-memory cache doesn't imply a miss in the DB cache (for example after a pod randomly restarts, after a new-release restart, or, theoretically, if we run multiple pods to handle traffic in the future), so checking the DB would be required anyway.
```python
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)
```
Pretty sure you acknowledged this somewhere; do we have an issue for adding auth? Otherwise our Tavily credits will suffer.
It's cached, and the only way to call the endpoint is with a market_id from Omen; anything else fails before Tavily is used. It seemed to me there wasn't a way for someone to abuse this API. Do you see a use case?
Scripting a fetch of all Omen markets and pinging the endpoint for each one is the only abuse I can think of.
But yes, it's not very critical.
I guess that's the worst-case scenario: many, many people on Omen, all of them able to look at all markets.
An interactive docs page is available at https://labs-api.ai.gnosisdev.com/docs.
Results are cached in our Postgres for 3 days.
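The DB-backed cache described in this thread boils down to a freshness check on a stored timestamp. A sketch using SQLite in place of Postgres (the table name, column names, and `fetch_fresh` callable are assumptions for illustration):

```python
import sqlite3
import time

THREE_DAYS = 3 * 24 * 3600

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE market_summary ("
    "market_id TEXT PRIMARY KEY, summary TEXT, fetched_at REAL)"
)


def get_summary(market_id: str, fetch_fresh) -> str:
    """Serve a cached summary younger than 3 days, else refetch and store it."""
    row = conn.execute(
        "SELECT summary, fetched_at FROM market_summary WHERE market_id = ?",
        (market_id,),
    ).fetchone()
    now = time.time()
    if row is not None and now - row[1] <= THREE_DAYS:
        return row[0]  # recent enough: no Tavily/LLM call needed
    summary = fetch_fresh(market_id)  # expensive path (Tavily + LLM)
    conn.execute(
        "INSERT OR REPLACE INTO market_summary VALUES (?, ?, ?)",
        (market_id, summary, now),
    )
    return summary
```

The cheap SELECT runs on every request, but the expensive fetch only runs on a miss or a stale entry, which matches the trade-off argued for above.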