Discussed in #363
Originally posted by John42506176Linux September 7, 2024
Hi, ColBERT community :),

I'm currently testing out ColBERT, and I was curious how I could apply different degrees of token pooling, similar to what is described in https://www.answer.ai/posts/colbert-pooling.html. Currently, changing `pool_factor` does not change the number of embeddings produced.
Here is the code being used (colbert-ai version 0.2.20):
```python
from colbert.modeling.checkpoint import Checkpoint
from colbert.infra import ColBERTConfig

answer_ai = Checkpoint("answerdotai/answerai-colbert-small-v1", colbert_config=ColBERTConfig())
vectors = answer_ai.docFromText(documents, bsize=2, pool_factor=3, showprogress=True)[0]
```
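For reference, the clustering-based pooling described in the answer.ai post can be sketched independently of ColBERT, which makes it easy to sanity-check what embedding counts a given `pool_factor` should produce. This is a minimal sketch, assuming `numpy` and `scipy` are available; `pool_embeddings` is a hypothetical helper written for illustration, not a ColBERT API:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def pool_embeddings(emb: np.ndarray, pool_factor: int = 2) -> np.ndarray:
    """Reduce a (num_tokens, dim) embedding matrix by roughly pool_factor.

    Tokens are grouped by hierarchical (Ward) clustering on their embeddings,
    and each cluster is mean-pooled into a single vector, mirroring the
    approach described in the answer.ai token-pooling post.
    """
    num_tokens = emb.shape[0]
    if pool_factor <= 1 or num_tokens <= 1:
        return emb
    # Target number of clusters: at most num_tokens // pool_factor, at least 1.
    target = max(1, num_tokens // pool_factor)
    Z = linkage(emb, method="ward")
    labels = fcluster(Z, t=target, criterion="maxclust")
    # Mean-pool the embeddings within each cluster.
    pooled = np.stack([emb[labels == c].mean(axis=0) for c in np.unique(labels)])
    return pooled

# Example: 12 token embeddings pooled with factor 3 should yield ~4 vectors.
rng = np.random.default_rng(0)
emb = rng.normal(size=(12, 8))
pooled = pool_embeddings(emb, pool_factor=3)
print(emb.shape, "->", pooled.shape)
```

With a working `pool_factor`, `docFromText` should show a comparable reduction in the number of output embeddings per document; if the count is identical with and without `pool_factor`, the installed version may predate the pooling feature.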