Any docs on eval in usearch #1

Answered by ericfeunekes
ericfeunekes asked this question in Q&A

Done! Even something very minimal showing how the classes are intended to be used together would be awesome.

This is very timely for me because I'm planning to do some testing around chunking strategies. Basically, I have labelled, ordered chunks that should be returned from a retriever. I'm going to test each strategy by retrieving the most similar chunk (first semantically, then by edit distance using StringZilla), then use the edit distance as the error (scored with MSE) to show me which chunking strategy gets closest to what my labellers think are the right chunks.

Based on what I've seen, usearch looks perfect for this, particularly if you have evaluation built right in.
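For concreteness, here is a rough sketch of the loop I have in mind, assuming the Python `usearch` and `stringzilla` packages. The `embed` function and the labelled data are placeholders, and the exact `stringzilla.edit_distance` call is my assumption about its API:

```python
import numpy as np
from usearch.index import Index
import stringzilla as sz

def embed(text: str) -> np.ndarray:
    """Placeholder: swap in a real embedding model returning a 1-D float vector."""
    raise NotImplementedError

# Labelled data: query -> the chunk my labellers marked as the right answer.
labelled = {
    "what is the refund policy?": "Refunds are issued within 30 days of purchase.",
}

# Chunks produced by one chunking strategy under test.
chunks = [
    "Refunds are issued within 30 days of purchase.",
    "Shipping usually takes 5-7 business days.",
]

# Cosine index over the chunk embeddings (ndim must match the embedding size).
index = Index(ndim=384, metric="cos")
for key, chunk in enumerate(chunks):
    index.add(key, embed(chunk))

errors = []
for query, expected in labelled.items():
    # 1. Semantic retrieval: the single most similar chunk for the query.
    matches = index.search(embed(query), 1)
    retrieved = chunks[int(matches.keys[0])]
    # 2. Edit distance between retrieved and labelled chunk is the error
    #    (assumes stringzilla exposes an edit_distance helper).
    errors.append(sz.edit_distance(retrieved, expected))

# 3. MSE across labelled examples: lower means the chunking strategy
#    lands closer to what the labellers chose.
mse = float(np.mean(np.square(errors)))
print(f"MSE for this chunking strategy: {mse:.2f}")
```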

Answer selected by ashvardanian