
Add loss value metric based on optimal performance definition #66

Merged (40 commits into asreview:main, Oct 31, 2024)

Conversation

@jteijema (Member) commented Aug 27, 2024

A loss metric based on the distance between the perfect recall curve and the actual recall curve.

[Image: output plot]
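As an illustration of the idea, a sketch of this curve distance is below. This is a hypothetical reconstruction, not the merged implementation; the function name `curve_distance` is invented here.

```python
import numpy as np

def curve_distance(labels):
    """Area between the perfect recall curve and the actual recall curve.

    `labels` is the sequence of relevance labels (1 = relevant) in the
    order the records were screened. Hypothetical sketch of the idea in
    this PR, not the merged implementation.
    """
    labels = np.asarray(labels)
    n, n_relevant = len(labels), labels.sum()

    # Recall curve of the actual screening order (cumulative hits).
    actual = np.cumsum(labels)
    # Perfect curve: every relevant record is screened first.
    perfect = np.minimum(np.arange(1, n + 1), n_relevant)

    # Sum of the vertical gaps between the two curves.
    return float((perfect - actual).sum())

# A perfect ranking has zero distance:
# curve_distance([1, 1, 1, 0]) → 0.0
```

As the later discussion in this thread points out, this raw area depends on the dataset size and the number of relevant records, which is what motivates the normalization added further down.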

@J535D165 (Member) commented Sep 2, 2024

Failing tests seem unrelated, but do you have an idea?

@J535D165 (Member) commented Sep 2, 2024

Linter fixed in main branch.

@jteijema (Member, Author) commented Sep 4, 2024

No, but I figured it wasn't on my branch indeed. I'll take another look!

@jteijema (Member, Author) commented Sep 4, 2024

The other tests are still failing. I think tf-keras needs to be a requirement for ASReview. But more importantly, why is it importing transformers when running our tests...

@jteijema (Member, Author) commented

Tests are failing because of something fixed in #67. Consider merging #67 and then merging main into this branch to get the tests passing again.

@J535D165 (Member) commented

OK, remove the draft status of that PR.

Review thread on asreviewcontrib/insights/algorithms.py (outdated, resolved)
@J535D165 J535D165 changed the title add loss value metric Add loss value metric based on optimal performance definition Sep 26, 2024
@J535D165 (Member) commented

This PR needs a test and documentation. I'm not convinced this loss metric computes what's described in the PR description.

The following tests both fail:

# Assumed imports to make these tests runnable; the module containing
# _loss is not shown in this thread, so the import path is a guess.
from numpy.testing import assert_almost_equal
from asreviewcontrib.insights.metrics import _loss

def test_metric_loss_best():
    labels_best = [1, 1, 1, 0]
    loss = _loss(labels_best)

    assert_almost_equal(loss, 0)

def test_metric_loss_worst():
    labels_worst = [0, 0, 1, 1]
    loss = _loss(labels_worst)

    assert_almost_equal(loss, 1)

@jteijema (Member, Author) commented

> This PR needs a test and documentation. I'm not convinced this loss metric computes what's described in the PR description. [quoted tests omitted]

Yeah, you're right. The calculation was okay, but it wasn't normalized. Here's the new approach:

[Image: illustration of the normalized approach]
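A minimal sketch of such a normalized loss (hypothetical names; the merged code in asreviewcontrib/insights may differ in detail): the area between the perfect and actual recall curves is divided by the area between the perfect and worst-case curves, so the value falls in [0, 1] and both tests quoted earlier pass.

```python
import numpy as np

def loss(labels):
    """Normalized area between the perfect and actual recall curves.

    0 means all relevant records were screened first (perfect ranking);
    1 means they were all screened last (worst ranking). Hypothetical
    sketch, not the merged implementation. Assumes at least one relevant
    and one irrelevant record, otherwise the normalizer is zero.
    """
    labels = np.asarray(labels)
    n, n_relevant = len(labels), labels.sum()
    steps = np.arange(1, n + 1)

    actual = np.cumsum(labels)                       # observed recall curve
    perfect = np.minimum(steps, n_relevant)          # all relevant first
    worst = np.maximum(steps - (n - n_relevant), 0)  # all relevant last

    # Normalize by the maximum possible area (perfect vs worst).
    return float((perfect - actual).sum() / (perfect - worst).sum())

# loss([1, 1, 1, 0]) → 0.0  (best ordering)
# loss([0, 0, 1, 1]) → 1.0  (worst ordering)
```

Dividing by the perfect-vs-worst area is what makes the metric comparable across datasets of different sizes and prevalence.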

@J535D165 (Member) left a review comment

Thanks for the update. Looks like a good step forward.

I'm also interested in adding this metric to asreview/asreview. This can be nice for the CLI simulation interface. Are you interested in contributing a copy there?

Review threads (all outdated, resolved):
- asreviewcontrib/insights/algorithms.py (3)
- asreviewcontrib/insights/metrics.py (3)
- tests/test_metrics.py (4)
@jteijema (Member, Author) commented

[Image: Figure_1]

@jteijema jteijema requested a review from J535D165 October 31, 2024 12:09
@J535D165 J535D165 merged commit 5a15875 into asreview:main Oct 31, 2024
5 checks passed
@J535D165 (Member) commented

Thanks for this contribution to the performance metrics!

@J535D165 J535D165 added the enhancement New feature or request label Oct 31, 2024
@jteijema jteijema deleted the loss-metric branch October 31, 2024 14:04