🔥 Remove the Gen. Jensen-Shannon div.
o-laurent committed Aug 24, 2023
1 parent e00f34a commit 2386247
Showing 4 changed files with 2 additions and 182 deletions.
59 changes: 0 additions & 59 deletions tests/metrics/test_jensen_shannon_divergence.py

This file was deleted.

1 change: 0 additions & 1 deletion torch_uncertainty/metrics/__init__.py
@@ -3,7 +3,6 @@
 from .disagreement import Disagreement
 from .entropy import Entropy
 from .fpr95 import FPR95
-from .jensen_shannon_divergence import JensenShannonDivergence
 from .mutual_information import MutualInformation
 from .nll import GaussianNegativeLogLikelihood, NegativeLogLikelihood
 from .variation_ratio import VariationRatio
121 changes: 0 additions & 121 deletions torch_uncertainty/metrics/jensen_shannon_divergence.py

This file was deleted.

3 changes: 2 additions & 1 deletion torch_uncertainty/metrics/mutual_information.py
@@ -36,7 +36,8 @@ class MutualInformation(Metric):
     Note:
         A higher mutual information can be interpreted as a higher epistemic
-        uncertainty.
+        uncertainty. The Mutual Information is also computationally equivalent
+        to the Generalized Jensen-Shannon Divergence (GJSD).
     Warning:
         Make sure that the probabilities in :attr:`probs` are normalized to sum
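The docstring change above states that the Mutual Information of an ensemble is computationally equivalent to the Generalized Jensen-Shannon Divergence, which is what justifies deleting the separate GJSD metric. A minimal NumPy sketch of that equivalence follows; the function names are illustrative, not TorchUncertainty's API, and the equivalence shown assumes uniform mixture weights over the ensemble members.

```python
import numpy as np


def entropy(p, axis=-1):
    """Shannon entropy in nats, treating 0 * log(0) as 0."""
    p = np.asarray(p)
    safe_p = np.where(p > 0, p, 1.0)  # avoid log(0) warnings
    return -np.sum(np.where(p > 0, p * np.log(safe_p), 0.0), axis=axis)


def mutual_information(probs):
    """MI of an ensemble: H(mean prediction) - mean of member entropies.

    probs: array of shape (n_members, n_classes), each row a distribution.
    """
    probs = np.asarray(probs)
    return entropy(probs.mean(axis=0)) - entropy(probs, axis=-1).mean()


def generalized_jsd(probs, weights=None):
    """GJSD with weights w: H(sum_i w_i p_i) - sum_i w_i H(p_i)."""
    probs = np.asarray(probs)
    if weights is None:  # uniform weights recover the MI formula above
        weights = np.full(probs.shape[0], 1.0 / probs.shape[0])
    mixture = np.einsum("i,ij->j", weights, probs)
    return entropy(mixture) - np.dot(weights, entropy(probs, axis=-1))


# Three ensemble members predicting over three classes.
ensemble = np.array([
    [0.7, 0.2, 0.1],
    [0.4, 0.4, 0.2],
    [0.1, 0.6, 0.3],
])

# With uniform weights the two quantities coincide term by term.
assert np.isclose(mutual_information(ensemble), generalized_jsd(ensemble))
```

With uniform weights, the GJSD mixture is exactly the ensemble mean prediction and the weighted entropy sum is exactly the mean member entropy, so the two expressions are identical term by term; only the non-uniform-weight case adds anything beyond MI.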
