
Commit

symetrical
Borda committed Sep 26, 2023
1 parent 6e896dd commit 43b4d3a
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion src/torchmetrics/functional/regression/kl_divergence.py
@@ -89,7 +89,7 @@ def kl_divergence(
  Where :math:`P` and :math:`Q` are probability distributions where :math:`P` usually represents a distribution
  over data and :math:`Q` is often a prior or approximation of :math:`P`. It should be noted that the KL divergence
- is a non-symetrical metric i.e. :math:`D_{KL}(P||Q) \neq D_{KL}(Q||P)`.
+ is a non-symmetrical metric i.e. :math:`D_{KL}(P||Q) \neq D_{KL}(Q||P)`.
  Args:
      p: data distribution with shape ``[N, d]``
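For context, a minimal sketch of the non-symmetry the docstring describes, using the functional ``torchmetrics.functional.kl_divergence`` API. The tensors and the printed values are illustrative, not taken from the source:

import torch
from torchmetrics.functional import kl_divergence

# Two illustrative probability distributions over three categories (rows sum to 1).
p = torch.tensor([[0.90, 0.05, 0.05]])
q = torch.tensor([[1 / 3, 1 / 3, 1 / 3]])

# Swapping the arguments generally gives a different value, which is the
# non-symmetry the docstring points out: D_KL(P||Q) != D_KL(Q||P).
print(kl_divergence(p, q))  # roughly tensor(0.70)
print(kl_divergence(q, p))  # roughly tensor(0.93)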
2 changes: 1 addition & 1 deletion src/torchmetrics/regression/kl_divergence.py
@@ -35,7 +35,7 @@ class KLDivergence(Metric):
  Where :math:`P` and :math:`Q` are probability distributions where :math:`P` usually represents a distribution
  over data and :math:`Q` is often a prior or approximation of :math:`P`. It should be noted that the KL divergence
- is a non-symetrical metric i.e. :math:`D_{KL}(P||Q) \neq D_{KL}(Q||P)`.
+ is a non-symmetrical metric i.e. :math:`D_{KL}(P||Q) \neq D_{KL}(Q||P)`.
  As input to ``forward`` and ``update`` the metric accepts the following input:
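The same property holds for the module-based ``KLDivergence`` metric touched by this second hunk. A small usage sketch, again with illustrative inputs and approximate outputs:

import torch
from torchmetrics.regression import KLDivergence

metric = KLDivergence()
p = torch.tensor([[0.90, 0.05, 0.05]])
q = torch.tensor([[1 / 3, 1 / 3, 1 / 3]])

# forward/update take (p, q); reversing the argument order yields a different score.
print(metric(p, q))  # roughly tensor(0.70)

metric.reset()
print(metric(q, p))  # roughly tensor(0.93)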
