
Commit

Update chen2019locality.markdown
learning2hash committed May 22, 2024
1 parent e5b9c1f commit 0ee25e8
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion _publications/chen2019locality.markdown
@@ -7,6 +7,6 @@ year: 2019
bibkey: chen2019locality
additional_links:
- {name: "PDF", url: "https://papers.nips.cc/paper/9195-locality-sensitive-hashing-for-f-divergences-mutual-information-loss-and-beyond.pdf"}
tags: ["LSH", "NIPS"]
tags: ["LSH", "NeurIPS"]
---
Computing approximate nearest neighbors in high dimensional spaces is a central problem in large-scale data mining with a wide range of applications in machine learning and data science. A popular and effective technique for computing nearest neighbors approximately is the locality-sensitive hashing (LSH) scheme. In this paper, we aim to develop LSH schemes for distance functions that measure the distance between two probability distributions, particularly for f-divergences as well as a generalization to capture mutual information loss. First, we provide a general framework to design LSH schemes for f-divergence distance functions and develop LSH schemes for the generalized Jensen-Shannon divergence and triangular discrimination in this framework. We show a two-sided approximation result for approximating the generalized Jensen-Shannon divergence by the Hellinger distance, which may be of independent interest. Next, we show a general method of reducing the problem of designing an LSH scheme for a Krein kernel (which can be expressed as the difference of two positive definite kernels) to the problem of maximum inner product search. We exemplify this method by applying it to the mutual information loss, owing to its several important applications such as model compression.
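As a rough illustration of the kind of scheme the abstract describes (not the paper's construction): the Hellinger distance satisfies H(p, q) = (1/sqrt(2)) * ||sqrt(p) - sqrt(q)||_2, so a square-root embedding reduces Hellinger LSH to standard Euclidean (p-stable) LSH, and the abstract's two-sided Hellinger approximation relates this to the generalized Jensen-Shannon divergence. The Python sketch below uses hypothetical names (`HellingerLSH`, `bucket_width`) and assumes discrete probability vectors.

```python
# Illustrative sketch only: an LSH family for the Hellinger distance between
# discrete probability distributions, via the square-root embedding followed
# by Datar et al.'s p-stable (Gaussian) Euclidean LSH. Not the paper's scheme.
import numpy as np

class HellingerLSH:
    def __init__(self, dim, n_hashes=8, bucket_width=0.5, seed=0):
        rng = np.random.default_rng(seed)
        # Gaussian projection directions and random offsets for each hash.
        self.a = rng.normal(size=(n_hashes, dim))
        self.b = rng.uniform(0.0, bucket_width, size=n_hashes)
        self.w = bucket_width

    def hash(self, p):
        """Hash a probability vector p (non-negative entries summing to 1)."""
        x = np.sqrt(np.asarray(p, dtype=float))  # square-root embedding
        # Quantize each projection into buckets of width w.
        return tuple(np.floor((self.a @ x + self.b) / self.w).astype(int))

# Usage: nearby distributions collide more often than distant ones.
lsh = HellingerLSH(dim=4)
p = np.array([0.40, 0.30, 0.20, 0.10])
q = np.array([0.38, 0.32, 0.20, 0.10])
print(lsh.hash(p), lsh.hash(q))
```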
