A scikit-learn-style, PyTorch-autodifferentiable Python library for multiparameter persistent homology.
This library aims to provide easy-to-use and performant tools for applied multiparameter topology.
It is meant to be integrated into the Gudhi library.
This library computes several topological representations of "geometric datasets" that carry multiple scales, e.g., point clouds, images, and graphs. We provide some nice pictures in the documentation. A non-exhaustive list of features can be found in the Features section.
This library is available on PyPI and conda-forge for (reasonably up-to-date) Linux, macOS and Windows, via
pip install multipers
or
conda install multipers -c conda-forge
Windows support is experimental, as some core dependencies are not available on Windows.
We hence recommend that Windows users use WSL.
Documentation and building instructions are available
here.
This library features a number of different functions and helpers. See below for a non-exhaustive list.
A filled box refers to implemented or interfaced code.
- [Multiparameter Module Approximation] provides the multiparameter simplicial structure, as well as techniques for approximating modules by interval-decomposable modules. It is also very useful for visualization.
- [Stable Vectorization of Multiparameter Persistent Homology using Signed Barcodes as Measures, NeurIPS2023] provides fast representations of multiparameter persistence modules via their signed barcode decompositions, encoded as signed measures. Implemented decompositions: Euler surfaces, Hilbert functions, and the rank invariant (i.e., rectangles). It also provides representation techniques for Machine Learning, e.g., sliced Wasserstein kernels and vectorizations.
- [A Framework for Fast and Stable Representations of Multiparameter Persistent Homology Decompositions, NeurIPS2023] provides a vectorization framework for interval-decomposable modules, for Machine Learning. Currently implemented as an extension of MMA.
- [Differentiability and Optimization of Multiparameter Persistent Homology, ICML2024] An approach to compute a (Clarke) gradient for any reasonable multiparameter persistent invariant. Currently, any `multipers` computation is auto-differentiable using this strategy, provided that the inputs are PyTorch gradient-capable tensors.
- [Multiparameter Persistence Landscapes, JMLR] A vectorization technique for multiparameter persistence modules.
- [Filtration-Domination in Bifiltered Graphs, ALENEX2023] Allows for 2-parameter edge collapses for 1-critical clique complexes. Very useful to speed up, e.g., Rips-Codensity bifiltrations.
- [Chunk Reduction for Multi-Parameter Persistent Homology, SOCG2019] Multi-filtration preprocessing algorithm for homology computations.
- [Computing Minimal Presentations and Bigraded Betti Numbers of 2-Parameter Persistent Homology, JAAG] Minimal presentation of multiparameter persistence modules, using mpfree. Hilbert, Rank Decomposition Signed Measures, and MMA decompositions can be computed using the mpfree backend.
- [Delaunay Bifiltrations of Functions on Point Clouds, SODA2024] Provides an alternative to function-Rips bifiltrations, using Delaunay complexes. A very good alternative to Rips-density-like bifiltrations.
- [Delaunay Core Bifiltration] Bifiltration for point clouds, taking into account the density. Similar to Rips-Density.
- [Rivet] Interactive two-parameter persistence.
- [Kernel Operations on the GPU, with Autodiff, without Memory Overflows, JMLR] Although not, at first glance, linked to persistence in any way, this library allows computing blazingly fast signed-measure convolutions (and more!) with custom kernels.
- [Backend only] [Projected distances for multi-parameter persistence modules] Provides a strategy to estimate the convolution distance between multiparameter persistence modules using projected barcodes. Implementation is a WIP.
- [Partial, and experimental] [Efficient Two-Parameter Persistence Computation via Cohomology, SoCG2023] A minimal-presentation algorithm for 2-parameter persistence.
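To make the signed-measure idea above concrete, here is a minimal NumPy sketch, independent of the `multipers` API: on a 2-parameter grid, the Möbius inversion of a toy Hilbert function (iterated finite differences along each axis) yields a signed measure whose cumulative sum recovers the Hilbert function, and convolving that measure with a Gaussian kernel gives a dense vectorization. The Hilbert function chosen here is an assumption for illustration only.

```python
import numpy as np

# Toy Hilbert function of a 2-parameter persistence module on a 4x4 grid:
# the dimension of the module at each grid point (a rectangle module here).
H = np.array([
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
])

# Moebius inversion on the 2-d grid = iterated finite differences along each
# axis (padding with zeros below the grid). The result is the Hilbert
# decomposition signed measure.
signed = np.diff(np.diff(np.pad(H, ((1, 0), (1, 0))), axis=0), axis=1)

# Points and weights of the signed measure (nonzero entries only):
# here, the four corners of the rectangle with weights +1/-1.
pts = np.argwhere(signed != 0)
wts = signed[signed != 0]

# Sanity check: summing the measure over the lower set of each grid point
# recovers the Hilbert function.
assert np.array_equal(signed.cumsum(axis=0).cumsum(axis=1), H)

# Vectorization: convolve the signed measure with a Gaussian kernel sampled
# on the same grid, giving a stable, dense representation for ML pipelines.
sigma = 1.0
gx, gy = np.meshgrid(np.arange(4), np.arange(4), indexing="ij")
img = sum(
    w * np.exp(-((gx - p[0]) ** 2 + (gy - p[1]) ** 2) / (2 * sigma**2))
    for p, w in zip(pts, wts)
)
print(pts.tolist(), wts.tolist())
```

This is only the discrete skeleton of the NeurIPS2023 construction; `multipers` computes such measures directly from filtered complexes and at scale.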
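The differentiability strategy can be illustrated with a toy PyTorch computation (again, not the `multipers` API): persistence-type invariants select individual filtration entries via min/max operations, so autodiff propagates a (Clarke) subgradient back to the input tensor. Here the "invariant" is simply the range of vertex filtration values, a hypothetical stand-in for a bar endpoint difference.

```python
import torch

# Vertex filtration values of a toy lower-star filtration; requires_grad
# makes the whole computation differentiable.
f = torch.tensor([0.3, 1.0, 0.2, 0.8, 0.5], requires_grad=True)

# Toy invariant: the range of the filtration. Like persistence pairs, it is
# a difference of selected filtration entries, so its subgradient is +1 at
# the selected maximum and -1 at the selected minimum.
invariant = f.max() - f.min()
invariant.backward()

print(invariant.item())  # 0.8
print(f.grad)            # +1 at index 1 (argmax), -1 at index 2 (argmin)
```

In `multipers`, the same mechanism applies end-to-end when the input filtration values are gradient-capable PyTorch tensors.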
If I missed something, or you want to add something, feel free to open an issue.
David Loiseaux,
Hannah Schreiber (Persistence backend code),
Luis Scoccola
(Möbius inversion in python, degree-rips using persistable and RIVET),
Mathieu Carrière (Sliced Wasserstein),
Odin Hoff Gardå (Delaunay Core bifiltration).
Please cite this library when using it in scientific publications; you can use the following journal BibTeX entry:
@article{multipers,
title = {Multipers: {{Multiparameter Persistence}} for {{Machine Learning}}},
shorttitle = {Multipers},
author = {Loiseaux, David and Schreiber, Hannah},
year = {2024},
month = nov,
journal = {Journal of Open Source Software},
volume = {9},
number = {103},
pages = {6773},
issn = {2475-9066},
doi = {10.21105/joss.06773},
langid = {english},
}
Feel free to contribute, report a bug on a pipeline, or ask for documentation by opening an issue.
In particular, if you have a nice example or application that is not covered in the documentation (see the ./docs/notebooks/
folder), please contact me to add it there.