A Julia library for Tensor Networks. Tenet
can be executed both in local environments and on large supercomputers. Its goals are:
- **Expressiveness**: Simple to use. 👶
- **Flexibility**: Extend it to your needs. 🔧
- **Performance**: Goes brr... fast. 🏎️
A video of its presentation at JuliaCon 2023 can be seen here:

Its main features are:
- Optimized Tensor Network contraction, powered by EinExprs (see the contraction sketch after this list)
- Tensor Network slicing/cutting
- Automatic Differentiation of TN contraction
- Distributed contraction
- Quantum Tensor Networks
  - Matrix Product States (MPS)
  - Matrix Product Operators (MPO)
  - Tree Tensor Networks (TTN)
  - Projected Entangled Pair States (PEPS)
  - Multiscale Entanglement Renormalization Ansatz (MERA)
- Numerical Tensor Network algorithms
  - Tensor Renormalization Group (TRG)
  - Density Matrix Renormalization Group (DMRG)
- Local Tensor Network transformations (sketched below)
  - Hyperindex converter
  - Rank simplification
  - Diagonal reduction
  - Anti-diagonal gauging
  - Column reduction
  - Split simplification
- 2D & 3D visualization of large networks, powered by Makie
- Translation from quantum circuits, powered by Quac
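
As a quick sketch of the contraction workflow mentioned above, the example below builds a two-tensor network and contracts it. It assumes the `Tensor`, `TensorNetwork`, and `contract` names from Tenet's documentation; check the docs of your installed version for the exact API.

```julia
using Tenet

# Two tensors sharing the index :j — contracting them is a matrix-matrix product.
A = Tensor(rand(4, 2), (:i, :j))
B = Tensor(rand(2, 3), (:j, :k))

# Collect them into a Tensor Network and contract it;
# the contraction path is optimized by EinExprs under the hood.
tn = TensorNetwork([A, B])
result = contract(tn)  # a Tensor carrying the open indices (:i, :k)
```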
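
The local Tensor Network transformations can be sketched in the same style. The `transform` function and the `RankSimplification` transformation type used here are assumptions drawn from the feature names in the list above; the exact names may differ between versions, so treat this as illustrative rather than definitive.

```julia
using Tenet

# A small network where contracting the pair does not increase tensor rank.
A = Tensor(rand(2, 2, 2), (:i, :j, :k))
B = Tensor(rand(2, 2), (:k, :l))
tn = TensorNetwork([A, B])

# `transform` is assumed to return a rewritten copy (with `transform!` acting in place),
# and `RankSimplification` to name the rank-simplification pass from the list above.
simplified = transform(tn, RankSimplification())
```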