Releases: ivannz/cplxmodule
Keeping up with modern software
The 2022.06 major release increases the minimal versions to `python>=3.7` and `pytorch>=1.8`. Although modern torch now natively supports complex dtypes, no transition to them as the new backend has been made yet, and currently we still use the split representation with CR-calculus on top (see the discussions in issues #2 and #21).
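As a rough illustration of what the split representation means (the class name below is hypothetical, not cplxmodule's API): each complex tensor is kept as a pair of real tensors, and complex arithmetic is expanded into purely real operations, over which CR-calculus supplies the gradients.

```python
# Minimal sketch of the "split" representation: a complex value is a pair
# of real parts (re, im), and complex arithmetic is expressed through real
# operations only. Illustrative scalar version, not the library's code.

class SplitComplex:
    def __init__(self, re, im):
        self.re, self.im = re, im

    def __mul__(self, other):
        # (a + ib)(c + id) = (ac - bd) + i(ad + bc)
        return SplitComplex(
            self.re * other.re - self.im * other.im,
            self.re * other.im + self.im * other.re,
        )

z = SplitComplex(1.0, 2.0) * SplitComplex(3.0, -1.0)
# matches (1 + 2j) * (3 - 1j) = 5 + 5j
```

In the library the two parts are full real-valued torch tensors, so every real-valued autograd rule applies unchanged.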
The following features have been added:
- FIX: having a dunder-version in the root of the package is the standard and should be upheld (issue #24)
- FIX: set the minimal python version to 3.7, as pointed out in issue #24
- FIX: upgraded `.utils.spectrum` to the new native torch complex backend (`torch>=1.8`)
- FIX: ensured ONNX support in PR #14
- ENH: implemented modulus-based max-pooling, requested in issue #17
- FIX: made `.Cplx` instances `deepcopy`-able, fixing issue #18
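The modulus-based max-pooling mentioned in the ENH item above can be sketched in plain Python (the function name and the 1-d, list-based setting are illustrative assumptions, not the library's implementation): each window keeps the element with the largest modulus, so the real and imaginary parts of the selected value stay paired rather than being maximized independently.

```python
# Hypothetical sketch of modulus-based max-pooling over the split
# representation: within each non-overlapping window the element with the
# largest modulus |z|^2 = re^2 + im^2 is kept as a whole complex number.

def modulus_maxpool1d(re, im, kernel=2):
    out_re, out_im = [], []
    for start in range(0, len(re) - kernel + 1, kernel):
        window = range(start, start + kernel)
        best = max(window, key=lambda i: re[i] ** 2 + im[i] ** 2)
        out_re.append(re[best])
        out_im.append(im[best])
    return out_re, out_im

re, im = [1.0, 0.0, -3.0, 2.0], [0.0, 2.0, 1.0, 2.0]
# window 1: |1+0j| = 1  vs |0+2j| = 2     -> keeps 0+2j
# window 2: |-3+1j| ~ 3.16 vs |2+2j| ~ 2.83 -> keeps -3+1j
pooled = modulus_maxpool1d(re, im)
```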
The following cosmetic or repo-level modifications have been made:
Complex-valued Neural Networks and Variational Dropout
This is a nominal major release, as it increases the minimal pytorch version from 1.4 to 1.7.
The following features have been added:
- experimental ONNX support (PR #14)
The version has been bumped from 2020 to 2021 to reflect the new year.
Complex-valued Neural Networks and Variational Dropout
This is a minor mid-month release.
The following features were added:
- Complex transposed convolutions (#8), `squeeze`/`unsqueeze` methods for `Cplx` (#7), and support for the `view` and `view_as` methods of `Cplx` (#6), by Hendrik Schröter
- Tensor-to-Cplx converter layers for the special torch format of complex tensors (the last dim is exactly 2); see `torch.fft`
The following bugs were fixed:
- Shape mismatch in `nn.init.cplx_trabelsi_independent_`, which prevented it from working properly (#11)
Complex-valued Neural Networks and Variational Dropout
This is a minor release that adds support for 3D real- and complex-valued convolutions and Variational Dropout for them.
Complex-valued networks and Bayesian sparsification methods
This release includes a fix that makes masked layers work in a multi-GPU setting, and an update to sparsity accounting.
Complex-valued networks and Bayesian sparsification methods
An extension for `torch` that adds basic building blocks for complex-valued neural networks with batch normalization and weight initialization. Provides an implementation of real- and complex-valued Bayesian sparsification techniques: Variational Dropout and Automatic Relevance Determination. Finally, it contains a fully functional package for real- and complex-valued maskable layers.