v1.4.0
Summary
This release adds support for training models on encrypted data and introduces a latency optimization for the inference of tree-based models such as XGBoost, random forest, and decision trees. This optimization offers 2-3x speed-ups in typical quantization settings and allows even more accurate, higher bit-width tree-based models to run with good latency.
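The example below is a minimal sketch of the new encrypted training flow. The SGDClassifier wrapper, the fit_encrypted and parameters_range arguments, and the fhe= keyword follow the Concrete ML documentation for this feature; treat the exact argument names and values as assumptions and check the documentation for your version.
```python
# Minimal sketch: training a linear classifier on encrypted data with Concrete ML.
import numpy
from sklearn.datasets import make_classification
from concrete.ml.sklearn import SGDClassifier

# Toy data; real use would normalize features and split into train/test sets.
X, y = make_classification(n_samples=100, n_features=8, random_state=42)
X = X.astype(numpy.float32)

model = SGDClassifier(
    fit_encrypted=True,            # enable training on encrypted data
    parameters_range=(-1.0, 1.0),  # assumed range for the learned weights
    max_iter=15,                   # keep the example short
    random_state=42,
)

# fhe="simulate" runs the training steps without real encryption;
# switch to fhe="execute" for actual FHE training (much slower).
model.fit(X, y, fhe="simulate")

# Compile and run encrypted inference with the trained model.
model.compile(X)
predictions = model.predict(X[:5], fhe="execute")
print(predictions)
```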
Links
Docker Image: zamafhe/concrete-ml:v1.4.0
Docker Hub: https://hub.docker.com/r/zamafhe/concrete-ml/tags
pip: https://pypi.org/project/concrete-ml/1.4.0
Documentation: https://docs.zama.ai/concrete-ml
Feature
- SGDClassifier training in FHE (0893718)
- Support Expand Equal ONNX op (cf3ce49)
- Add rounding feature on cml trees (064eb82), see the inference sketch after this list
- Add multi-output support (fef23a9)
- Allow QuantizedAdd produces_output_graph (0b57c71)
- Encrypted gemm support - 3d inputs - better rounding control - sgd training test (111c7e3)
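As an illustration of the tree latency work, the sketch below compiles a quantized XGBoost classifier and runs encrypted inference using the standard Concrete ML XGBClassifier / compile / predict API. The rounding-based speed-up is applied internally, so no extra code is needed to benefit from it; the chosen n_bits and model sizes here are arbitrary.
```python
# Minimal sketch: encrypted inference with a quantized tree-based model.
import numpy
from sklearn.datasets import make_classification
from concrete.ml.sklearn import XGBClassifier

X, y = make_classification(n_samples=200, n_features=10, random_state=0)
X = X.astype(numpy.float32)

# Higher n_bits gives a more faithful quantized model; the rounding-based
# optimization in this release keeps latency reasonable at higher bit-widths.
model = XGBClassifier(n_bits=6, n_estimators=20, max_depth=4)
model.fit(X, y)

model.compile(X)                              # build the FHE inference circuit
y_pred = model.predict(X[:5], fhe="execute")  # run inference on encrypted data
print(y_pred)
```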
Fix
- Add --no-warnings flag to linkchecker (1dc547e)
- Fix wrong assumption in ReduceSum operator's axis parameter (1a592d7)
- Mark flaky tests due to issue in simulation (4f67883)
- Update learning rate default value for XGB models (e4984d6)