
Releases: zama-ai/concrete-ml

v1.7.0

30 Sep 09:05
946cbab

Summary

Concrete ML 1.7 adds functionality to fine-tune LLMs and neural networks on encrypted data using low-rank approximation parameter-efficient fine-tuning (LoRA). This allows users to securely outsource large weight-matrix computations to remote servers while keeping a small set of private fine-tuned parameters locally. Additionally, this release includes GPU support, providing a 1-2x speed-up for large neural networks on server-grade GPUs such as the NVIDIA H100. Concrete ML now also supports Python 3.11 and PyTorch 2.
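
As a rough illustration of the new GPU option, the sketch below compiles a small PyTorch model with compile_torch_model and requests GPU execution. The device argument and all parameter values are assumptions made for illustration, not a verified reference for the 1.7 API.

```python
# Minimal sketch, assuming compile_torch_model accepts a `device` argument to
# select GPU execution as described in this release; all values are illustrative.
import torch

from concrete.ml.torch.compile import compile_torch_model


class TinyMLP(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = torch.nn.Linear(10, 32)
        self.fc2 = torch.nn.Linear(32, 2)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))


# Representative clear data used for calibration during quantization
calibration_data = torch.randn(100, 10)

quantized_module = compile_torch_model(
    TinyMLP(),
    calibration_data,
    n_bits=6,
    rounding_threshold_bits=6,
    device="cuda",  # assumed to target a server-grade GPU such as the H100; use "cpu" otherwise
)
```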

What's Changed

New features

  • LoRA fine-tuning in FHE with an MLP tutorial and a GPT-2 use-case example (#823) (4d2f2e6)
  • Add GPU support (#849) (945aead)
  • Add support for Python 3.11 (#701) (819dca7)
  • Support for embedding layers (#778) (296bc8c)
  • Support for encrypted multiplication and division (#690) (a1bd9b8)
  • Upgrade PyTorch to 2.3.1 and Brevitas to 0.10 (#788) (c3d7c81)

Improvements

  • Relax Python version restrictions for deployment (#853) (040c308)
  • Remove Protobuf 2GB limit when checking ONNX (#811) (c8908fa)
  • Always use evaluation key compression (#726) (b4e1060)

Fixes

  • Fix dtype check in quantizer dequant (77ced60)
  • Dynamic import of transformers in hybrid model (829b68b)
  • Use correct torch version for Intel Mac (#798) (a8eab89)

Resources

  • Documentation:

v1.6.1

28 Jun 15:01

Summary

Minor fixes.

Links

Docker Image: zamafhe/concrete-ml:v1.6.1
Docker Hub: https://hub.docker.com/r/zamafhe/concrete-ml/tags
pip: https://pypi.org/project/concrete-ml/1.6.1
Documentation: https://docs.zama.ai/concrete-ml

v1.6.1

Fix

  • Dynamic import of transformers in hybrid model (270efbe)

v1.6.0

24 Jun 10:30

Summary

Concrete ML 1.6 includes the following enhancements:

  • Latency improvements on large neural networks
  • Support for pre-trained tree-based models such as those trained using Federated Learning (see the sketch after this list)
  • Enhanced collaborative computation
    • Introduction of DataFrame schemas
    • Deployment of logistic regression training
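
A minimal sketch of the pre-trained tree-based model import mentioned above, assuming the "from_sklearn" feature listed below is exposed as a from_sklearn_model classmethod on the Concrete ML estimators; names, arguments, and values are illustrative assumptions.

```python
# Minimal sketch, assuming tree-based Concrete ML estimators expose a
# from_sklearn_model classmethod (the "from_sklearn" feature listed below);
# all names and values here are illustrative.
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier as SkDecisionTreeClassifier

from concrete.ml.sklearn import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# A model trained elsewhere, for example through federated learning
sk_model = SkDecisionTreeClassifier(max_depth=4).fit(X, y)

# Convert it to a quantized Concrete ML model, using X as calibration data
cml_model = DecisionTreeClassifier.from_sklearn_model(sk_model, X, n_bits=6)
cml_model.compile(X)
y_pred = cml_model.predict(X[:5], fhe="simulate")
```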

What's Changed

New features

  • Enable non-interactive encrypted training for logistic regression (#660) (ec58bca)
  • Support pre-trained tree-based models using from_sklearn (5ca282b)
  • Add FHE training deployment (#665) (b718629)
  • Support approximate rounding to speed up neural networks (9ef890e)
  • Allow users to define a schema for dataframe encryption (ccd6641)

Fixes

  • Fix fhe-training classes behavior (a88d704)
  • Update qgpt2_class.py to fix typo (d376d85)
  • Fix post-processing shape mismatches for linear models (#585) (b097022)
  • Disable overflow protection in rounding (4db0157)
  • Make skorch import fail without error (81de55c)

Improvements

  • Replace python release install with setup-python (899b9f1)
  • Add support to AvgPool's missing parameters (15a8340)

Resources

  • Documentation:

    • Add schema example for encrypted data-frames (#715) (b174509)
    • Add a license FAQ in README (#711) (f937c9f)
    • Update documentation on client / server API (#663) (f864407)
    • Document n_bits for compile torch functions (0306c65)
    • Add examples for the impact of feature scaling on linear models (9252f57)
  • Demo & Examples:

    • Add NN-20 and NN-50 deep MLPs for MNIST classification (1b5ce84)

v1.6.0-rc3

23 Jun 14:14
Pre-release

Summary

1.6.0 - Release Candidate - 3

v1.6.0-rc0

21 Jun 14:30
Pre-release

Summary

1.6.0 - Release Candidate - 0

v1.5.0

05 Apr 17:23

Summary

Concrete ML v1.5 introduces several significant enhancements, including a DataFrame API for working with encrypted stored data, a new option that speeds up neural networks by 2-3x, and an improved FHE simulation mode to quickly evaluate the impact of this speed-up on neural network accuracy.
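
As an illustration of the speed-up option and the improved simulation mode, the sketch below passes an approximate-rounding configuration to compile_torch_model and runs a simulated inference. The dictionary form of rounding_threshold_bits is an assumption based on this release's feature list, and all values are illustrative.

```python
# Minimal sketch: the dictionary form of rounding_threshold_bits with an
# "approximate" method is an assumption based on this release's notes;
# all values are illustrative.
import torch

from concrete.ml.torch.compile import compile_torch_model

model = torch.nn.Sequential(
    torch.nn.Linear(20, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 3),
)
calibration_data = torch.randn(200, 20)

quantized_module = compile_torch_model(
    model,
    calibration_data,
    n_bits=6,
    rounding_threshold_bits={"n_bits": 6, "method": "approximate"},
)

# FHE simulation gives a quick estimate of the accuracy impact of the speed-up
# without running actual FHE computations.
predictions = quantized_module.forward(calibration_data.numpy()[:5], fhe="simulate")
```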

What's Changed

Features:

  • Add encrypted dataframe API (d2d6250)
  • Add an option to allow approximate rounding to speed up NNs (9ef890e)
  • Support ONNX conv1d operator (09ad7a6)
  • Implement quantized unfold operation (fa3ef88)

Improvements:

  • More accurate FHE simulation
  • Encrypted aggregation of the outputs of tree ensembles
  • Allow different quantization bits for tree model leaves and inputs

Fix

  • Import skorch without errors due to bad docstrings (81de55c)
  • Add support to AvgPool's missing parameters (15a8340)

Resources

  • Documentation:
    • New structure and landing page (85cb962)
    • Add links to credit card approval space in use case examples (df81aca)
    • Improve contributing section (1696799)
    • Document n_bits for compile torch functions (0306c65)
    • Add explanation of encrypted training and federated learning (57dbdff)
    • Add documentation about scaling (9252f57)
    • Add dataframe documentation (#576) (d3bf5ac)

Links

Docker Image: zamafhe/concrete-ml:v1.5.0
Docker Hub: https://hub.docker.com/r/zamafhe/concrete-ml/tags
pip: https://pypi.org/project/concrete-ml/1.5.0
Documentation: https://docs.zama.ai/concrete-ml

v1.5.0-rc1

03 Apr 16:36
Pre-release

Summary

Add support for encrypted data-frames and approximate rounding. Fix AvgPool's count_include_pad missing parameter error.

Links

Docker Image: zamafhe/concrete-ml:v1.5.0-rc1
Docker Hub: https://hub.docker.com/r/zamafhe/concrete-ml/tags
pip: https://pypi.org/project/concrete-ml/1.5.0-rc1
Documentation: https://docs.zama.ai/concrete-ml

v1.5.0-rc1

Fix

  • Fix survey link (e5661c1)
  • Reinstate apidoc generated tags (2878a07)
  • Make skorch import fail without error (81de55c)
  • Replace python release install with setup-python (899b9f1)
  • Fix concurrency issue in release process (1295ea9)
  • Add support to AvgPool's missing parameters (15a8340)
  • Update README.md (ba1fdad)

Documentation

  • Add dataframe documentation (#576) (d3bf5ac)
  • Update main landing pages (cfb862e)
  • New structure and landing page (85cb962)
  • Update operator list in torch support's documentation section (b617740)
  • Add links to credit card approval space in use case examples (df81aca)
  • Improve contributing section (1696799)
  • Document n_bits for compile torch functions (0306c65)
  • Add explanation of encrypted training and federated learning (57dbdff)
  • Add documentation about scaling (9252f57)

v1.5.0-rc0

15 Feb 11:27
Pre-release

Summary

Add support for torch's Conv1d and Unfold operators
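
A minimal sketch of a torch model using the newly supported Conv1d operator, compiled with compile_torch_model; the architecture and quantization settings are purely illustrative.

```python
# Minimal sketch of a model using torch's Conv1d, compiled with Concrete ML;
# shapes and quantization settings are illustrative only.
import torch

from concrete.ml.torch.compile import compile_torch_model


class Conv1dNet(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv1d(in_channels=1, out_channels=4, kernel_size=3)
        self.fc = torch.nn.Linear(4 * 14, 2)

    def forward(self, x):
        x = torch.relu(self.conv(x))
        return self.fc(x.flatten(1))


# (batch, channels, length): Conv1d with kernel_size=3 reduces length 16 to 14
calibration_data = torch.randn(50, 1, 16)
quantized_module = compile_torch_model(Conv1dNet(), calibration_data, n_bits=6)
```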

Links

Docker Image: zamafhe/concrete-ml:v1.5.0-rc0
Docker Hub: https://hub.docker.com/r/zamafhe/concrete-ml/tags
pip: https://pypi.org/project/concrete-ml/1.5.0-rc0
Documentation: https://docs.zama.ai/concrete-ml

v1.5.0-rc0

Feature

  • Support conv1d operator (09ad7a6)
  • Implement quantized unfold (fa3ef88)

Fix

  • Make skorch import fail without error (81de55c)
  • Replace python release install with setup-python (899b9f1)
  • Fix concurrency issue in release process (1295ea9)
  • Add support to AvgPool's missing parameters (15a8340)
  • Update README.md (ba1fdad)

Documentation

  • Update operator list in torch support's documentation section (b617740)
  • Add links to credit card approval space in use case examples (df81aca)
  • Improve contributing section (1696799)
  • Document n_bits for compile torch functions (0306c65)
  • Add explanation of encrypted training and federated learning (57dbdff)
  • Add documentation about scaling (9252f57)

v1.4.1

15 Feb 13:13

Summary

Updates Concrete-Python to 2.5.1 and fixes AvgPool's missing parameters.

Links

Docker Image: zamafhe/concrete-ml:v1.4.1
Docker Hub: https://hub.docker.com/r/zamafhe/concrete-ml/tags
pip: https://pypi.org/project/concrete-ml/1.4.1
Documentation: https://docs.zama.ai/concrete-ml

v1.4.1

Fix

  • Make skorch import fail without error (5863a4b)
  • Replace python release install with setup-python (f41c65c)
  • Update README.md (8bef8e5)
  • Add support to AvgPool's missing parameters (559d99c)

Documentation

  • Document n_bits for compile torch functions (065df89)
  • Add explanation of encrypted training and federated learning (8002ed8)
  • Add documentation about scaling (1bad5cc)

v1.4.0

11 Jan 17:31

Summary

This release adds support for training models on encrypted data and introduces a latency optimization for the inference of tree-based models such as XGBoost, random forest, and decision trees. This optimization offers 2-3x speed-ups in typical quantization settings and allows even more accurate, high bit-width tree-based models to run with good latency.
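
As a rough illustration of encrypted training, the sketch below assumes the SGDClassifier constructor exposes fit_encrypted and parameters_range arguments and that fit accepts an fhe mode, as suggested by this release; exact names and values are assumptions, not a verified API reference.

```python
# Minimal sketch of FHE training, assuming SGDClassifier exposes fit_encrypted,
# parameters_range and an `fhe` argument on fit(); names and values are assumptions.
import numpy as np
from sklearn.datasets import make_classification

from concrete.ml.sklearn import SGDClassifier

X, y = make_classification(n_samples=100, n_features=8, random_state=0)
# Features are assumed to be scaled to a small, known range for encrypted training
X = np.clip(X / np.abs(X).max(), -1.0, 1.0)

model = SGDClassifier(max_iter=15, fit_encrypted=True, parameters_range=(-1.0, 1.0))

# fhe="simulate" estimates the training behavior quickly; fhe="execute" would
# run the actual encrypted training.
model.fit(X, y, fhe="simulate")
y_pred = model.predict(X[:5])
```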

Links

Docker Image: zamafhe/concrete-ml:v1.4.0
Docker Hub: https://hub.docker.com/r/zamafhe/concrete-ml/tags
pip: https://pypi.org/project/concrete-ml/1.4.0
Documentation: https://docs.zama.ai/concrete-ml

v1.4.0

Feature

  • SGDClassifier training in FHE (0893718)
  • Support Expand Equal ONNX op (cf3ce49)
  • Add rounding feature on cml trees (064eb82)
  • Add multi-output support (fef23a9)
  • Allow QuantizedAdd produces_output_graph (0b57c71)
  • Encrypted GEMM support, 3D inputs, better rounding control, SGD training test (111c7e3)

Fix

  • Add --no-warnings flag to linkchecker (1dc547e)
  • Fix wrong assumption in ReduceSum operator's axis parameter (1a592d7)
  • Mark flaky tests due to issue in simulation (4f67883)
  • Update learning rate default value for XGB models (e4984d6)

Documentation

  • Improve XGBClassifier notebook (b1906a5)
  • Update api doc (d8e9e64)
  • Update Apple Silicon install information (4c0c02f)