Releases: dgasmith/opt_einsum

v2.2.0

29 Aug 22:05

New features:

  • (#48) Intermediates can now be shared between contractions; see the documentation for more details.
  • (#53) Intermediate caching is thread safe.
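The sharing feature is exposed as a context manager; a minimal sketch (array shapes are illustrative), assuming `shared_intermediates` is imported from the top-level `opt_einsum` namespace:

```python
import numpy as np
from opt_einsum import contract, shared_intermediates

x = np.random.rand(5, 5)
y = np.random.rand(5, 5)
z = np.random.rand(5, 5)

# Inside the context, intermediates common to both contractions
# (here, the product of x and y) are computed once and cached.
with shared_intermediates():
    a = contract('ij,jk,kl->il', x, y, z)
    b = contract('ij,jk,lk->il', x, y, z)
```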

Enhancements:

  • (#48) Expressions are now mapped to a non-unicode index set so that unicode input is supported for all backends.
  • (#58) Adds tensorflow and theano support for shared intermediates.

Bug fixes:

  • (#41) PyTorch indices are mapped back to a small a-z subset valid for PyTorch's einsum implementation.

v2.1.3

23 Aug 22:13
9ad4a75

Bug fixes:

  • Fixes unicode issue for large numbers of tensors in Python 2.7.
  • Fixes unicode install bug in README.md.

v2.1.2

15 Aug 20:36
7ef12a5

Bug Fixes:

  • Ensures versioneer.py is in MANIFEST.in for a clean pip install.

v2.1.1

14 Aug 22:03
3ea01c8

Bug Fixes:

  • Minor tweak to release procedure.

v2.1.0

14 Aug 21:09
36990db

opt_einsum continues to improve its support for backends beyond NumPy, now adding PyTorch.

We have also published the opt_einsum package in the Journal of Open Source Software. If you use this package in your work, please consider citing us!

New features:

  • PyTorch backend support
  • Tensorflow eager-mode execution backend support
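Backend selection goes through the `backend` keyword of `contract`; a minimal sketch, shown with the default NumPy backend (passing `backend='torch'` or `backend='tensorflow'` would dispatch the same contraction to those libraries when installed):

```python
import numpy as np
from opt_einsum import contract

a = np.random.rand(4, 4)
b = np.random.rand(4, 4)

# backend='numpy' is the default; other backends execute the same
# contraction path using that library's einsum/tensordot routines.
out = contract('ij,jk->ik', a, b, backend='numpy')
```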

Enhancements:

  • Intermediate tensordot-like expressions are now ordered to avoid transposes.
  • CI now uses conda backend to better support GPU and tensor libraries.
  • Now accepts arbitrary unicode indices rather than a subset.
  • New `auto` path option which switches between `optimal` and `greedy` at four tensors.
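The new path option can be requested explicitly through the `optimize` keyword of `contract` and `contract_path`; a minimal sketch with illustrative shapes:

```python
import numpy as np
from opt_einsum import contract, contract_path

arrays = [np.random.rand(3, 3) for _ in range(5)]

# 'auto' uses the exhaustive optimal search for small contractions
# and falls back to the cheaper greedy algorithm for larger ones.
path, info = contract_path('ab,bc,cd,de,ef->af', *arrays, optimize='auto')
result = contract('ab,bc,cd,de,ef->af', *arrays, optimize='auto')
```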

Bug fixes:

  • Fixed issue where broadcast indices were incorrectly locked out of tensordot-like evaluations even after their dimension was broadcast.

v2.0.1

28 Jun 20:13
521fde9

opt_einsum is a powerful tensor contraction order optimizer for NumPy and related ecosystems.

New Features

  • Allows unlimited Unicode indices.
  • Adds a Journal of Open-Source Software paper.
  • Minor documentation improvements.

v2.0.0

17 May 13:49
1f71263

opt_einsum is a powerful tensor contraction order optimizer for NumPy and related ecosystems.

New Features

  • Expressions can be precompiled so that the contraction-order optimization need not be repeated.
  • The greedy order-optimization algorithm has been tuned to handle hundreds of tensors in several seconds.
  • Input indices can now be unicode, so expressions can have many thousands of indices.
  • GPU and distributed computing backends have been added, such as Dask, TensorFlow, CuPy, Theano, and Sparse.
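Precompiled expressions are built with `contract_expression` from the subscripts and shapes alone; a minimal sketch with illustrative shapes:

```python
import numpy as np
from opt_einsum import contract_expression

# Build the expression once from shapes only; no data is required,
# and the path optimization happens a single time.
expr = contract_expression('ij,jk->ik', (4, 5), (5, 6))

x = np.random.rand(4, 5)
y = np.random.rand(5, 6)

# Reuse the compiled expression on concrete arrays.
out = expr(x, y)
```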

Bug Fixes

  • An error affecting cases where opt_einsum mistook broadcasting operations for matrix multiplies has been fixed.
  • Most error messages are now more expressive.

v1.0

15 Oct 01:47

Official 1.0 release.

Einsum is a very powerful function for contracting tensors of arbitrary dimension and index. However, it is only optimized to contract two terms at a time, resulting in non-optimal scaling for contractions with many terms. opt_einsum aims to fix this by optimizing the contraction order, which can lead to arbitrarily large speedups at the cost of additional intermediate tensors.

opt_einsum is also integrated into the np.einsum function as of NumPy v1.12.
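Since NumPy v1.12, the same optimization is reachable through np.einsum's `optimize` keyword; a minimal sketch:

```python
import numpy as np

a = np.random.rand(10, 10)
b = np.random.rand(10, 10)
c = np.random.rand(10, 10)

# optimize=True asks np.einsum to reorder the pairwise contractions,
# rather than contracting strictly left to right.
out = np.einsum('ij,jk,kl->il', a, b, c, optimize=True)
```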

New BLAS support!

30 Jul 21:34

A large step toward a full 1.0 release. BLAS usage is now automatically applied to all operations. Future releases will be more careful with regard to views and needless data copying.

Python 3 + `setup.py`

23 Mar 14:13

Adds Python 3 support and installation through a `setup.py` command.