Apache Airflow releases fall into one of two types:
- Releases of the Apache Airflow package
- Releases of the Backport Providers Packages
This package contains sources that allow the user to build a fully-functional Apache Airflow 2.0 package. The sources include:
- the "apache-airflow" Python package, which installs the "airflow" Python package and includes all the assets required to run the webserver UI that comes with Apache Airflow
- the Dockerfile and corresponding scripts that build and use the official Docker image
- the Breeze development environment, which helps with building images and testing Apache Airflow built from sources locally
In the future (Airflow 2.0) this package will be split into separate "core" and "providers" packages that will be distributed separately, following the mechanisms introduced in the Backport Provider Packages. We also plan to release the official Helm Chart sources that will allow the user to install Apache Airflow via a Helm 3.0 chart in a distributed fashion.
The source releases are the only "official" Apache Software Foundation releases, and they are distributed via Official Apache Download sources.
Following the source releases, the Apache Airflow release manager also distributes convenience packages:
- PyPI packages released via https://pypi.org/project/apache-airflow/
- Docker Images released via https://hub.docker.com/repository/docker/apache/airflow
Those convenience packages are not "official source releases" of Apache Airflow, but users who cannot or do not want to build the packages themselves can use them as a convenient way of installing Apache Airflow. You can read more details about it in the ASF Release Policy.
Detailed instructions for releasing Apache Airflow can be found in README_RELEASE_AIRFLOW.md
The Provider packages are packages (per provider) that make it possible to easily install Hooks, Operators, Sensors, and Secrets for different providers (external services used by Airflow).
There are also Backport Provider Packages that make it possible to use the Operators, Hooks, and Secrets from the 2.0 version of Airflow in the 1.10.* series.
Once you release the packages, you can simply install them with:

```shell
pip install apache-airflow-providers-<PROVIDER>[<EXTRAS>]
```

for regular providers, and:

```shell
pip install apache-airflow-backport-providers-<PROVIDER>[<EXTRAS>]
```

for backport providers.
Here, `<PROVIDER>` is the provider id and `<EXTRAS>` are optional extra packages to install. You can find the provider packages' dependencies and extras in the README.md files in each provider package (in the `airflow/providers/<PROVIDER>` folder) as well as on the PyPI installation page.
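For example, with a hypothetical provider id and extra (check the Provider Packages Reference for the real ids and each provider's README.md for its extras), the commands look like this:

```shell
# Hypothetical example: the "google" provider with the "amazon" cross-provider extra.
# Quote the requirement so the shell does not expand the square brackets.
pip install 'apache-airflow-providers-google[amazon]'

# The backport counterpart for the Airflow 1.10.* series:
pip install 'apache-airflow-backport-providers-google[amazon]'
```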
Backport providers are a great way to migrate your DAGs to Airflow 2.0-compatible DAGs. You can switch to the new Airflow 2.0 packages in your DAGs long before you attempt to migrate Airflow itself to the 2.0 line.
The sources released in SVN allow the user to build all the provider packages, following the instructions and scripts provided. Those are also "official source releases" as described in the ASF Release Policy, and they are available via Official Apache Download for providers and Official Apache Download for backport providers.
The full list of providers can be found here: Provider Packages Reference
There are also convenience packages released separately to PyPI as "apache-airflow-providers" and "apache-airflow-backport-providers" packages. You can find all of them via: PyPI query for providers and PyPI query for backport providers.
Detailed instructions for releasing provider packages can be found in README_RELEASE_PROVIDER_PACKAGES.md
The person acting as release manager has to fulfill certain pre-requisites. More details and FAQs are available in the ASF Release Policy, but some important pre-requisites are listed below. Note that the release manager does not have to be a PMC member - it is enough to be a committer to assume the release manager role, but there are final steps in the process (uploading final releases to SVN) that can only be done by a PMC member. If needed, the release manager can ask a PMC member to perform that final step of the release.
Make sure your public key is on id.apache.org and in KEYS. You will need to sign the release artifacts with your PGP key. After you have created a key, make sure you:
- Add your GPG public key to https://dist.apache.org/repos/dist/release/airflow/KEYS, following the instructions at the top of that file. Upload your GPG public key to https://pgp.mit.edu
- Add your key fingerprint to https://id.apache.org/ (login with your apache credentials, paste your fingerprint into the pgp fingerprint field and hit save).
```shell
# Create PGP Key
gpg --gen-key

# Checkout ASF dist repo
svn checkout https://dist.apache.org/repos/dist/release/airflow
cd airflow

# Add your GPG pub key to KEYS file. Replace "Kaxil Naik" with your name
(gpg --list-sigs "Kaxil Naik" && gpg --armor --export "Kaxil Naik") >> KEYS

# Commit the changes
svn commit -m "Add PGP keys of Airflow developers"
```
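Once the key exists, you can print its full fingerprint - this is the value to paste into the pgp fingerprint field on id.apache.org ("Kaxil Naik" is again a placeholder name):

```shell
# Print the full fingerprint of your key; replace the name with your own.
gpg --fingerprint "Kaxil Naik"
```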
See http://www.apache.org/dev/release-signing.html#basic-facts for more details on creating keys and what is required for signing releases.
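As a quick sketch of what the key is used for (the artifact name below is a placeholder, not a real release file), signing and verifying a release artifact looks like this:

```shell
# "apache-airflow-source.tar.gz" is a placeholder artifact name.
# Produce a detached, ASCII-armored signature next to the artifact...
gpg --armor --detach-sign apache-airflow-source.tar.gz

# ...and verify that the signature matches the artifact.
gpg --verify apache-airflow-source.tar.gz.asc apache-airflow-source.tar.gz
```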
In order not to reveal your password in plain text, it is best to create and configure API upload tokens. You can add and copy the tokens here:
Create a `~/.pypirc` file:

```ini
[distutils]
index-servers =
    pypi
    pypitest

[pypi]
username=__token__
password=<API Upload Token>

[pypitest]
repository=https://test.pypi.org/legacy/
username=__token__
password=<API Upload Token>
```
Set proper permissions for the pypirc file:

```shell
chmod 600 ~/.pypirc
```
- Install twine if you do not have it already (it can be done in a separate virtual environment):

```shell
pip install twine
```

(more details here.)
The best way to prepare and verify the releases is to prepare them on hardware owned and controlled by the committer acting as release manager. While, strictly speaking, releases must only be verified on hardware owned and controlled by the committer, for practical reasons it is best if the packages are also prepared using such hardware. More information can be found in this FAQ.