ci(docker): use gcp artifact-registry
Henry Lee committed Aug 31, 2024
1 parent 011209e commit 534a055
Showing 4 changed files with 49 additions and 46 deletions.
68 changes: 25 additions & 43 deletions .github/workflows/dockerimage.yml
```diff
@@ -1,65 +1,47 @@
 name: Docker Image CI
 
 on:
   push:
     branches: [ master, prod ]
   pull_request:
     branches: [ master, prod ]
 env:
-  RC_NAME: davidtnfsh/pycon_etl
-
+  RC_NAME: asia-east1-docker.pkg.dev/${{ secrets.GCP_PROJECT_ID }}/data-team/pycon-etl
 jobs:
   build:
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v2
-      - name: Login to docker hub
-        uses: actions-hub/docker/login@master
-        env:
-          DOCKER_USERNAME: ${{ secrets.DOCKER_USERNAME }}
-          DOCKER_PASSWORD: ${{ secrets.DOCKER_PASSWORD }}
-
+      - uses: actions/checkout@v4
+      - name: Authenticate to Google Cloud
+        uses: google-github-actions/auth@v1
+        with:
+          credentials_json: ${{ secrets.GCP_SERVICE_ACCOUNT_KEY }}
+      - name: Configure docker to use gcloud command-line tool as a credential helper
+        run: |
+          gcloud auth configure-docker
       - name: Pull cache
         run: |
-          docker login -u ${{ secrets.DOCKER_USERNAME }} -p ${{ secrets.DOCKER_PASSWORD }}
-          docker pull ${RC_NAME}:cache
+          docker pull ${RC_NAME}:cache || true
       - name: Build the Docker image
         if: always()
         run: |
-          docker build -t ${RC_NAME}:${GITHUB_SHA} --cache-from ${RC_NAME}:cache .
-          docker tag ${RC_NAME}:${GITHUB_SHA} ${RC_NAME}:cache
+          docker build -t ${RC_NAME}:cache --cache-from ${RC_NAME}:cache .
           docker build -t ${RC_NAME}:test --cache-from ${RC_NAME}:cache -f Dockerfile.test .
-          docker tag ${RC_NAME}:${GITHUB_SHA} ${RC_NAME}:staging
-          docker tag ${RC_NAME}:${GITHUB_SHA} ${RC_NAME}:latest
       - name: Run test
         run: |
-          docker run -d --rm -p 8080:8080 --name airflow -v $(pwd)/dags:/usr/local/airflow/dags -v $(pwd)/fixtures:/usr/local/airflow/fixtures ${RC_NAME}:test webserver
+          docker run -d --rm -p 8080:8080 --name airflow -v $(pwd)/dags:/opt/airflow/dags -v $(pwd)/fixtures:/opt/airflow/fixtures ${RC_NAME}:test webserver
           sleep 10
           docker exec airflow bash -c "airflow test OPENING_CRAWLER_V1 CRAWLER 2020-01-01"
           docker exec airflow bash -c "airflow test QUESTIONNAIRE_2_BIGQUERY TRANSFORM_data_questionnaire 2020-09-29"
-      - name: Push Cache to docker registry
-        uses: actions-hub/docker@master
-        if: always()
-        with:
-          args: push ${RC_NAME}:cache
-
-      - name: Push GITHUB_SHA to docker registry
-        uses: actions-hub/docker@master
-        if: always()
-        with:
-          args: push ${RC_NAME}:${GITHUB_SHA}
-
-      - name: Push staging to docker registry
-        uses: actions-hub/docker@master
-        if: ${{ github.ref == 'refs/heads/master' }} && success()
-        with:
-          args: push ${RC_NAME}:staging
-
-      - name: Push prod version to docker registry
-        uses: actions-hub/docker@master
+      - name: Push cache to Google Container Registry
+        if: success()
+        run: |
+          docker push ${RC_NAME}:cache
+      - name: Push staging to Google Container Registry
+        if: github.ref == 'refs/heads/master' && success()
+        run: |
+          docker tag ${RC_NAME}:cache ${RC_NAME}:staging
+          docker push ${RC_NAME}:staging
+      - name: Push prod version to Google Container Registry
+        if: github.ref == 'refs/heads/prod' && success()
-        with:
-          args: push ${RC_NAME}:latest
+        run: |
+          docker tag ${RC_NAME}:cache ${RC_NAME}:latest
+          docker push ${RC_NAME}:latest
```
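The cache-then-tag flow above can be traced with a small POSIX shell sketch. Here `docker` is a dry-run stub that only records the commands (so the sketch runs without Docker), and the project path in `RC_NAME` is a placeholder rather than the secret-backed value the workflow uses:

```shell
#!/bin/sh
# Dry-run sketch of the workflow's cache/build/push fan-out.
RC_NAME="asia-east1-docker.pkg.dev/example-project/data-team/pycon-etl"  # placeholder project id
GITHUB_REF="refs/heads/master"  # simulate a push to master

LOG=""
docker() {  # stub: record each command instead of executing it
  LOG="${LOG}docker $*
"
}

docker pull "${RC_NAME}:cache" || true
docker build -t "${RC_NAME}:cache" --cache-from "${RC_NAME}:cache" .
docker build -t "${RC_NAME}:test" --cache-from "${RC_NAME}:cache" -f Dockerfile.test .
docker push "${RC_NAME}:cache"

case "$GITHUB_REF" in
  refs/heads/master) TAG="staging" ;;  # master -> staging image
  refs/heads/prod)   TAG="latest"  ;;  # prod   -> production image
  *)                 TAG=""        ;;  # other refs push nothing
esac
if [ -n "$TAG" ]; then
  docker tag "${RC_NAME}:cache" "${RC_NAME}:${TAG}"
  docker push "${RC_NAME}:${TAG}"
fi
printf '%s' "$LOG"
```

Because every environment tag is derived from the freshly built `cache` image, a branch push always publishes exactly one environment tag plus the cache layer.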
23 changes: 22 additions & 1 deletion README.md
````diff
@@ -82,7 +82,7 @@ Below are the steps to manage a virtual environment using `venv`:
 ### BigQuery (Optional)
 
 Set up the Authentication for GCP: <https://googleapis.dev/python/google-api-core/latest/auth.html>
-* After running `gcloud auth application-default login`, you will get a credentials.json file located at `$HOME/.config/gcloud/application_default_credentials.json`. Run `export GOOGLE_APPLICATION_CREDENTIALS="/path/to/keyfile.json"` if you have it.
+*After running `gcloud auth application-default login`, you will get a credentials.json file located at `$HOME/.config/gcloud/application_default_credentials.json`. Run `export GOOGLE_APPLICATION_CREDENTIALS="/path/to/keyfile.json"` if you have it.
 * service-account.json: Please contact @david30907d via email or Discord. You do not need this json file if you are running the sandbox staging instance for development.
 
 ## Running the Project
@@ -111,6 +111,27 @@ make down-dev
 
 > The difference between production and dev/test compose files is that the dev/test compose file uses a locally built image, while the production compose file uses the image from Docker Hub.
 
+If you are an authorized maintainer, you can pull the image from the GCP Artifact Registry.
+
+The Docker client must first be configured to use the GCP Artifact Registry:
+
+```bash
+gcloud auth configure-docker asia-east1-docker.pkg.dev
+```
+
+Then, pull the image:
+
+```bash
+docker pull asia-east1-docker.pkg.dev/pycontw-225217/data-team/pycon-etl:{tag}
+```
+
+There are several tags available:
+
+- `cache`: a build-cache image used to speed up subsequent builds
+- `test`: for testing purposes, includes the test dependencies
+- `staging`: the image deployed to the staging environment
+- `latest`: the image deployed to the production environment
+
 ### Production
 
 Please check the [Production Deployment Guide](./docs/DEPLOYMENT.md).
````
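For context on the `gcloud auth configure-docker asia-east1-docker.pkg.dev` step added to the README: it registers `gcloud` as a Docker credential helper for that registry host by adding an entry to `~/.docker/config.json`, roughly like the fragment below (a sketch of the relevant key only, not the whole file):

```json
{
  "credHelpers": {
    "asia-east1-docker.pkg.dev": "gcloud"
  }
}
```

After this entry exists, `docker pull` and `docker push` against that host transparently obtain short-lived tokens from the active gcloud account.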
2 changes: 1 addition & 1 deletion docker-compose-dev.yml
```diff
@@ -2,7 +2,7 @@ version: '3.4'
 
 x-docker-common: &docker-common
   env_file: .env.staging
-  image: pycon_etl
+  image: pycon-etl
   build:
     context: .
     dockerfile: Dockerfile.test
```
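The `x-docker-common: &docker-common` block touched here is a YAML anchor that the compose files use to share one service definition. A minimal sketch of how a service merges it in (the service name and command below are illustrative, not taken from the repo):

```yaml
x-docker-common: &docker-common
  env_file: .env.staging
  image: pycon-etl
  build:
    context: .
    dockerfile: Dockerfile.test

services:
  webserver:            # illustrative service name
    <<: *docker-common  # merge key: inherits env_file, image, and build
    command: webserver
```

Renaming the image inside the anchor therefore updates every service that references `*docker-common` in one place.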
2 changes: 1 addition & 1 deletion docker-compose.yml
```diff
@@ -2,7 +2,7 @@ version: '3.4'
 
 x-docker-common: &docker-common
   env_file: .env.production
-  image: davidtnfsh/pycon_etl:prod
+  image: asia-east1-docker.pkg.dev/pycontw-225217/data-team/pycon-etl:latest
   volumes:
     - ./service-account.json:/opt/airflow/service-account.json
     - airflow-db-volume:/opt/airflow/
```
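Artifact Registry references follow the pattern `LOCATION-docker.pkg.dev/PROJECT/REPOSITORY/IMAGE:TAG`. The production image reference above can be decomposed with plain shell parameter expansion, a handy sanity check when wiring new registries into compose files (note this simple split assumes the registry host has no port and the reference ends in an explicit tag):

```shell
#!/bin/sh
# Split an Artifact Registry image reference into its parts.
IMAGE="asia-east1-docker.pkg.dev/pycontw-225217/data-team/pycon-etl:latest"

REGISTRY="${IMAGE%%/*}"   # host: asia-east1-docker.pkg.dev
REST="${IMAGE#*/}"        # pycontw-225217/data-team/pycon-etl:latest
PROJECT="${REST%%/*}"     # GCP project id
TAG="${IMAGE##*:}"        # tag after the last colon
NAME="${IMAGE%:*}"        # full repository path without the tag

echo "registry=$REGISTRY project=$PROJECT tag=$TAG"
```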
