
# How to deploy fabric8-analytics services on OpenShift

## Install required tools

Use your preferred package manager to install `aws-cli`, `psql`, `origin-clients` and `pwgen`.

If you are running Fedora, the following command will do the trick:

```
$ sudo dnf install awscli pwgen postgresql origin-clients
```

Mac users will also need to install `gawk` from brew.

If you are running macOS, the following commands will do the trick:

```
$ brew install awscli
$ brew install postgres
$ brew install openshift-cli
$ brew install pwgen
$ brew install gawk
```
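As a quick sanity check (not part of the original instructions), you can verify that all four CLIs are on your `PATH` before continuing:

```shell
# Report which of the required tools are installed; 'oc' comes from
# origin-clients/openshift-cli, 'aws' from awscli.
for tool in aws psql oc pwgen; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found:   $tool"
  else
    echo "missing: $tool"
  fi
done
```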

For Red Hat employees: please refer to Requesting AWS Access.

## Signing in

Sign in to https://devtools-dev.ext.devshift.net:8443/console/catalog with your GitHub account.

In the dev console overview, make sure all your services are up and running with at least one pod instance.

## Configure fabric8-analytics services

The `deploy.sh` script expects to find its configuration in the `env.sh` file. The easiest way to create the configuration file is to copy `env-template.sh` and modify it:

```
$ cd openshift
$ cp env-template.sh env.sh
$ vim env.sh
```

Edit `env.sh` to add the required credentials and tokens:

- Get `OC_TOKEN` from the top-right dropdown menu in the dev console by clicking "Copy Login Command"
- Get `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` from the AWS Console
- Generate a new password and set `RDS_PASSWORD`
- Get `GITHUB_API_TOKENS` from https://github.com/settings/tokens
- Get `LIBRARIES_IO_TOKEN` from https://libraries.io/account
- For `GITHUB_OAUTH_CONSUMER_KEY` and `GITHUB_OAUTH_CONSUMER_SECRET`, refer to the comments in `env.sh`
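For `RDS_PASSWORD`, a minimal sketch of generating a random 32-character password, preferring the `pwgen` tool installed earlier; the `/dev/urandom` fallback is an assumption of this sketch, not part of the original instructions:

```shell
# Prefer pwgen (installed above); otherwise read random bytes from /dev/urandom.
if command -v pwgen >/dev/null 2>&1; then
  RDS_PASSWORD=$(pwgen -s 32 1)   # -s: generate a fully random "secure" password
else
  RDS_PASSWORD=$(LC_ALL=C tr -dc 'A-Za-z0-9' </dev/urandom | head -c 32)
fi
echo "$RDS_PASSWORD"
```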

For other keys and values, refer to the comments in `env.sh`.

For Red Hatters: if your Kerberos ID and GitHub username differ, set `OC_PROJECT="[your_kerberos_id]-fabric8-analytics"`.
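Putting it together, a filled-in `env.sh` might look roughly like the sketch below. Every value is a placeholder, and the authoritative set of variables is defined by `env-template.sh`:

```shell
# env.sh -- illustrative placeholders only; copy env-template.sh and fill in your own values
export OC_TOKEN="REPLACE_WITH_TOKEN_FROM_COPY_LOGIN_COMMAND"
export OC_PROJECT="yourname-fabric8-analytics"
export AWS_ACCESS_KEY_ID="REPLACE_WITH_ACCESS_KEY_ID"
export AWS_SECRET_ACCESS_KEY="REPLACE_WITH_SECRET_ACCESS_KEY"
export RDS_PASSWORD="REPLACE_WITH_GENERATED_PASSWORD"
export GITHUB_API_TOKENS="REPLACE_WITH_GITHUB_TOKEN"
export LIBRARIES_IO_TOKEN="REPLACE_WITH_LIBRARIES_IO_TOKEN"
```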

## Deploy fabric8-analytics services

Just run the deploy script and enjoy:

```
$ ./deploy.sh
```

If you have run the script before, a `$OC_PROJECT` project already exists and the script purges it to start from scratch. If you also want to purge previously allocated AWS resources (RDS database, SQS queues, S3 buckets, DynamoDB tables), use:

```
$ ./deploy.sh --purge-aws-resources
```

Once you no longer need the fabric8-analytics deployment, you can run

```
$ ./cleanup.sh
```

to remove the OpenShift project and all allocated AWS resources.

## FAQ

1. In the dev console, the `bayesian-data-importer` service is down.

   Cause: some data in your DynamoDB tables is messed up.

   Resolution: completely remove only your own tables from AWS DynamoDB, i.e. tables prefixed with `your_kerberos_*`. Redeploy `bayesian-gremlin-http` to recreate the tables and then redeploy `bayesian-data-importer`.
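The table cleanup described above can be scripted with the AWS CLI. This is a hedged sketch (the `your_kerberos_id` prefix is a placeholder, and the dry-run `echo` is an assumption of this sketch) that only prints the delete commands until you remove the `echo`:

```shell
# Dry run: print a delete command for each DynamoDB table matching your prefix.
PREFIX="your_kerberos_id"
for table in $(aws dynamodb list-tables --query 'TableNames[]' --output text 2>/dev/null); do
  case "$table" in
    "$PREFIX"*)
      # Remove the leading 'echo' to actually delete the table.
      echo aws dynamodb delete-table --table-name "$table"
      ;;
  esac
done
```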

## Test not-yet-merged changes

### Build in CI

Assume you have opened a PR in one of the fabric8-analytics repositories. Once the tests are green, CentosCI will build your image and comment on the PR:

```
Your image is available in the registry: docker pull registry.devshift.net/fabric8-analytics/worker-scaler:SNAPSHOT-PR-25
```

To update your dev deployment to use the above-mentioned image, you can use one of the following ways:

- `oc edit` from the command line
- the editor in the web interface: Applications -> Deployments -> select deployment -> Actions -> Edit YAML
- edit `deploy.sh`: add `-p IMAGE_TAG=SNAPSHOT-PR-25` (with the correct tag) to the corresponding `oc_process_apply` call at the end of the file and (re-)run `./deploy.sh`

### Build in OpenShift

In `configure_os_builds.sh`, update the `remotes` value so it contains your GitHub account name. The local variable `templates` defines all the repositories that will be cloned and built using OpenShift Docker builds.

### Update deployments to use imagestreams

After a successful build of all required images, you need to update all deployments to use the newly built image streams.

## E2E tests

### Configure OSIO token

If you want to run the E2E tests, you will need to configure the `RECOMMENDER_API_TOKEN` variable in your `env.sh` file. You can get the token on your openshift.io profile page after clicking the "Update Profile" button.

### Run E2E tests against your deployment

First clone the E2E tests repository, if you haven't done so already:

```
$ git clone git@github.com:fabric8-analytics/fabric8-analytics-common.git
```

Then prepare your environment (you'll need your API token for this, see the previous section):

```
$ source env.sh
```

And finally run the tests in the same terminal window:

```
$ cd fabric8-analytics-common/integration-tests/
$ ./runtest.sh
```

## Dockerized deployment scripts

There is also a `Dockerfile` and a `Makefile` for running these scripts in a Docker container, so you can avoid installing the required tools locally. Just prepare your `env.sh` and run:

- `make deploy` to (re-)deploy to OpenShift
- `make clean-deploy` to purge the fabric8-analytics project from OpenShift along with the allocated AWS resources and (re-)deploy
- `make clean` to remove the fabric8-analytics project from OpenShift along with the allocated AWS resources