
Deployment of htsget-lambda

The htsget-lambda crate is a cloud-based implementation of htsget-rs. It uses AWS Lambda as the ticket server, and AWS S3 as the data block server.

This is an example that deploys htsget-lambda using aws-cdk. It is deployed as an AWS HTTP API Gateway Lambda proxy integration. The stack uses RustFunction to integrate htsget-lambda with API Gateway, uses a JWT authorizer with AWS Cognito as the issuer, and routes requests to the htsget-rs server using AWS Route 53.
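The sketch below shows roughly what such a stack can look like. It is a minimal illustration only, assuming CDK v2 with the aws-apigatewayv2 modules (older CDK versions shipped these as -alpha packages) and the RustFunction construct from cargo-lambda-cdk; the construct ids and the <...> placeholders are not taken from this repository and would normally be filled from the SSM parameters described under Configuration.

import { Stack, StackProps } from 'aws-cdk-lib';
import { Construct } from 'constructs';
import { HttpApi, DomainName } from 'aws-cdk-lib/aws-apigatewayv2';
import { HttpJwtAuthorizer } from 'aws-cdk-lib/aws-apigatewayv2-authorizers';
import { HttpLambdaIntegration } from 'aws-cdk-lib/aws-apigatewayv2-integrations';
import { Certificate } from 'aws-cdk-lib/aws-certificatemanager';
import { ARecord, HostedZone, RecordTarget } from 'aws-cdk-lib/aws-route53';
import { ApiGatewayv2DomainProperties } from 'aws-cdk-lib/aws-route53-targets';
import { RustFunction } from 'cargo-lambda-cdk';

export class HtsgetStack extends Stack {
  constructor(scope: Construct, id: string, props?: StackProps) {
    super(scope, id, props);

    // Lambda function compiled from the htsget-lambda crate.
    const handler = new RustFunction(this, 'HtsgetLambda', {
      manifestPath: '..',          // directory containing the workspace Cargo.toml
      binaryName: 'htsget-lambda',
    });

    // JWT authorizer with the Cognito user pool as the issuer.
    const authorizer = new HttpJwtAuthorizer(
      'HtsgetAuthorizer',
      `https://cognito-idp.${this.region}.amazonaws.com/<cog_user_pool_id>`,
      { jwtAudience: ['<jwt_aud>'] },
    );

    // Custom domain backed by the ACM certificate.
    const domainName = new DomainName(this, 'HtsgetDomain', {
      domainName: '<htsget_domain>',
      certificate: Certificate.fromCertificateArn(this, 'Cert', '<arn_cert>'),
    });

    // HTTP API with the Lambda proxy integration as the default route.
    new HttpApi(this, 'HtsgetApi', {
      defaultIntegration: new HttpLambdaIntegration('HtsgetIntegration', handler),
      defaultAuthorizer: authorizer,
      defaultDomainMapping: { domainName },
    });

    // Route 53 alias record pointing the domain at the API's regional endpoint.
    const zone = HostedZone.fromHostedZoneAttributes(this, 'Zone', {
      hostedZoneId: '<hosted_zone_id>',
      zoneName: '<hosted_zone_name>',
    });
    new ARecord(this, 'HtsgetRecord', {
      zone,
      recordName: '<htsget_domain>',
      target: RecordTarget.fromAlias(
        new ApiGatewayv2DomainProperties(domainName.regionalDomainName, domainName.regionalHostedZoneId),
      ),
    });
  }
}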

Configuration

To configure the deployment, change the config files in the config directory. There are two configuration files, corresponding to the deployed environments. The config file used for deployment is selected by passing --context "env=dev" or --context "env=prod" to cdk. When no context parameter is supplied, the default context is dev.
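For reference, the env context can be read in the CDK app roughly as follows. This is a sketch only; the config file name used here is hypothetical and should match whatever files exist under the config directory.

import { App } from 'aws-cdk-lib';

const app = new App();

// "--context env=prod" on the command line overrides the default of "dev".
const env: string = app.node.tryGetContext('env') ?? 'dev';

// Hypothetical file naming; use the actual names under config/.
const configPath = `config/${env}.toml`;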

These config files configure htsget-lambda. See htsget-config for a list of available configuration options.

There is additional config that modifies the AWS infrastructure surrounding htsget-lambda. This config is sourced from AWS Systems Manager Parameter Store. Parameters are fetched using StringParameter.valueFromLookup, which takes a parameter name. The properties under parameter_store_names in cdk.json hold these parameter names, which correspond to the following values (a sketch of the lookup follows the table):

| Property storing SSM parameter name | Description of SSM parameter |
| --- | --- |
| arn_cert | The ARN for the ACM certificate of the htsget domain. |
| jwt_aud | The JWT audience for the token. |
| cog_user_pool_id | The Cognito user pool id which controls authorization. |
| htsget_domain | The domain name for the htsget server. |
| hosted_zone_id | The Route 53 hosted zone id for the htsget server. |
| hosted_zone_name | The Route 53 hosted zone name for the htsget server. |

Modify these properties to change which parameter the values come from. SSM parameters can also be cached locally using the CDK runtime context.
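As a rough illustration of the lookup, assuming parameter_store_names is exposed through the CDK context (the example parameter name in the comment is hypothetical):

import { Stack } from 'aws-cdk-lib';
import { StringParameter } from 'aws-cdk-lib/aws-ssm';

declare const stack: Stack;

// Names sourced from parameter_store_names in cdk.json,
// e.g. { "arn_cert": "/htsget/acm_cert_arn", ... } (hypothetical values).
const names = stack.node.tryGetContext('parameter_store_names');

const arnCert = StringParameter.valueFromLookup(stack, names.arn_cert);
const jwtAud = StringParameter.valueFromLookup(stack, names.jwt_aud);
const cogUserPoolId = StringParameter.valueFromLookup(stack, names.cog_user_pool_id);
const htsgetDomain = StringParameter.valueFromLookup(stack, names.htsget_domain);
const hostedZoneId = StringParameter.valueFromLookup(stack, names.hosted_zone_id);
const hostedZoneName = StringParameter.valueFromLookup(stack, names.hosted_zone_name);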

Deploying

Prerequisites

  • aws-cli should be installed and authenticated in the shell.
  • Node.js and npm should be installed.
  • Rust should be installed.

After installing the basic dependencies, complete the following steps:

  1. Add the arm64 cross-compilation target to Rust.
  2. Install Zig using one of the methods shown in getting started, or by running the commands below and following the prompts. Zig is used by cargo-lambda for cross-compilation.
  3. Install cargo-lambda, which is used to compile the artifacts that are uploaded to AWS Lambda.
  4. Install the packages in this directory and compile htsget-lambda. This should place artifacts compiled for arm64 under the target/lambda directory, which can be deployed to AWS.

Below is a summary of commands to run in this directory:

rustup target add aarch64-unknown-linux-gnu
cargo install cargo-lambda
npm install

cd ..
cargo lambda build --release --arm64 --bin htsget-lambda --features s3-storage
cd deploy

Deploy to AWS

CDK will run many of the commands above again. However, it is recommended to run them once before trying the commands below, to ensure that prerequisites are met.

CDK should be bootstrapped once, if this hasn't been done before.

npx cdk bootstrap

To deploy, first check that the stack synthesizes correctly, then deploy it:

npx cdk synth
npx cdk deploy

Testing the endpoint

When the deployment is finished, the htsget endpoint can be tested by querying it. Since a JWT authorizer is used, a valid JWT token must be obtained in order to access the endpoint. This token should be obtained from AWS Cognito using the configured user pool id and audience parameters. Then curl can be used to query the endpoint:

curl -H "Authorization: <JWT Token>" "https://<htsget_domain>/reads/service-info"

With a possible output:

{
  "id": "",
  "name": "",
  "version": "",
  "organization": {
    "name": "",
    "url": ""
  },
  "type": {
    "group": "",
    "artifact": "",
    "version": ""
  },
  "htsget": {
    "datatype": "reads",
    "formats": ["BAM", "CRAM"],
    "fieldsParametersEffective": false,
    "TagsParametersEffective": false
  },
  "contactUrl": "",
  "documentationUrl": "",
  "createdAt": "",
  "UpdatedAt": "",
  "environment": ""
}
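
One way to obtain a token for testing is to authenticate a test user against the user pool's app client with the AWS SDK. This is a sketch only, assuming an app client with the USER_PASSWORD_AUTH flow enabled; the client id, username, and password placeholders are not part of this deployment's config.

import {
  CognitoIdentityProviderClient,
  InitiateAuthCommand,
} from '@aws-sdk/client-cognito-identity-provider';

// Authenticate a test user and print a token that can be passed to curl above.
// Depending on how the authorizer's audience is configured, either the access
// token or the ID token from AuthenticationResult may be the appropriate one.
async function fetchToken(): Promise<string> {
  const client = new CognitoIdentityProviderClient({});
  const response = await client.send(
    new InitiateAuthCommand({
      AuthFlow: 'USER_PASSWORD_AUTH',
      ClientId: '<app_client_id>', // hypothetical placeholder
      AuthParameters: { USERNAME: '<username>', PASSWORD: '<password>' },
    }),
  );
  return response.AuthenticationResult?.AccessToken ?? '';
}

fetchToken().then(console.log);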

Local testing

The Lambda function can also be run locally using cargo-lambda. From the root project directory, execute the following command.

cargo lambda watch

Then, in a separate terminal session, run the following command.

cargo lambda invoke htsget-lambda --data-file data/events/event_get.json

Examples of different Lambda events are located in the data/events directory.