
Releases: awslabs/aws-deployment-framework

v2.1.0

27 Oct 18:47
ef6d7a4

v2.1.0

New Features:

  • Tag-based deployment targets in deployment map files. As of 2.1.0 you can use AWS Account Tags (defined in AWS Organizations) to dynamically build the stages of your pipelines. For more information, see the user guide section titled Targeting via Tags.
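
A tag-based target in a deployment map could look like the following minimal sketch; the tag key and value here are illustrative, so check the Targeting via Tags section of the user guide for the exact schema:

```yaml
  - name: sample-vpc
    default_providers:
      source:
        provider: codecommit
        properties:
          account_id: 111111111111
    targets:
      # Hypothetical tag-based target: every account carrying this
      # AWS Organizations tag becomes a stage in the pipeline.
      - tags:
          team: banking
```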

Fixes: 🐛

  • CodeBuild stages as deployment targets now work as intended and will not cause naming clashes.
  • The deployment_maps directory is no longer unintentionally removed on upgrade pull requests.
  • The cdk module and its indirect dependencies are now all pinned to 1.13.1, which avoids the error output each time a new CDK version is released.
  • The cdk synth stdout is no longer displayed in CodeBuild on the Deployment Account.

v2.0.1

15 Oct 07:48
a731537

v2.0.1 🐛

Bug Fixes

🐛 Pip was not locked at a specific version, which in turn caused sam build to break in the bootstrap repo pipeline.
🐛 Schema validation did not allow a list of account IDs or paths to be used as a target.

v2.0.0

13 Oct 16:09
ef2ed64

v2.0.0 🚀

Release Notes:

  • Detailed Serverless Application Repository (SAR) Parameter Document that highlights what the parameters do prior to deploying ADF. This can be seen when viewing ADF via the SAR.
  • PollForChanges is no longer the default way to retrieve content from CodeCommit repositories. A CloudWatch event rule is now created in source accounts that sends CodeCommit events to the Deployment account via a cross-account event bus policy. This allows for a push-based model that is more reactive to changes and more predictable.
  • Secrets Manager has replaced Parameter Store for holding sensitive values such as Slack API tokens and GitHub tokens. For more information, see the admin guide or the example deployment map, which showcases the new syntax. breaking change
  • Schema validation is now in place for deployment map files. (Please see the section below on updating the deployment map syntax.) breaking change
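
The push-based source model described above amounts to a CloudWatch Events rule in each source account forwarding CodeCommit events to the deployment account's event bus. A hedged CloudFormation sketch of that idea; the resource names and the DeploymentAccountId parameter are illustrative, not ADF's actual template:

```yaml
Parameters:
  DeploymentAccountId:
    Type: String # illustrative parameter, not part of ADF's actual template

Resources:
  # Forward CodeCommit repository state changes to the deployment
  # account's default event bus (which must permit this account
  # via a cross-account event bus policy).
  ForwardCodeCommitEvents:
    Type: AWS::Events::Rule
    Properties:
      EventPattern:
        source:
          - aws.codecommit
        detail-type:
          - CodeCommit Repository State Change
      Targets:
        - Id: SendToDeploymentAccount
          Arn: !Sub arn:aws:events:${AWS::Region}:${DeploymentAccountId}:event-bus/default
```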

Issues Resolved

Resolves #134 -> bug fixed in the package transform script.
Resolves #133 -> template_filename is now a key that can be passed in at any stage to override the filename used.
Resolves #59 -> no more polling pipelines for changes; source accounts now get a CloudWatch event stack that sends CodeCommit events across to the deployment account via a cross-account event bus policy.
Resolves #125 -> approval stages can now be inserted in the middle of change sets via a deploy type key (see the examples below).
Resolves #122 -> stack_name is now a key that can be used with the deploy type.
Resolves #56 -> no more Jinja2 for pipeline generation. The CDK has taken its place and allows more flexibility.
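
As a sketch of the new keys from #133 and #122, a target can override the template filename and stack name; the paths and filenames below are illustrative, so see the types guide for the exact property names:

```yaml
  - name: sample-vpc
    default_providers:
      deploy:
        provider: cloudformation
    targets:
      - path: /banking/testing
        properties:
          template_filename: template_testing.yml # from #133: override the template file used
          stack_name: my-vpc-stack # from #122: override the generated stack name
```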

Jinja2 has been removed from pipeline stack generation in ADF in favour of the AWS CDK.

All pipelines are now generated with the AWS CDK. As part of this work we now have extended flexibility within deployment map files.

What changes do I need to make to my deployment map files when upgrading to 2.0.x?

We have made small syntactical changes to the way deployment map file(s) are laid out. Rather than defining a top-level group of types such as 'cc-cloudformation', you can now break these into their own types for actions such as source, build, deploy and invoke. Take a look at a pre-2.0 version of a pipeline in the deployment map:

  - name: sample-vpc
    type: cc-cloudformation
    params:
      - SourceAccountId: 111111111111
      - RestartExecutionOnUpdate: True
    targets:
      - 123456789101
      - /banking/production

This same pipeline as defined in 2.0 will look like this:

  - name: sample-vpc
    default_providers:
      source:
        provider: codecommit
        properties:
          account_id: 111111111111
    params:
        restart_execution_on_update: True
    targets:
      - 123456789101
      - /banking/production

We now have the ability to define different providers for different stages within the pipeline, which allows for more flexibility. These providers can be defined at the top level of a pipeline under the default_providers key, or at the target level within the targets array. This means the first stage of a pipeline can deploy CloudFormation while the second stage invokes a Lambda function, or any other supported CodePipeline provider. All params and provider syntax is now snake case and consists of key/value pairs, as opposed to items within a list using camel case. The params key is now specifically for pipeline parameters and no longer couples in parameters related to the source, build, test or deploy phases. We have also created a types guide to help show which keys are possible within specific stages in the map files.

Another example of a new 2.0 pipeline:

  - name: sample-iam
    default_providers:
      source:
        provider: codecommit
        properties:
          account_id: 111111111111
      build:
        provider: codebuild # CodeBuild is also the default if this key is omitted
      deploy:
        provider: cloudformation # CloudFormation is also the default if this key is omitted
    params:
        notification_endpoint: [email protected]
        restart_execution_on_update: True
    targets:
      - /banking/testing # will use deploy action from type defined above (CloudFormation)
      - path: /banking/production
        properties:
          stack_name: my-cool-iam-stack # Since CloudFormation is defined above as the deploy type, these target level keys are assumed to be CloudFormation properties.
          change_set_approval: True # Inserts an approval in between the create + execute change set actions.
      - provider: lambda # https://docs.aws.amazon.com/codepipeline/latest/userguide/actions-invoke-lambda-function.html
        properties:
          input: {"name": "jon_doe"} # This input will be passed to the function as a string
          function_name: my_lambda_function

The above example will create a pipeline that deploys CloudFormation for the first two stages and then invokes a Lambda function as the last stage. The path/target key is no longer mandatory, as some stages do not relate to an OU path or a target AWS account. For more information on the keys available within specific types, see the types guide in the docs.

If you have any issues restructuring to this new pipeline syntax, please create an issue on GitHub with an example of the pipeline definition.

Parameter Store swapped with Secrets Manager for GitHub OAuth tokens

GitHub OAuth tokens are now expected to be stored in Secrets Manager and fetched using the new syntax as follows:

  - name: sample-vpc
    default_providers:
      source:
        provider: github
        properties:
          repository: example-vpc-adf
          owner: bundyfx
          oauth_token_path: /adf/github_token # The path in AWS Secrets Manager that holds the GitHub Oauth token, ADF only has access to /adf/ prefix in Secrets Manager
          json_field: token # The field (key) name of the json object stored in AWS Secrets Manager that holds the Oauth token
      deploy:
        provider: cloudformation
        properties:
          action: replace_on_failure
    params:
        notification_endpoint: [email protected]
    targets:
      - path: /banking/testing
        name: fancy-name

In the above example, you would store your token at the path /adf/github_token in Secrets Manager as a JSON object such as {"token": "123mytoken"}.
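
For illustration, such a secret could be created with the AWS CLI; the path and token value here are the placeholders from the example above:

```shell
# Store the GitHub OAuth token as a JSON object in Secrets Manager.
# ADF only has access to secrets under the /adf/ prefix.
aws secretsmanager create-secret \
  --name /adf/github_token \
  --secret-string '{"token": "123mytoken"}'
```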

v1.2.7

20 Aug 07:33
12d285f

v1.2.7

  • Minor 🐛fix - Parameter merging / ordering - Resolves #132

v1.2.6

17 Aug 13:46
11abaa2

v1.2.6

  • Resolves #132 🐛 - import:, upload: and resolve: should now correctly merge into the parameter file as intended.
  • Resolves #116 - Thanks @javydekoning - Can now pass in a repository name for Github as opposed to relying on the pipeline name.
  • Resolves #119 - Thanks @javydekoning - Parameters for removed pipelines now clean up correctly.
  • Resolves #126 - contains_transform works as intended with top-level regions and target-level regions.
  • A cumulative set of rolled-up typos and minor changes from v1.2.4 and v1.2.5, which were not official releases.

v1.2.3

29 Jul 13:45
beeb613

v1.2.3

Patch release to fix a bug in the initial commit Lambda that caused the global.yml and regional.yml (in the deployment folder) to not be committed to the bootstrap repository on first deploy, which in turn caused the bootstrap pipeline to fail since it could not find the global.yml.

v1.2.2

29 Jul 10:04
d9f28c1

v1.2.2

Patch-only release with a single bug fix for build-only pipelines that had an unassigned variable reference error.

v1.2.1

28 Jul 15:41
91cf4d2

v1.2.1 🚀

  • Resolves #105 - You can now pass in custom deployment roles for an entire pipeline or for specific steps in a pipeline; see the docs for more information on the deployment_role key within deployment map files.
  • Resolves #104 - You can now choose the type of url path that is returned when using the upload: intrinsic function. This can be either 'path' or 'virtual-hosted'. See the docs for the upload intrinsic function for more details.
  • Resolves #103 - You can now attach multiple pipelines to a single repository by passing in a RepositoryName parameter.
  • Resolves #85 - Changes should not be made to global.yml or regional.yml between version upgrades. Unless there are major changes to deployment/global.yml, we will not suggest changes to this file in automated pull requests when deploying via the SAR.
  • Resolves #84 - We have now documented the delete scenario for ADF.
  • Resolves #77 - Pipelines are now generated in their own thread, making this process much faster.

Other Changes:

  • Updated user-guide and admin-guide accordingly (Highlighting YAML Anchors etc)
  • Allowing the ability to pass in a custom build role

v1.2.0

17 Jul 13:45
17d411e

v1.2.0

Resolves #89 - A custom deploy pipeline can now be used for custom deployment actions (Terraform etc.) - Thanks @triha74
Resolves #88 - CodeCommit repo creation is now automated if using CodeCommit as a source - Thanks @triha74
Resolves #87 - The correct region must be used when deploying via SAR
Resolves #76 - We now support optional resolve and import intrinsic functions for values ending in '?'. For example, you can import a value that may or may not exist from another stack in another account; if the value is not exported (does not exist) on the target stack, using a '?' at the end of the import/resolve string will return an empty string.
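
As a sketch of the optional import syntax from #76, a CloudFormation parameter file might contain the following; the account ID, region, stack and export names are all illustrative:

```yaml
Parameters:
  # Optional import: resolves to the exported value if present, or to an
  # empty string (thanks to the trailing '?') if the export does not exist.
  SharedBucketName: import:111111111111:eu-west-1:shared-storage:bucket-name?
```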

Description of changes:

A large change introduced in v1.2.0 is that there is no longer a need for the adf-build folder on the deployment pipelines repository. It is now removed as part of the pull request the SAR deployment makes. This keeps the pipelines repository rather slim and removes the need to merge multiple PRs (master and deployment CodeCommit repos) each time there is a deployment of ADF-specific logic.

Another change introduced in 1.2.0 is the ability to pass in a role of your choice to use as the deployment role, as opposed to adf-cloudformation-deployment-role. CodePipeline will default back to this role if none is specified; however, this allows users to define custom roles if they desire and pass in the stage parameter DeployRole with a custom role name.

Bug Fixes

  • Fixed a bug that caused a 'None' entry for the S3 Bucket and KMS ARN to be inserted into Parameter Store if no target regions were specified
  • Fixed a bug where KeyErrors were raised when using the Slack bootstrap notifications
  • Fixed a bug that caused PRs against the deployment pipelines repo to be inconsistent
  • Fixed a bug that caused CodeCommit commits to fail if more than 100 files were contained within the commit


v1.1.0

30 Jun 09:51
41ccf0c

v1.1.0 🚀

Resolves #78 - Multiple deployment map files are now possible, thanks to @triha74 - If you wish to split your map into multiple smaller maps based on teams, types or services, you can create additional map files in a folder called 'deployment_maps' in the pipelines repository.

Resolves #82 - Pipeline completion triggers are now available. This allows you to trigger (aka chaining) other pipelines based on the completion of a pipeline. See the docs for examples.
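
A completion trigger might look like the following sketch in a deployment map (pipeline names are illustrative; see the docs for the exact syntax):

```yaml
  - name: sample-vpc
    type: cc-cloudformation
    params:
      - SourceAccountId: 111111111111
    # Hypothetical completion trigger: start sample-iam whenever
    # this pipeline finishes successfully.
    completion_trigger:
      pipelines:
        - sample-iam
```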

Resolves #73 - Parameter files for CloudFormation in CodePipeline can now be written in YAML (e.g. global.yml, account-blah.yml). This works in the same manner with inheritance as it does with JSON.

Description of changes:

New Functionality:

  • Completion triggers on pipelines are a new feature that chains pipelines together, starting the execution of one or more pipelines when another finishes.
  • ADF now supports multiple deployment map files that live in a folder called deployment_maps on the pipelines repository (deployment account), which can help separate and reduce complexity in large environments. These files can be named anything as long as they end with .yml.
  • CloudFormation parameter files can now be written in YAML as well as JSON.
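
A YAML parameter file equivalent to its JSON counterpart might look like this sketch; the parameter names and values are illustrative, assuming the same Parameters and Tags structure used by the JSON parameter files:

```yaml
# Hypothetical global.yml parameter file; keys and values are illustrative.
Parameters:
  Environment: testing
  InstanceType: t3.micro
Tags:
  Team: banking
```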

Other Changes:

  • Deployment Account now has a CodeCommit role as part of its base stack. This allows the use of the deployment account as a Source account for pipelines.
  • Automatic updating of the KMS key Parameter Store value on target accounts (a base stack parameter) if the key changes on the deployment account.