Adding eventbridge-pipes-ddbstream-sfn-terraform pattern #2445

Open · wants to merge 2 commits into base: main
93 changes: 93 additions & 0 deletions eventbridge-pipes-ddbstream-sfn-terraform/README.md
@@ -0,0 +1,93 @@
# DynamoDB Stream to Step Functions with EventBridge Pipes

This pattern shows how to use Amazon EventBridge Pipes to connect Amazon DynamoDB streams with AWS Step Functions and launch a state machine.

![Pipes diagram](./ArchDiagram.png)

Learn more about this pattern at Serverless Land Patterns: https://serverlessland.com/patterns/eventbridge-pipes-ddbstream-sfn-terraform

Important: this application uses various AWS services and there are costs associated with these services after the Free Tier usage - please see the [AWS Pricing page](https://aws.amazon.com/pricing/) for details. You are responsible for any AWS costs incurred. No warranty is implied in this example.

## Requirements

* [Create an AWS account](https://portal.aws.amazon.com/gp/aws/developer/registration/index.html) if you do not already have one and log in. The IAM user that you use must have sufficient permissions to make necessary AWS service calls and manage AWS resources.
* [AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/install-cliv2.html) installed and configured
* [Git Installed](https://git-scm.com/book/en/v2/Getting-Started-Installing-Git)
* [AWS Serverless Application Model](https://docs.aws.amazon.com/serverless-application-model/latest/developerguide/serverless-sam-cli-install.html) (AWS SAM) installed
* [Terraform](https://learn.hashicorp.com/tutorials/terraform/install-cli?in=terraform/aws-get-started) installed

## Deployment Instructions

1. Create a new directory, navigate to that directory in a terminal and clone the GitHub repository:
```
git clone https://github.com/aws-samples/serverless-patterns
```
2. Change directory to the pattern directory:
```
cd eventbridge-pipes-ddbstream-sfn-terraform
```
3. From the command line, initialize Terraform to download and install the providers defined in the configuration:
```
terraform init
```
4. From the command line, apply the configuration in the main.tf file:
```
terraform apply
```
5. During the prompts:
* Enter yes
6. Note the outputs from the deployment process. These contain the resource names and/or URLs which are used for testing.

## How it works

Previously, whenever you needed to send DynamoDB record changes through DynamoDB streams to Step Functions, you had to implement an AWS Lambda function to invoke a state machine because Amazon DynamoDB streams did not support AWS Step Functions as a direct target.

Now, you can directly integrate DynamoDB streams with AWS Step Functions.
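The pipe only forwards stream records whose `NewImage.Nationality` string begins with the configured team name (see the filter pattern in `main.tf`). The following is a minimal local sketch of that prefix-matching behaviour; the function name and simplified logic are illustrative only, since the real filtering happens server-side inside EventBridge Pipes:

```python
# Illustrative sketch of the prefix filter this pattern's pipe applies
# to DynamoDB stream records. Not part of the Pipes API.

def matches_filter(record: dict, team: str = "Argentina") -> bool:
    """Return True if the record's NewImage.Nationality starts with `team`."""
    nationality = (
        record.get("dynamodb", {})
        .get("NewImage", {})
        .get("Nationality", {})
        .get("S", "")
    )
    return nationality.startswith(team)

# Shapes mirror the stream records produced by the items in the Testing section.
messi = {"dynamodb": {"NewImage": {"Nationality": {"S": "Argentina"}}}}
gnabry = {"dynamodb": {"NewImage": {"Nationality": {"S": "Germany"}}}}

print(matches_filter(messi))   # True: record is forwarded to the state machine
print(matches_filter(gnabry))  # False: record is dropped by the pipe
```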

## Testing

1. Stream logs from the Step Functions log group. Replace `<LogGroup Name>` with the `SFNLog` output value from the deployment:

```bash
sam logs --cw-log-group <LogGroup Name> --tail
```
2. The EventBridge Pipe is configured to filter on a `Nationality` value of "Argentina".
In another terminal, add an item to the DynamoDB table that matches the filter.

```bash
aws dynamodb put-item \
    --table-name WorldCup-DB \
    --item '{"PlayerName": {"S": "Lionel Messi"}, "Nationality": {"S": "Argentina"}, "GoalsScored": {"S": "1"}}'
```
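For reference, this is the filter pattern the pipe applies (the `jsonencode` expression in `main.tf` renders to this JSON); it matches records whose new `Nationality` string value begins with the configured team name:

```json
{
  "dynamodb": {
    "NewImage": {
      "Nationality": {
        "S": [{ "prefix": "Argentina" }]
      }
    }
  }
}
```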

The Step Functions state machine is invoked and you should see the logs for the new execution.

Now add an item to the DynamoDB table that doesn't match the filter.
```bash
aws dynamodb put-item \
    --table-name WorldCup-DB \
    --item '{"PlayerName": {"S": "Serge Gnabry"}, "Nationality": {"S": "Germany"}, "GoalsScored": {"S": "1"}}'
```

The Step Functions state machine is not invoked, and you will not see any new logs.

## Cleanup
1. Change directory to the pattern directory:
```
cd eventbridge-pipes-ddbstream-sfn-terraform
```
2. Delete all resources created by Terraform:
```bash
terraform destroy
```
3. During the prompts:
* Enter yes
4. Confirm that all created resources have been deleted:
```bash
terraform show
```

----
Copyright 2024 Amazon.com, Inc. or its affiliates. All Rights Reserved.

SPDX-License-Identifier: MIT-0

56 changes: 56 additions & 0 deletions eventbridge-pipes-ddbstream-sfn-terraform/example-pattern.json
@@ -0,0 +1,56 @@
{
"title": "DynamoDB Stream to Step Functions",
"description": "Invoke Step Functions workflow from a DynamoDB stream message",
"language": "Terraform",
"level": "200",
"framework": "Terraform",
"introBox": {
"headline": "Invoke a Step Functions workflow from a DynamoDB stream",
"text": [
"This sample project demonstrates how to invoke an AWS Step Functions state machine from DynamoDB Stream without using Lambda. ",
"This pattern deploys one Step Function, one DynamoDB with stream enabled and one EventBridge Pipe."
]
},
"gitHub": {
"template": {
"repoURL": "https://github.com/aws-samples/serverless-patterns/tree/main/eventbridge-pipes-ddbstream-sfn-terraform",
"templateURL": "serverless-patterns/eventbridge-pipes-ddbstream-sfn-terraform",
"projectFolder": "eventbridge-pipes-ddbstream-sfn-terraform",
"templateFile": "main.tf"
}
},
"resources": {
"bullets": [
{
"text": "Amazon EventBridge Pipes - connects sources to targets. Pipes are intended for point-to-point integrations between supported sources and targets, with support for advanced transformations and enrichment.",
"link": "https://docs.aws.amazon.com/eventbridge/latest/userguide/eb-pipes.html"
}
]
},
"deploy": {
"text": [
"terraform init",
"terraform apply"
]
},
"testing": {
"text": [
"See the GitHub repo for detailed testing instructions."
]
},
"cleanup": {
"text": [
"terraform destroy"
]
},
"authors": [
{
"name": "Oriol Matavacas",
"image": "https://togithub.s3.eu-west-1.amazonaws.com/Oriol.jpg",
"bio": "Oriol Matavacas is a Sr. Solutions Architect at AWS based in Barcelona. Oriol primarily supporting customers on the journey to the Cloud. He enjoys building new solutions with scalability, availability and easy to maintain by using serverless.",
"linkedin": "https://www.linkedin.com/in/oriol-matavacas-rodriguez-b165868a",
"twitter": ""
}
]
}

240 changes: 240 additions & 0 deletions eventbridge-pipes-ddbstream-sfn-terraform/main.tf
@@ -0,0 +1,240 @@
terraform {
required_providers {
aws = {
source = "hashicorp/aws"
version = "~> 5.63"
}
}

required_version = ">= 0.14.9"
}

# Fetching current Account ID and AWS region
data "aws_caller_identity" "current" {}
data "aws_region" "current" {}

variable "national_team" {
description = "National Team Name"
type = string
default = "Argentina"
}

#################################################################
# DynamoDB Table
#################################################################
# Creating the DynamoDB Table
resource "aws_dynamodb_table" "WorldCup-DB" {
name = "WorldCup-DB"
billing_mode = "PROVISIONED"
hash_key = "PlayerName"
stream_enabled = true
stream_view_type = "NEW_AND_OLD_IMAGES"

attribute {
name = "PlayerName"
type = "S"
}

read_capacity = 1
write_capacity = 1

}

#################################################################
# Step Functions - State Machine
#################################################################
# Creating the Step Functions Machine
resource "aws_sfn_state_machine" "WorldCup-SF_machine" {
name = "WorldCup-SF_machine"
role_arn = aws_iam_role.WorldCup-SFRole.arn
type = "EXPRESS"
definition = file("workflow/ddb-pipes-sfn.asl.json")

logging_configuration {
log_destination = "${aws_cloudwatch_log_group.WorldCup-SF_LogGroup.arn}:*"
include_execution_data = true
level = "ALL"
}

tags = {
Name = "WorldCup-SF_machine"
}

}

# Creating a CloudWatch Log Group for Step Functions logs
resource "aws_cloudwatch_log_group" "WorldCup-SF_LogGroup" {
name = "ddb-stream-pipes-sf/WorldCup-StateMachine"
retention_in_days = 30
}

# Creating the IAM role and policies for the Step Functions state machine
resource "aws_iam_role" "WorldCup-SFRole" {
name = "WorldCup-SFRole"

assume_role_policy = jsonencode({
Version = "2012-10-17"
Statement = [
{
Effect = "Allow"
Principal = {
Service = "states.amazonaws.com"
}
Action = "sts:AssumeRole"
}
]
})

inline_policy {
name = "CloudWatchLogs"

policy = jsonencode({
Version = "2012-10-17"
Statement = [
{
Effect = "Allow"
Action = [
"logs:CreateLogDelivery",
"logs:GetLogDelivery",
"logs:UpdateLogDelivery",
"logs:DeleteLogDelivery",
"logs:ListLogDeliveries",
"logs:PutResourcePolicy",
"logs:DescribeResourcePolicies",
"logs:DescribeLogGroups"
]
Resource = "*"
}
]
})
}
}

#################################################################
# Event Bridge - Pipes
#################################################################
# Creating the Event Bridge - Pipes
resource "aws_pipes_pipe" "WorldCup-ddb_stream_to_sfn" {
name = "WorldCup-ddb_stream_to_sfn"
role_arn = aws_iam_role.WorldCup-event_bridge_pipes_role.arn
source = aws_dynamodb_table.WorldCup-DB.stream_arn

source_parameters {
filter_criteria {
filter {
pattern = jsonencode({
dynamodb = {
NewImage = {
Nationality = {
S = [
{
prefix = var.national_team
}
]
}
}
}
})
}
}

dynamodb_stream_parameters {
starting_position = "LATEST"
batch_size = 1
}
}

target = aws_sfn_state_machine.WorldCup-SF_machine.arn

target_parameters {
step_function_state_machine_parameters {
invocation_type = "FIRE_AND_FORGET"
}
}
}

# Creating the IAM role and policies for the EventBridge Pipe
resource "aws_iam_role" "WorldCup-event_bridge_pipes_role" {
name = "WorldCup-event_bridge_pipes_role"

assume_role_policy = jsonencode({
Version = "2012-10-17"
Statement = [
{
Effect = "Allow"
Principal = {
Service = "pipes.amazonaws.com"
}
Action = "sts:AssumeRole"
}
]
})

inline_policy {
name = "WorldCup-CloudWatchLogs"

policy = jsonencode({
Version = "2012-10-17"
Statement = [
{
Effect = "Allow"
Action = [
"logs:CreateLogGroup",
"logs:CreateLogStream",
"logs:PutLogEvents"
]
Resource = "*"
}
]
})
}

inline_policy {
name = "WorldCup-SourcePolicy"

policy = jsonencode({
Version = "2012-10-17"
Statement = [
{
Effect = "Allow"
Action = [
"dynamodb:DescribeStream",
"dynamodb:GetRecords",
"dynamodb:GetShardIterator",
"dynamodb:ListStreams"
]
Resource = aws_dynamodb_table.WorldCup-DB.stream_arn
}
]
})
}

inline_policy {
name = "WorldCup-ExecuteSFN"

policy = jsonencode({
Version = "2012-10-17"
Statement = [
{
Effect = "Allow"
Action = "states:StartExecution"
Resource = aws_sfn_state_machine.WorldCup-SF_machine.arn
}
]
})
}
}

#################################################################
# Outputs
#################################################################
# Displaying the values
output "DynamoDBSourceTableName" {
description = "DynamoDB Table Name"
value = aws_dynamodb_table.WorldCup-DB.name
}

output "SFNLog" {
description = "StepFunctions LogGroup Name"
value = aws_cloudwatch_log_group.WorldCup-SF_LogGroup.name
}