E2e bq test neeraj #18
base: develop
Changes from all commits
b5895b3
e6789b0
e92df59
92bd3e8
@@ -161,7 +161,7 @@ Feature: BigQuery source - Verification of BigQuery to BigQuery successful data
     Then Enter BigQuery sink property table name
     Then Toggle BigQuery sink property truncateTable to true
     Then Toggle BigQuery sink property updateTableSchema to true
-    Then Enter BigQuery sink property partition field "bqPartitionFieldTime"
+    Then Enter BigQuery sink property partition field "transaction_date"
     Then Validate "BigQuery" plugin properties
     Then Close the BigQuery properties
     Then Connect source as "BigQuery" and sink as "BigQuery" to establish connection
@@ -262,3 +262,246 @@ Feature: BigQuery source - Verification of BigQuery to BigQuery successful data
     Then Open and capture logs
     Then Verify the pipeline status is "Succeeded"
    Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table
@BQ_UPSERT_SOURCE_TEST @BQ_UPSERT_SINK_TEST
Scenario: Verify that upsert operations from a BigQuery source to a BigQuery sink are performed without updating the destination table schema

Review comment: Rephrase the scenario outline and use the dedupe word in the scenario as well, to make it more understandable.
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "BigQuery" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "BigQuery" from the plugins list as: "Sink"
Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
Then Navigate to the properties page of plugin: "BigQuery"
And Enter input plugin property: "referenceName" with value: "Reference"
Review comment: We have to use the "use connection" step in all the scenarios, as per the ITN class. For more info we can connect on this.
And Replace input plugin property: "project" with value: "projectId"
And Enter input plugin property: "datasetProject" with value: "datasetprojectId"
And Replace input plugin property: "dataset" with value: "dataset"
And Enter input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Validate "BigQuery" plugin properties
And Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "datasetProject" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Enter input plugin property: "dataset" with value: "dataset"
Then Enter input plugin property: "table" with value: "bqTargetTable"
And Select radio button plugin property: "operation" with value: "upsert"
Then Enter Value for plugin property table key : "relationTableKey" with values: "string_value"
Review comment: Provide the values in the plugin parameter file. Don't hardcode them here.
Then Select dropdown plugin property: "dedupeBy" with option value: "DESC"
Then Enter key for plugin property: "dedupeBy" with values: "float_value"
Review comment: Provide the values in the plugin parameter file. Don't hardcode them here.
Then Click plugin property: "updateTableSchema"
Review comment: The scenario is for not updating the table schema. Remove this step.
Then Validate "BigQuery" plugin properties
Then Close the BigQuery properties
Then Save the pipeline
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
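The upsert operation exercised in this scenario can be illustrated with a small, self-contained sketch (plain Python, not the plugin's actual implementation): source rows are matched against the target on the relation table key; matches are updated in place and non-matches are inserted. The row data and field names below are hypothetical.

```python
# Illustrative sketch of BigQuery sink "upsert" semantics: match source
# rows to target rows on a relation table key, update matches in place,
# and insert the rest. All data here is made up for illustration.

def upsert(target, source, key):
    """Upsert source rows into target, matching on `key`."""
    index = {row[key]: i for i, row in enumerate(target)}
    for row in source:
        if row[key] in index:
            target[index[row[key]]] = row   # key exists: update
        else:
            target.append(row)              # key missing: insert
    return target

target = [{"string_value": "a", "price": 1}, {"string_value": "b", "price": 2}]
source = [{"string_value": "b", "price": 20}, {"string_value": "c", "price": 3}]
result = upsert(target, source, "string_value")
```

Note the contrast with the plain insert scenarios elsewhere in this feature: an upsert never produces duplicate keys in the sink.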
Review comment: Remove the extra lines.
@BQ_UPDATE_SOURCE_TEST @BQ_UPDATE_SINK_TEST
Scenario: Verify that update operations from a BigQuery source to a BigQuery sink are performed and that duplicate entries are removed before reaching the sink

Review comment: Rephrase this scenario; it's not giving the info properly. Update as per the ITN class.
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "BigQuery" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "BigQuery" from the plugins list as: "Sink"
Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
Then Navigate to the properties page of plugin: "BigQuery"
And Enter input plugin property: "referenceName" with value: "Reference"
Review comment: The scenario needs to be created using the "use connection" step, as per the ITN class.
And Replace input plugin property: "project" with value: "projectId"
And Enter input plugin property: "datasetProject" with value: "datasetprojectId"
And Replace input plugin property: "dataset" with value: "dataset"
And Enter input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Validate "BigQuery" plugin properties
And Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "datasetProject" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Enter input plugin property: "dataset" with value: "dataset"
Then Enter input plugin property: "table" with value: "bqTargetTable"
And Select radio button plugin property: "operation" with value: "update"
Then Enter Value for plugin property table key : "relationTableKey" with values: "string_value"
Review comment: The value should come from the plugin parameter file.
Then Select dropdown plugin property: "dedupeBy" with option value: "DESC"
Then Enter key for plugin property: "dedupeBy" with values: "float_value"
Review comment: The value should come from the plugin parameter file.
Then Click plugin property: "updateTableSchema"
Then Validate "BigQuery" plugin properties
Then Close the BigQuery properties
Then Save the pipeline
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
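The update operation with a "dedupeBy" setting, as configured in this scenario, can be sketched as follows (a hedged Python illustration, not the plugin's code): duplicate source rows sharing the same relation table key are first collapsed, keeping the row with the highest dedupe field value (DESC order), and then only rows whose key already exists in the target are updated; nothing is inserted. The example values are hypothetical.

```python
# Illustrative sketch of "update" with dedupeBy (float_value DESC):
# keep one source row per key (the one with the largest dedupe field),
# then update only target rows whose key already exists.

def dedupe_and_update(target, source, key, dedupe_field):
    best = {}
    for row in source:
        k = row[key]
        if k not in best or row[dedupe_field] > best[k][dedupe_field]:
            best[k] = row  # DESC: keep the highest dedupe_field value
    for i, row in enumerate(target):
        if row[key] in best:
            target[i] = best[row[key]]  # update only; never insert
    return target

target = [{"string_value": "a", "float_value": 1.0}]
source = [{"string_value": "a", "float_value": 2.5},
          {"string_value": "a", "float_value": 9.0},
          {"string_value": "new", "float_value": 4.0}]
result = dedupe_and_update(target, source, "string_value", "float_value")
```

In this sketch the row keyed "new" is dropped because an update operation only touches keys that already exist in the sink.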
Review comment: Remove the extra lines.
@BQ_NULL_MODE_SOURCE_TEST @BQ_SINK_TEST
Scenario: Validate successful record transfer from the BigQuery source plugin with all NULL values in one column and a few NULL values in a different column
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "BigQuery" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
Review comment: We have to re-write all the scenarios as per use connection...
When Select plugin: "BigQuery" from the plugins list as: "Sink"
Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
Then Navigate to the properties page of plugin: "BigQuery"
And Enter input plugin property: "referenceName" with value: "Reference"
And Replace input plugin property: "project" with value: "projectId"
And Enter input plugin property: "datasetProject" with value: "datasetprojectId"
And Replace input plugin property: "dataset" with value: "dataset"
And Enter input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Validate "BigQuery" plugin properties
And Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "datasetProject" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Enter input plugin property: "dataset" with value: "dataset"
Then Enter input plugin property: "table" with value: "bqTargetTable"
Then Click plugin property: "updateTableSchema"
Then Validate "BigQuery" plugin properties
Then Close the BigQuery properties
Then Save the pipeline
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table
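The final validation step of this scenario compares source and sink values record by record, including the NULL-heavy columns. A minimal sketch of such a check, with NULLs represented as Python None (the rows and column names are illustrative only):

```python
# Illustrative record-level comparison between source and sink rows,
# covering one column that is entirely NULL and one that is partially
# NULL. A successful transfer must preserve NULLs exactly.

source_rows = [
    {"id": 1, "all_null_col": None, "some_null_col": "x"},
    {"id": 2, "all_null_col": None, "some_null_col": None},
]
sink_rows = list(source_rows)  # a successful transfer preserves every value

def records_match(source, sink):
    """Compare row sets irrespective of row order."""
    key = lambda r: r["id"]
    return sorted(source, key=key) == sorted(sink, key=key)

ok = records_match(source_rows, sink_rows)
```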
Review comment: Remove the extra lines.
@BQ_TIME_STAMP_SOURCE_TEST @BQ_SINK_TEST
Scenario: Verify record insert from the source BigQuery plugin with partition type Time (date/timestamp/datetime)

Review comment: Rephrase all the scenarios as per the ITN class to make them more understandable. Take reference from the ITN class or from the already raised BigQuery PR.
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "BigQuery" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "BigQuery" from the plugins list as: "Sink"
Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
Then Navigate to the properties page of plugin: "BigQuery"
And Enter input plugin property: "referenceName" with value: "Reference"
And Replace input plugin property: "project" with value: "projectId"
And Enter input plugin property: "datasetProject" with value: "datasetprojectId"
And Replace input plugin property: "dataset" with value: "dataset"
And Enter input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Validate "BigQuery" plugin properties
And Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery2"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "datasetProject" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Enter input plugin property: "dataset" with value: "dataset"
Then Enter input plugin property: "table" with value: "bqTargetTable"
Then Enter input plugin property: "partitionByField" with value: "partiontion_by_field_value"
Then Click plugin property: "updateTableSchema"
Then Validate "BigQuery" plugin properties
Then Close the BigQuery properties
Then Save the pipeline
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
#Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table

Review comment: Why is this step commented? Remove the extra lines.
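Time partitioning on the sink, as configured with the partition-by field in the scenario above, means each row lands in a partition derived from its date or timestamp value. A rough illustration in plain Python (not BigQuery itself; the field name and dates are hypothetical):

```python
from collections import defaultdict
from datetime import date

# Illustrative grouping of rows into daily partitions by a date field,
# mimicking what a time-partitioned BigQuery table does on load.

rows = [
    {"id": 1, "transaction_date": date(2023, 5, 1)},
    {"id": 2, "transaction_date": date(2023, 5, 1)},
    {"id": 3, "transaction_date": date(2023, 5, 2)},
]

partitions = defaultdict(list)
for row in rows:
    # Partition key is the day portion of the partition-by field.
    partitions[row["transaction_date"].isoformat()].append(row)
```

Queries that filter on the partition field can then prune whole partitions, which is the practical reason the scenario exercises this property.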
@BQ_INSERT_SOURCE_TEST @BQ_INSERT_SECOND_SOURCE_TEST @BQ_SINK_TEST
Scenario: Verify BigQuery to BigQuery transfer with different schema record names

Review comment: Rephrase the scenario.
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Source"
When Select plugin: "BigQuery" from the plugins list as: "Source"
When Select plugin: "BigQuery" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Transform"
When Select plugin: "Wrangler" from the plugins list as: "Transform"
When Select plugin: "Wrangler" from the plugins list as: "Transform"
Then Move plugins: "Wrangler" by xOffset 250 and yOffset 300
Then Move plugins: "BigQuery" by xOffset 100 and yOffset 200
Then Connect plugins: "BigQuery" and "Wrangler2" to establish connection
Then Connect plugins: "BigQuery2" and "Wrangler" to establish connection
Then Connect plugins: "Wrangler" and "Wrangler2" to establish connection
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "BigQuery" from the plugins list as: "Sink"
Then Connect plugins: "Wrangler2" and "BigQuery3" to establish connection
Then Navigate to the properties page of plugin: "BigQuery2"
And Enter input plugin property: "referenceName" with value: "Reference"
And Replace input plugin property: "project" with value: "projectId"
And Enter input plugin property: "datasetProject" with value: "datasetprojectId"
And Replace input plugin property: "dataset" with value: "dataset"
And Enter input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Validate "BigQuery2" plugin properties
And Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery"
And Enter input plugin property: "referenceName" with value: "Reference"
And Replace input plugin property: "project" with value: "projectId"
And Enter input plugin property: "datasetProject" with value: "datasetprojectId"
And Replace input plugin property: "dataset" with value: "dataset"
And Enter input plugin property: "table" with value: "bqSourceTable"
Then Click on the Get Schema button
Then Validate "BigQuery" plugin properties
And Close the Plugin Properties page
Then Navigate to the properties page of plugin: "Wrangler"
Then Enter textarea plugin property: "directives" with value: "drop :TableName"
Then Validate "Wrangler" plugin properties
And Close the Plugin Properties page
Then Navigate to the properties page of plugin: "Wrangler2"
Then Enter textarea plugin property: "directives" with value: "rename :EmployeeID :EmpID;"
Then Validate "Wrangler2" plugin properties
And Close the Plugin Properties page
Then Navigate to the properties page of plugin: "BigQuery3"
Then Replace input plugin property: "project" with value: "projectId"
Then Enter input plugin property: "datasetProject" with value: "projectId"
Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
Then Enter input plugin property: "dataset" with value: "dataset"
Then Enter input plugin property: "table" with value: "bqTargetTable"
Then Click plugin property: "updateTableSchema"
Then Validate "BigQuery" plugin properties
Then Close the BigQuery properties
Then Save the pipeline
Then Preview and run the pipeline
Then Wait till pipeline preview is in running state
Then Open and capture pipeline preview logs
Then Verify the preview run status of pipeline in the logs is "succeeded"
Then Close the pipeline logs
Then Close the preview
Then Deploy the pipeline
Then Run the Pipeline in Runtime
Then Wait till pipeline is in running state
Then Open and capture logs
Then Verify the pipeline status is "Succeeded"
Review comment: Remove the extra lines.

# Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table

Review comment: Why is this step commented?
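The two Wrangler directives used in the scenario above, "drop :TableName" and "rename :EmployeeID :EmpID", can be approximated in plain Python to show what they do to each record (an illustration of the directive semantics only, not Wrangler's implementation; the sample row is made up):

```python
# Approximate the Wrangler directives used in the scenario:
#   drop :TableName           -> remove the TableName column
#   rename :EmployeeID :EmpID -> rename EmployeeID to EmpID

def drop(rows, column):
    """Remove `column` from every record."""
    return [{k: v for k, v in r.items() if k != column} for r in rows]

def rename(rows, old, new):
    """Rename column `old` to `new` in every record."""
    return [{(new if k == old else k): v for k, v in r.items()} for r in rows]

rows = [{"EmployeeID": 7, "TableName": "t1", "Name": "Ada"}]
rows = drop(rows, "TableName")
rows = rename(rows, "EmployeeID", "EmpID")
```

After both directives the sink receives records whose schema differs from the source's, which is exactly the condition this scenario is meant to exercise.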
Review comment: Revert this change. This is, I believe, existing code.