Commit e16d211

itsmekumaribharatgulati authored and committed
e2e_PostgreSQL_Test scenarios
1 parent 001496a commit e16d211

27 files changed: +1935 −104 lines changed
Lines changed: 70 additions & 0 deletions
@@ -0,0 +1,70 @@
#
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.
#

@PostgreSQL_Sink
Feature: PostgreSQL sink - Verify PostgreSQL sink plugin design time scenarios

  Scenario: To verify PostgreSQL sink plugin validation with connection and basic details for connectivity
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
    Then Navigate to the properties page of plugin: "PostgreSQL"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Enter input plugin property: "referenceName" with value: "targetRef"
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Replace input plugin property: "tableName" with value: "targetTable"
    Then Replace input plugin property: "dbSchemaName" with value: "schema"
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page

  Scenario: To verify PostgreSQL sink plugin validation with connection arguments
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
    Then Navigate to the properties page of plugin: "PostgreSQL"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Enter input plugin property: "referenceName" with value: "targetRef"
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Replace input plugin property: "tableName" with value: "targetTable"
    Then Replace input plugin property: "dbSchemaName" with value: "schema"
    Then Enter key value pairs for plugin property: "connectionArguments" with values from json: "connectionArgumentsList"
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page

  Scenario: To verify PostgreSQL sink plugin validation with advanced details with connection timeout
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
    Then Navigate to the properties page of plugin: "PostgreSQL"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Enter input plugin property: "referenceName" with value: "targetRef"
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Replace input plugin property: "tableName" with value: "targetTable"
    Then Replace input plugin property: "connectionTimeout" with value: "connectionTimeout"
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page
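The design-time properties exercised in the scenarios above (host, port, database, connectionArguments, connectionTimeout) ultimately combine into a JDBC connection string. A minimal sketch of that assembly, assuming the standard `jdbc:postgresql://` URL format and the driver's `connectTimeout` argument; this is illustrative only and not the plugin's actual implementation:

```python
def build_jdbc_url(host, port, database, connection_arguments=None,
                   connection_timeout=None):
    """Assemble a jdbc:postgresql URL with optional key=value arguments.

    connection_arguments mirrors the plugin's connectionArguments
    key/value pairs; connection_timeout maps onto the PostgreSQL JDBC
    driver's connectTimeout parameter (seconds).
    """
    url = f"jdbc:postgresql://{host}:{port}/{database}"
    args = dict(connection_arguments or {})
    if connection_timeout is not None:
        args["connectTimeout"] = str(connection_timeout)
    if args:
        # Sort for a deterministic query string.
        url += "?" + "&".join(f"{k}={v}" for k, v in sorted(args.items()))
    return url
```

For example, `build_jdbc_url("localhost", 5432, "mydb")` yields `jdbc:postgresql://localhost:5432/mydb`.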
Lines changed: 53 additions & 0 deletions
@@ -0,0 +1,53 @@
#
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.
#

@PostgreSQL_Sink
Feature: PostgreSQL sink - Verify PostgreSQL sink plugin design time macro scenarios

  Scenario: To verify PostgreSQL sink plugin validation with macro enabled fields for connection section
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
    Then Navigate to the properties page of plugin: "PostgreSQL"
    Then Click on the Macro button of Property: "jdbcPluginName" and set the value to: "postGreSQLDriverName"
    Then Click on the Macro button of Property: "host" and set the value to: "postGreSQLHost"
    Then Click on the Macro button of Property: "port" and set the value to: "postGreSQLPort"
    Then Click on the Macro button of Property: "user" and set the value to: "postGreSQLUser"
    Then Click on the Macro button of Property: "password" and set the value to: "postGreSQLPassword"
    Then Click on the Macro button of Property: "connectionArguments" and set the value to: "postGreSQLConnectionArguments"
    Then Enter input plugin property: "referenceName" with value: "targetRef"
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Replace input plugin property: "tableName" with value: "targetTable"
    Then Replace input plugin property: "dbSchemaName" with value: "schema"
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page

  Scenario: To verify PostgreSQL sink plugin validation with macro enabled fields for basic section
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
    Then Navigate to the properties page of plugin: "PostgreSQL"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Enter input plugin property: "referenceName" with value: "targetRef"
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Click on the Macro button of Property: "tableName" and set the value to: "postGreSQLTableName"
    Then Click on the Macro button of Property: "dbSchemaName" and set the value to: "postGreSQLSchemaName"
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page
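Macro-enabled fields like the ones above defer resolution until runtime, when each `${name}` placeholder is filled from the pipeline's runtime arguments. A simplified sketch of that substitution for illustration; CDAP's real macro evaluator handles more (nested macros, macro functions), and the argument names shown are the ones used in these scenarios:

```python
import re

# Matches ${key} placeholders, capturing the key name.
MACRO = re.compile(r"\$\{([^}]+)\}")

def resolve_macros(value, runtime_args):
    """Replace every ${key} in value with runtime_args[key].

    Raises KeyError if a referenced macro has no runtime argument,
    mirroring how an unresolved macro fails a pipeline run.
    """
    def substitute(match):
        key = match.group(1)
        if key not in runtime_args:
            raise KeyError(f"No runtime argument for macro ${{{key}}}")
        return runtime_args[key]
    return MACRO.sub(substitute, value)
```

For example, resolving `"${postGreSQLHost}:${postGreSQLPort}"` against `{"postGreSQLHost": "db.example", "postGreSQLPort": "5432"}` gives `"db.example:5432"`.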
Lines changed: 146 additions & 0 deletions
@@ -0,0 +1,146 @@
#
# Copyright © 2023 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.
#

@PostgreSQL_Sink
Feature: PostgreSQL - Verify data transfer from BigQuery source to PostgreSQL sink

  @BQ_SOURCE_TEST @Postgresql_Required @POSTGRESQL_TEST_TABLE @Plugin-1526
  Scenario: To verify data is getting transferred from BigQuery source to PostgreSQL sink successfully with supported datatypes
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Source"
    When Select plugin: "BigQuery" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
    Then Connect plugins: "BigQuery" and "PostgreSQL" to establish connection
    Then Navigate to the properties page of plugin: "BigQuery"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Enter input plugin property: "datasetProject" with value: "projectId"
    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqSourceTable"
    Then Click on the Get Schema button
    Then Verify the Output Schema matches the Expected Schema: "bqOutputMultipleDatatypesSchema"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "PostgreSQL"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Replace input plugin property: "tableName" with value: "targetTable"
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Enter input plugin property: "referenceName" with value: "targetRef"
    Then Replace input plugin property: "dbSchemaName" with value: "schema"
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Verify the preview of pipeline is "success"
    Then Click on preview data for PostgreSQL sink
    Then Close the preview data
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Validate the values of records transferred to target PostgreSQL table is equal to the values from source BigQuery table

  @BQ_SOURCE_TEST @Postgresql_Required @POSTGRESQL_TEST_TABLE @Plugin-1526
  Scenario: To verify data is getting transferred from BigQuery source to PostgreSQL sink successfully when connection arguments are set
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Source"
    When Select plugin: "BigQuery" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
    Then Connect plugins: "BigQuery" and "PostgreSQL" to establish connection
    Then Navigate to the properties page of plugin: "BigQuery"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Enter input plugin property: "datasetProject" with value: "projectId"
    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqSourceTable"
    Then Click on the Get Schema button
    Then Verify the Output Schema matches the Expected Schema: "bqOutputMultipleDatatypesSchema"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "PostgreSQL"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Replace input plugin property: "tableName" with value: "targetTable"
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Enter key value pairs for plugin property: "connectionArguments" with values from json: "connectionArgumentsList"
    Then Enter input plugin property: "referenceName" with value: "targetRef"
    Then Replace input plugin property: "dbSchemaName" with value: "schema"
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Verify the preview of pipeline is "success"
    Then Click on preview data for PostgreSQL sink
    Then Close the preview data
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Validate the values of records transferred to target PostgreSQL table is equal to the values from source BigQuery table

  @BQ_SOURCE_TEST @Postgresql_Required @POSTGRESQL_TEST_TABLE @Plugin-1526
  Scenario: To verify data is getting transferred from BigQuery source to PostgreSQL sink with Advanced property Connection timeout
    Given Open Datafusion Project to configure pipeline
    When Expand Plugin group in the LHS plugins list: "Source"
    When Select plugin: "BigQuery" from the plugins list as: "Source"
    When Expand Plugin group in the LHS plugins list: "Sink"
    When Select plugin: "PostgreSQL" from the plugins list as: "Sink"
    Then Connect plugins: "BigQuery" and "PostgreSQL" to establish connection
    Then Navigate to the properties page of plugin: "BigQuery"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Enter input plugin property: "datasetProject" with value: "projectId"
    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqSourceTable"
    Then Click on the Get Schema button
    Then Verify the Output Schema matches the Expected Schema: "bqOutputMultipleDatatypesSchema"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "PostgreSQL"
    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
    Then Replace input plugin property: "database" with value: "databaseName"
    Then Replace input plugin property: "tableName" with value: "targetTable"
    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
    Then Enter input plugin property: "referenceName" with value: "targetRef"
    Then Replace input plugin property: "dbSchemaName" with value: "schema"
    Then Replace input plugin property: "connectionTimeout" with value: "connectionTimeout"
    Then Validate "PostgreSQL" plugin properties
    Then Close the Plugin Properties page
    Then Save the pipeline
    Then Preview and run the pipeline
    Then Verify the preview of pipeline is "success"
    Then Click on preview data for PostgreSQL sink
    Then Close the preview data
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Validate the values of records transferred to target PostgreSQL table is equal to the values from source BigQuery table
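The final step of each runtime scenario compares the records landed in the PostgreSQL target table against the records in the BigQuery source table. One simple order-insensitive way to express that check is to compare multisets of row tuples; this is a sketch of the idea, not the framework's actual step implementation:

```python
from collections import Counter

def records_match(source_rows, target_rows):
    """True if both sides hold exactly the same rows, ignoring order.

    Rows are any iterables of column values; Counter comparison also
    catches dropped or duplicated rows, not just value mismatches.
    """
    return Counter(map(tuple, source_rows)) == Counter(map(tuple, target_rows))
```

In practice the validation step would fetch `source_rows` via the BigQuery client and `target_rows` via a JDBC/psycopg query against the target table, normalizing column order and types first.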
