Let's say I have a BigQuery table with fields f1, f2, and f3, of which f1 and f2 are nullable and f3 is not.
I should be able to have a BigQuery sink in my pipeline that updates this table but has only the field f3 in its input schema. Validation should not fail when nullable fields are absent from the input schema, and the pipeline should also succeed at execution time.
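To make the requested validation rule concrete, here is a minimal sketch in Python (not the actual BigQuery sink plugin code; the function name and shapes are hypothetical): an input schema passes validation as long as it covers every non-nullable column of the target table, while nullable columns may be absent.

```python
# Hypothetical sketch of the desired validation rule, NOT the real
# plugin implementation: only non-nullable table columns are required
# to appear in the sink's input schema.

def validate_input_schema(table_schema, input_fields):
    """table_schema: dict mapping column name -> nullable (bool).
    input_fields: set of field names in the sink's input schema.
    Returns a list of error messages; an empty list means valid."""
    errors = []
    for column, nullable in table_schema.items():
        if column not in input_fields and not nullable:
            errors.append(f"Required column '{column}' is missing from the input schema")
    return errors

# f1 and f2 are nullable; f3 is required.
table = {"f1": True, "f2": True, "f3": False}

# An input containing only f3 should pass validation.
print(validate_input_schema(table, {"f3"}))          # []

# An input missing the required f3 should fail.
print(validate_input_schema(table, {"f1", "f2"}))    # one error for f3
```

Under this rule, the scenario in the request (input schema containing only f3) validates cleanly, and a run-time insert would rely on BigQuery writing NULL for the absent nullable columns.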
Tested in CDAP Sandbox 6.3, 6.2, and 6.1:
The target table had three columns: Identifier, First_name, and Last_name (nullable)
File source (Avro) with an output schema containing only Identifier and First_name
Pressed Validate on the BigQuery sink; validation was successful
Ran the pipelines and they worked fine
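For reference, the Avro output schema used in the reproduction above might look like the following (a sketch only; the record name and field types are assumptions, since the original comment gives just the field names):

```json
{
  "type": "record",
  "name": "etlSchemaBody",
  "fields": [
    { "name": "Identifier", "type": "long" },
    { "name": "First_name", "type": "string" }
  ]
}
```

The nullable Last_name column is deliberately omitted, matching the scenario in the request.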
Was the BigQuery table already created on the BigQuery side while reproducing the issue?
Also, was the operation on the BigQuery sink "Update" or "Upsert"?
The BigQuery table was already created on the BigQuery side.
We tested this operation with both Update and Upsert, and verified that both work fine.
Tested in CDAP Sandbox 6.3.0, 6.1.4, and 6.2.3