...

CDAP-18571: Fixed an issue where messages could not be retrieved for Kafka topics. This issue was introduced in 6.5.0 and is fixed in 6.5.1.

CDAP-18538, CDAP-184254: Fixed an issue where you could not create a profile for an existing Dataproc cluster.

...

CDAP-18439: Fixed an issue in Replication that caused an error when you clicked Configure.

CDAP-18428: Fixed an issue that caused pipelines to fail with an Access Denied error when the pipeline had BigQuery plugins or a Transformation Pushdown configuration that included a Dataset Project ID in a different project than the specified Project ID:

The Access Denied error was due to missing permissions on the service account. 

To ensure pipelines with BigQuery or BigQuery Multi Table sinks and pipelines with Transformation Pushdown enabled run successfully, assign the following roles to the Project ID service account:

  • BigQuery Job User role to run jobs

  • GCE Storage Bucket Admin role to create a temporary bucket

If the dataset is not in the same project that the BigQuery job will run in, the Dataset Project ID service account must be granted the following role to write data to a BigQuery dataset or table:
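The role grants described above can be applied with the gcloud CLI. The following is a sketch only: PROJECT_ID and SA_EMAIL are hypothetical placeholders for your project and the pipeline's service account, and it assumes that "GCE Storage Bucket Admin" corresponds to the standard Storage Admin IAM role (roles/storage.admin). Adjust to your environment and organization's IAM policies.

```shell
# Placeholders (hypothetical values) -- replace with your own.
PROJECT_ID="my-project"
SA_EMAIL="pipeline-sa@my-project.iam.gserviceaccount.com"

# BigQuery Job User: allows the service account to run BigQuery jobs.
gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/bigquery.jobUser"

# Storage Admin: allows the service account to create the temporary bucket.
gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
  --member="serviceAccount:${SA_EMAIL}" \
  --role="roles/storage.admin"
```

If the dataset lives in a different project, a separate grant on the Dataset Project ID is also required, as described below.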

...

Example error in the logs (might differ depending on the plugin you are using):

Code Block
POST https://bigquery.googleapis.com/bigquery/v2/projects/PROJECT_ID/jobs
{
"code" : 403,
"errors" : [ {
"domain" : "global",
"message" : "Access Denied: Project xxxx: User does not have bigquery.jobs.create permission in project PROJECT_ID",
"reason" : "accessDenied"
} ],
"message" : "Access Denied: Project PROJECT_ID: User does not have bigquery.jobs.create permission in project PROJECT_ID.",
"status" : "PERMISSION_DENIED"
}

In this example, PROJECT_ID is the Project ID that you specified in the plugin. The service account for the project specified in the plugin does not have permission to do at least one of the following:

  • Run a BigQuery job

  • Read a BigQuery dataset

  • Create a temporary bucket

  • Create a BigQuery dataset

  • Create the BigQuery table

CDAP-18423: Fixed an issue in the GCS connection that prevented browsing and parsing files stored in folders under buckets.

...