...

  • Apache Ambari can only be used to add CDAP to an existing Hadoop cluster, one that already has the required services (HDFS, YARN, HBase, ZooKeeper, and, optionally, Hive and Spark) installed.

  • Ambari is designed for setting up HDP (Hortonworks Data Platform) on bare clusters; it cannot be used to manage clusters where HDP was originally installed without Ambari.

  • A number of features are currently planned to be added in future releases.

  • If you are installing CDAP with the intention of using replication, see the instructions on CDAP Replication before installing or starting CDAP.

...

When CDAP starts up, it detects the Spark version and uploads the corresponding pipeline system artifacts. If you have already started CDAP with Spark1, you will also need to delete the pipeline system artifacts and then reload them in order to use the Spark2 versions. After CDAP has been restarted with Spark2, use the Microservices:

Code Block
$ DELETE /v3/namespaces/system/artifacts/cdap-data-pipeline/versions/6.2.0
$ DELETE /v3/namespaces/system/artifacts/cdap-data-streams/versions/6.2.0
$ POST /v3/namespaces/system/artifacts
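As a sketch, the calls above could be issued with curl against the CDAP Router. The base URL below is an assumption (localhost with the default Router port 11015); adjust it for your deployment, and add authentication headers if security is enabled. The `echo` prefix prints each command for review; remove it to execute the calls against a live cluster.

```shell
# Assumed base URL; adjust host/port for your deployment.
CDAP_HOST="http://localhost:11015"

# Delete the old pipeline system artifacts (version 6.2.0), then
# trigger a reload so the Spark2 versions are picked up.
# Remove 'echo' to actually run the commands.
echo curl -X DELETE "${CDAP_HOST}/v3/namespaces/system/artifacts/cdap-data-pipeline/versions/6.2.0"
echo curl -X DELETE "${CDAP_HOST}/v3/namespaces/system/artifacts/cdap-data-streams/versions/6.2.0"
echo curl -X POST "${CDAP_HOST}/v3/namespaces/system/artifacts"
```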

...