...
Apache Ambari can only be used to add CDAP to an existing Hadoop cluster, one that already has the required services (HDFS, YARN, HBase, ZooKeeper, and, optionally, Hive and Spark) installed.
As Ambari is for setting up HDP (Hortonworks Data Platform) on bare clusters, it can't be used with a cluster whose existing HDP installation was not originally performed with Ambari.
A number of features are currently planned, including select CDAP metrics and a full smoke test of CDAP functionality after installation.
If you intend to use replication, see the instructions on CDAP Replication before installing or starting CDAP.
Advanced Topics
Enabling Security
...
...
...
Enabling Hive Execution Engines
...
Enabling Security
The Cask Data Application Platform (CDAP) supports securing clusters using perimeter security, authorization, impersonation, and secure storage.
...
For instructions on enabling CDAP Security, see CDAP Security.
CDAP Security is configured by setting the appropriate settings under Ambari for your environment.
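As a rough illustration, enabling perimeter security typically comes down to a few cdap-site properties surfaced in the Ambari configuration screens. The fragment below is a sketch, not a complete security setup; verify the property names and the full set of required settings against the CDAP Security documentation for your version:

```xml
<!-- Sketch of cdap-site settings as edited through Ambari.
     Verify names and values against your CDAP version before applying. -->
<property>
  <name>security.enabled</name>
  <value>true</value>
  <description>Enables perimeter security (authentication) for CDAP</description>
</property>
<property>
  <name>kerberos.auth.enabled</name>
  <value>true</value>
  <description>Enables Kerberos authentication for CDAP services</description>
</property>
```

On a secure (Kerberos-enabled) HDP cluster, these settings take effect after the CDAP services are restarted from Ambari.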
...
Enabling CDAP HA
In addition to having a cluster architecture that supports HA (high availability), these additional configuration steps need to be completed:
...
CDAP Explore supports additional execution engines such as Apache Spark and Apache Tez. For details on specifying these engines and configuring CDAP, see Hive Execution Engines in the Developer Manual section on Data Exploration.
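For context, the engine CDAP Explore uses is governed by Hive's standard hive.execution.engine property, which is set in the cluster's Hive configuration (in Ambari, under the Hive service). A minimal sketch, switching Hive from the default MapReduce engine to Tez:

```xml
<!-- hive-site fragment (a sketch): hive.execution.engine is a standard
     Hive property; valid values include mr, tez, and spark. -->
<property>
  <name>hive.execution.engine</name>
  <value>tez</value>
</property>
```

See the Hive Execution Engines documentation for any additional CDAP-side configuration the chosen engine requires.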
Enabling Spark2
In order to use Spark2, you must first install Spark2 on your cluster. If both Spark1 and Spark2 are installed, you must modify cdap-env to set SPARK_MAJOR_VERSION and SPARK_HOME:
...
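The cdap-env changes described above can be sketched as two exported variables. SPARK_MAJOR_VERSION is the standard HDP mechanism for selecting between Spark1 and Spark2; the SPARK_HOME path below is an assumption based on typical HDP layouts, so substitute your cluster's actual Spark2 location:

```shell
# Sketch of cdap-env additions when both Spark1 and Spark2 are installed.
# The SPARK_HOME path is an assumed HDP default -- adjust for your cluster.
export SPARK_MAJOR_VERSION=2
export SPARK_HOME=/usr/hdp/current/spark2-client
```

After saving these settings, restart the CDAP services from Ambari so they pick up the new environment.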