Packaging Plugins
To package, present, and deploy your plugin, see these instructions:
Plugin Packaging: packaging in a JAR
Plugin Presentation: controlling how your plugin appears in the Pipeline Studio
If you are installing a third-party JAR (such as a JDBC driver) to make it accessible to other plugins or applications, see these instructions.
Plugin Dependencies
Plugins need to be packaged together with any dependencies that are not provided by the system. The system provides CDAP dependencies, some core Hadoop dependencies, and some core Spark dependencies.
Any CDAP dependency with api in the name does not need to be packaged and should be set to the provided scope in your pom. For example, the cdap-api, cdap-api-spark2_2.11, cdap-etl-api, and cdap-etl-api-spark dependencies should be set to the provided scope.
The hadoop-common and hadoop-mapreduce-client-core dependencies should be set to the provided scope.
The spark-core, spark-streaming, spark-repl, and spark-mllib dependencies should be set to the provided scope.
Any other dependency in your project should use the compile scope, which means it will be packaged into the plugin JAR.
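As a rough sketch, the dependencies section of a plugin pom.xml could look like the following. The version properties, the Scala suffix on spark-core_2.11, and the mysql-connector-java dependency are illustrative assumptions, not requirements; the io.cdap.cdap group ID applies to CDAP 6.x (older releases used co.cask.cdap).

<dependencies>
  <!-- Provided by the system at runtime; not packaged into the plugin JAR -->
  <dependency>
    <groupId>io.cdap.cdap</groupId>
    <artifactId>cdap-etl-api</artifactId>
    <version>${cdap.version}</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>${hadoop.version}</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
  </dependency>
  <!-- Not provided by the system, so compile scope packages it in the JAR -->
  <dependency>
    <groupId>mysql</groupId>
    <artifactId>mysql-connector-java</artifactId>
    <version>8.0.28</version>
    <scope>compile</scope>
  </dependency>
</dependencies>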
Plugin Packaging
A plugin is packaged as a JAR file, which contains the plugin classes and their dependencies. CDAP uses the "Export-Package" attribute in the JAR file manifest to determine which classes are visible. A visible class is one that can be used by a class that is not from the plugin JAR itself. This means the Java package that a plugin class is in must be listed in "Export-Package"; otherwise, the plugin class will not be visible and no one will be able to use it. This can be done in Maven by editing your pom.xml. For example, if your plugins are in the com.example.runnable and com.example.callable packages, you would edit the bundler plugin in your pom.xml:
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <version>2.3.7</version>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <Embed-Dependency>*;inline=false;scope=compile</Embed-Dependency>
      <Embed-Transitive>true</Embed-Transitive>
      <Embed-Directory>lib</Embed-Directory>
      <Export-Package>com.example.runnable;com.example.callable</Export-Package>
    </instructions>
  </configuration>
  ...
</plugin>
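After building with mvn clean package, you can sanity-check the result by opening the META-INF/MANIFEST.MF file inside the generated JAR. Assuming the configuration above, the manifest should contain an Export-Package header along these lines (the exact attributes the bundle plugin appends, such as versions, will vary):

Export-Package: com.example.runnable,com.example.callable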
If you are developing plugins for the cdap-data-pipeline artifact, be aware that for classes inside the plugin JAR that you have added directly to the Hadoop Job configuration (for example, your custom InputFormat class), you will need to add the Java packages of those classes to "Export-Package" as well. This ensures those classes are visible to the Hadoop MapReduce framework during plugin execution. Otherwise, the execution will typically fail with a ClassNotFoundException.
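For example, if a custom InputFormat lives in a com.example.format package (a hypothetical name used only for illustration), you would extend the Export-Package instruction from the earlier example:

<Export-Package>com.example.runnable;com.example.callable;com.example.format</Export-Package>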