...

Code Block
// Add the artifact for a Data Pipeline app
addAppArtifact(new ArtifactId(NamespaceId.DEFAULT.getNamespace(), "data-pipeline", "3.5.0"),
  DataPipelineApp.class,
  BatchSource.class.getPackage().getName(),
  Action.class.getPackage().getName(),
  PipelineConfigurable.class.getPackage().getName(),
  "org.apache.avro.mapred", "org.apache.avro", "org.apache.avro.generic");

The first argument is the id of the artifact; the second is the application class; and the remaining arguments are packages that should be included in the Export-Package manifest attribute bundled in the JAR. The framework traces the dependencies of the specified application class to create a JAR containing those dependencies. This mimics what happens when you actually build your application JAR with Maven.
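
In practice, this call typically lives in a static setup method of a test class that extends the CDAP test base. The sketch below is illustrative only; the HydratorTestBase parent class, the test class name, and the APP_ARTIFACT_ID constant are assumptions rather than something defined on this page:

Code Block
// A possible home for the addAppArtifact() call shown above
public class DataPipelineTest extends HydratorTestBase {

  // Artifact id of the app; also used later as the parent when adding plugin artifacts
  protected static final ArtifactId APP_ARTIFACT_ID =
    new ArtifactId(NamespaceId.DEFAULT.getNamespace(), "data-pipeline", "3.5.0");

  @BeforeClass
  public static void setupTestClass() throws Exception {
    // Register the app artifact once for all test cases in this class
    addAppArtifact(APP_ARTIFACT_ID, DataPipelineApp.class,
                   BatchSource.class.getPackage().getName(),
                   Action.class.getPackage().getName(),
                   PipelineConfigurable.class.getPackage().getName(),
                   "org.apache.avro.mapred", "org.apache.avro", "org.apache.avro.generic");
  }
}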

...

Code Block
// Create the application creation request
ETLBatchConfig etlConfig = new ETLBatchConfig("* * * * *", source, sink, transformList);
AppRequest<ETLBatchConfig> appRequest = new AppRequest<>(new ArtifactSummary("etlbatch", "3.5.0"), etlConfig);
ApplicationId appId = NamespaceId.DEFAULT.app("KVToKV");

// Deploy the application
ApplicationManager appManager = deployApplication(appId, appRequest);

Plugins extending the artifact can also be added:

Code Block
// Add some test plugins
addPluginArtifact(new ArtifactId(NamespaceId.DEFAULT.getNamespace(), "spark-plugins", "1.0.0"),
                  APP_ARTIFACT_ID,
                  NaiveBayesTrainer.class, NaiveBayesClassifier.class);

The first argument is the id of the plugin artifact; the second is the parent artifact it extends; and the remaining arguments are classes that should be bundled in the JAR. The packages of all these classes are included in the Export-Package manifest attribute of the JAR. When adding a plugin artifact this way, it is important to include all classes in your plugin packages, even if they are not used in your test case. This ensures that the framework can trace all required dependencies and build the JAR correctly.
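
Once the plugin artifact has been registered against its parent, the plugins it contains can be referenced by name and type when building a pipeline configuration. A minimal sketch, assuming the ETLPlugin class from the ETL proto API and the SparkSink plugin type; the property keys and values are placeholders, not the plugin's actual configuration:

Code Block
// Reference the registered plugin by name and type in a pipeline stage;
// the properties below are placeholders for illustration only
ETLPlugin trainerPlugin = new ETLPlugin("NaiveBayesTrainer", SparkSink.PLUGIN_TYPE,
  ImmutableMap.of("fileSetName", "modelFileSet", "path", "output"), null);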

...

Additional information on unit testing with CDAP is in the Developer Manual section on Testing a CDAP Application.

In addition, CDAP provides a hydrator-test module that contains several mock plugins for you to use in tests with your custom plugins. To use the module, add a dependency to your pom.xml:

...
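
Once that dependency is in place, the mock plugins can stand in for real sources and sinks when wiring up a test pipeline. A minimal sketch, assuming the MockSource and MockSink classes from the hydrator-test module; the table names and schema here are made up for illustration:

Code Block
// Use mock plugins in place of real ones for the pipeline stages
ETLStage source = new ETLStage("source", MockSource.getPlugin("inputTable"));
ETLStage sink = new ETLStage("sink", MockSink.getPlugin("outputTable"));

// Write input records into the mock source's backing table before the run ...
Schema schema = Schema.recordOf("record", Schema.Field.of("key", Schema.of(Schema.Type.STRING)));
List<StructuredRecord> input =
  ImmutableList.of(StructuredRecord.builder(schema).set("key", "value").build());
DataSetManager<Table> inputManager = getDataset("inputTable");
MockSource.writeInput(inputManager, input);

// ... and, after running the pipeline, read what the mock sink captured
DataSetManager<Table> outputManager = getDataset("outputTable");
List<StructuredRecord> output = MockSink.readOutput(outputManager);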