Spark job completion status for empty dummy programs

Description

In unit tests we write MapReduce programs and other components as no-ops. If we do the same for a Spark program, its success status is reported as false, because program status is collected through a SparkListener, which listens for job events and therefore expects at least one job to run before it can report success or failure.

Find out whether Spark offers another way to get program status that would let us use no-op programs in unit tests.
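The failure mode described above can be sketched in plain Java. The class names here are hypothetical stand-ins (the real callback is `onJobEnd` on `org.apache.spark.scheduler.SparkListener`); the point is only that a tracker whose success flag is set exclusively from a job-end event stays at its default of false when the program never submits a job.

```java
import java.util.concurrent.atomic.AtomicBoolean;

public class ListenerStatusSketch {
    // Stand-in for the job-end callback of Spark's SparkListener.
    interface JobListener {
        void onJobEnd(boolean jobSucceeded);
    }

    // Tracker in the style the ticket describes: success is recorded
    // only when a job-end event arrives.
    static class JobStatusTracker implements JobListener {
        private final AtomicBoolean succeeded = new AtomicBoolean(false);

        @Override
        public void onJobEnd(boolean jobSucceeded) {
            succeeded.set(jobSucceeded);
        }

        boolean programSucceeded() {
            return succeeded.get();
        }
    }

    public static void main(String[] args) {
        JobStatusTracker tracker = new JobStatusTracker();
        // A no-op program runs to completion but submits no Spark job,
        // so onJobEnd is never invoked and the status wrongly reads false.
        System.out.println("no-op program status: " + tracker.programSucceeded());
    }
}
```

The tracker is correct for programs that do submit jobs; the defect is specific to programs that finish without ever triggering a job event.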

Release Notes

None

Activity

Rohit Sinha
April 9, 2015, 11:38 AM

We might be able to do this by listening to the callback on spark user program run

Terence Yim
July 8, 2015, 6:27 PM

Fixed as a side effect of https://github.com/caskdata/cdap/pull/3094: we no longer use SparkListener, but instead track the status directly through the SparkRuntimeService.
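The fix's approach can be sketched as follows. This is a hedged illustration with hypothetical names (the actual logic lives inside CDAP's SparkRuntimeService): the status is derived from whether the user program's run completed normally, independent of how many Spark jobs it submitted, so a no-op program correctly reports success.

```java
public class DirectStatusSketch {
    enum Status { SUCCEEDED, FAILED }

    // Derive status from the program run itself rather than from
    // Spark job events.
    static Status execute(Runnable userProgram) {
        try {
            userProgram.run();
            return Status.SUCCEEDED; // run() returned normally
        } catch (RuntimeException e) {
            return Status.FAILED;    // run() threw
        }
    }

    public static void main(String[] args) {
        // A no-op program now reports success.
        System.out.println(execute(() -> { }));
        // A failing program still reports failure.
        System.out.println(execute(() -> { throw new RuntimeException("boom"); }));
    }
}
```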

Fixed

Assignee

Terence Yim

Reporter

Rohit Sinha

Labels

None

Docs Impact

None

UX Impact

None

Components

Fix versions

Affects versions

Priority

Major