Support concurrent runs of a Spark program

Description

We should be able to support concurrent runs of a Spark program.

Currently, we only allow concurrent runs for Workflow and MapReduce:
https://github.com/caskdata/cdap/blob/release/3.4/cdap-app-fabric/src/main/java/co/cask/cdap/internal/app/services/ProgramLifecycleService.java#L449-L452
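The check at the linked lines can be sketched roughly as follows. This is a minimal, self-contained reconstruction, not the actual CDAP code; the class, enum, and method names here are hypothetical:

```java
import java.util.EnumSet;
import java.util.Set;

public class ConcurrentRunCheck {

  // Subset of CDAP program types, for illustration only.
  enum ProgramType { WORKFLOW, MAPREDUCE, SPARK, SERVICE }

  // Today only Workflow and MapReduce may have concurrent runs;
  // the fix would be to include SPARK in this set (or drop the check for it).
  static final Set<ProgramType> CONCURRENT_RUNS_ALLOWED =
      EnumSet.of(ProgramType.WORKFLOW, ProgramType.MAPREDUCE);

  static boolean isConcurrentRunAllowed(ProgramType type) {
    return CONCURRENT_RUNS_ALLOWED.contains(type);
  }

  public static void main(String[] args) {
    // A second start of a running Spark program is currently rejected:
    System.out.println(isConcurrentRunAllowed(ProgramType.SPARK));    // false
    System.out.println(isConcurrentRunAllowed(ProgramType.WORKFLOW)); // true
  }
}
```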

A workaround is to run the Spark program within a Workflow, which does support concurrent runs.

Relevant documentation that will need updating as a part of this JIRA:
http://docs.cask.co/cdap/3.4.3/en/reference-manual/http-restful-api/lifecycle.html#start-a-program
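For the workaround above, the wrapping Workflow is started through the lifecycle REST API's start endpoint, which also covers a second concurrent start while a run is in progress. A small sketch of the request path, assuming placeholder namespace, app, and program names (the path shape follows the CDAP 3.4 lifecycle API linked above):

```java
public class StartProgramPath {

  // Builds the v3 lifecycle start path for a workflow.
  // All three identifiers are placeholders; substitute your own.
  static String workflowStartPath(String namespace, String app, String workflow) {
    return String.format("/v3/namespaces/%s/apps/%s/workflows/%s/start",
                         namespace, app, workflow);
  }

  public static void main(String[] args) {
    // Issue this as an HTTP POST; repeating it while a run is active
    // starts another concurrent run, since Workflows allow concurrency.
    System.out.println("POST " + workflowStartPath("default", "PurchaseApp", "PurchaseWorkflow"));
  }
}
```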

Release Notes

Feature to support concurrent runs of a Spark program

Activity

Terence YimNovember 10, 2016 at 11:42 PM

Terence YimNovember 10, 2016 at 7:29 PM

It is an artificial restriction in the ProgramLifecycleService, which a single-line change would fix.

Ali AnwarAugust 2, 2016 at 11:57 PM

That JIRA is for supporting more than 1 Spark job at a time (i.e. two different spark programs running concurrently).
This JIRA is about supporting two concurrent runs of the same Spark program.

Rohit SinhaAugust 2, 2016 at 11:45 PM

See https://issues.cask.co/browse/CDAP-349. We already support this, don't we?

Resolution: Fixed
Details

Created August 2, 2016 at 11:28 PM
Updated December 17, 2016 at 2:52 AM
Resolved November 11, 2016 at 7:02 AM