
Support concurrent runs of a Spark program

Description

We should be able to support concurrent runs of a Spark program.

Currently, we only allow concurrent runs for Workflow and MapReduce:
https://github.com/caskdata/cdap/blob/release/3.4/cdap-app-fabric/src/main/java/co/cask/cdap/internal/app/services/ProgramLifecycleService.java#L449-L452
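
The restriction roughly amounts to a program-type check of the following shape (a minimal illustrative sketch, not the actual code at the lines linked above; the class and method names here are hypothetical, and the ProgramType enum from cdap-proto is assumed):

```java
import co.cask.cdap.proto.ProgramType;

// Sketch of the kind of check linked above -- not the actual CDAP code.
// The class and method names are hypothetical.
final class ConcurrentRunCheck {

  static boolean isConcurrentRunAllowed(ProgramType type) {
    // Today only Workflow and MapReduce programs may have more than one
    // active run at the same time; every other type, including Spark,
    // is limited to a single concurrent run.
    return type == ProgramType.WORKFLOW || type == ProgramType.MAPREDUCE;
  }

  private ConcurrentRunCheck() { }
}
```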

A workaround is to run the Spark program within a Workflow, which does support concurrent runs.
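
A minimal sketch of that workaround, assuming an existing Spark program class MySparkProgram (the application, workflow, and program names below are placeholders):

```java
import co.cask.cdap.api.app.AbstractApplication;
import co.cask.cdap.api.workflow.AbstractWorkflow;

// Wraps an existing Spark program in a Workflow so that concurrent runs can be
// started at the Workflow level. MySparkProgram is a placeholder for the real
// Spark program class already present in the application.
public class SparkWrapperApp extends AbstractApplication {
  @Override
  public void configure() {
    setName("SparkWrapperApp");
    addSpark(new MySparkProgram());
    addWorkflow(new SparkWrapperWorkflow());
  }

  public static class SparkWrapperWorkflow extends AbstractWorkflow {
    @Override
    protected void configure() {
      setName("SparkWrapperWorkflow");
      // Run the Spark program as the single node of the workflow.
      addSpark("MySparkProgram");
    }
  }
}
```

Starting the Workflow, rather than the Spark program directly, then goes through the path that already permits concurrent runs.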

Relevant documentation that will need updating as a part of this JIRA:
http://docs.cask.co/cdap/3.4.3/en/reference-manual/http-restful-api/lifecycle.html#start-a-program
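
For reference, starting a Spark program twice through the lifecycle REST API documented above looks roughly like this (host, port, namespace, app, and program names are placeholders; until this issue is fixed the second start is expected to be rejected because only one run may be active):

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class StartSparkTwice {
  public static void main(String[] args) throws Exception {
    // Endpoint shape follows the "Start a Program" section of the lifecycle
    // REST API doc linked above; everything in the URL below is a placeholder.
    String endpoint = "http://localhost:10000/v3/namespaces/default"
        + "/apps/MyApp/spark/MySparkProgram/start";
    for (int attempt = 1; attempt <= 2; attempt++) {
      HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
      conn.setRequestMethod("POST");
      // Expected today: the first start succeeds, the second one conflicts
      // because a Spark program may only have a single active run.
      System.out.println("start attempt " + attempt + " -> HTTP " + conn.getResponseCode());
      conn.disconnect();
    }
  }
}
```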

Release Notes

Feature to support concurrent runs of a Spark program

Activity

Rohit Sinha
August 2, 2016, 11:45 PM

See https://issues.cask.co/browse/CDAP-349. We already support this, don't we?

Ali Anwar
August 2, 2016, 11:57 PM

That JIRA is about supporting more than one Spark job at a time (i.e., two different Spark programs running concurrently).
This JIRA is about supporting two concurrent runs of the same Spark program.

Terence Yim
November 10, 2016, 7:29 PM

It is an artificial restriction in the ProgramLifecycleService, which a single-line change would fix.
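
The change being described is presumably along these lines (a hypothetical sketch, not the actual patch): adding Spark to the set of program types that are allowed to have concurrent runs.

```java
import java.util.EnumSet;

import co.cask.cdap.proto.ProgramType;

// Hypothetical sketch of the single-line change described above -- not the
// actual patch. The idea is simply to also treat SPARK as a type that may
// have concurrent runs, alongside WORKFLOW and MAPREDUCE.
final class ConcurrentRunTypes {
  static final EnumSet<ProgramType> ALLOWED =
      EnumSet.of(ProgramType.WORKFLOW, ProgramType.MAPREDUCE, ProgramType.SPARK);

  private ConcurrentRunTypes() { }
}
```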

Terence Yim
November 10, 2016, 11:42 PM

Fixed

Assignee

Terence Yim

Reporter

Ali Anwar

Labels

Docs Impact

None

UX Impact

None

Components

Fix versions

Affects versions

Priority

Major