Assigned
Comments
va...@google.com
ar...@google.com #2
Hi, could you please clarify the issue description or share a screenshot of the problem you are facing?
Description
a)
Pass multiple jars and files as one wildcard path when running "gcloud dataproc jobs submit spark"
The customer (Cx) would like the option to use a central directory for all jars and files when creating a Dataproc cluster, for better readability. Currently, passing them in individually is too cluttered; all the jars are in the same bucket.
According to our public documentation [1], jars can be passed in as a list with --jars=[JAR,…], and files work the same way. There is currently no way to point to a single working directory in which all of them could be kept in one place; see the sketch below.
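To illustrate, a minimal sketch of what this looks like today, assuming the jars and files live in a single bucket; the bucket, cluster, region, class, and file names are illustrative placeholders, not taken from this report:

  gcloud dataproc jobs submit spark \
      --cluster=example-cluster \
      --region=us-central1 \
      --class=org.example.MyJob \
      --jars=gs://example-bucket/jars/dep1.jar,gs://example-bucket/jars/dep2.jar \
      --files=gs://example-bucket/conf/app.conf

The request is to be able to replace the explicit list with a single wildcard such as --jars=gs://example-bucket/jars/*, which is not supported today.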
b)
Please update the documentation for the --id flag of "gcloud dataproc jobs submit spark" under the optional flags section [2]:
--id=<User defined unique ID for the job being submitted>
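For reference, a hedged example of submitting a job with an explicit ID; the cluster, region, class, jar path, and ID values below are assumptions for illustration only:

  gcloud dataproc jobs submit spark \
      --cluster=example-cluster \
      --region=us-central1 \
      --class=org.apache.spark.examples.SparkPi \
      --jars=file:///usr/lib/spark/examples/jars/spark-examples.jar \
      --id=my-spark-job-0001 \
      -- 1000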
[1]
[2]