
[SPARK-4161] Spark shell class path is not correctly set if "spark.driver.extraClassPath" is set in defaults.conf #3050


Closed
witgo wants to merge 3 commits into apache:master from witgo:SPARK-4161

Conversation

witgo
Contributor

@witgo witgo commented Nov 1, 2014

No description provided.

@witgo witgo changed the title Spark shell class path is not correctly set if "spark.driver.extraClassPath" is set in defaults.conf [SPARK-4161] Spark shell class path is not correctly set if "spark.driver.extraClassPath" is set in defaults.conf Nov 1, 2014
@SparkQA

SparkQA commented Nov 1, 2014

Test build #22699 has started for PR 3050 at commit 38890ab.

  • This patch merges cleanly.

@SparkQA

SparkQA commented Nov 1, 2014

Test build #22699 has finished for PR 3050 at commit 38890ab.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/22699/

@SparkQA

SparkQA commented Dec 2, 2014

Test build #24052 has started for PR 3050 at commit 38890ab.

  • This patch merges cleanly.

@SparkQA

SparkQA commented Dec 2, 2014

Test build #24052 has finished for PR 3050 at commit 38890ab.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/24052/

@JoshRosen
Contributor

If this fix is spark-shell-specific, then I think it's clearer to add it to OUR_JAVA_OPTS in the case statement at the start of this file. There should also be a comment referencing the JIRA to explain why we need that option.
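Concretely, that suggestion might look something like this (a sketch only; the case arm and the SPARK_REPL_OPTS variable are assumptions based on the script's existing conventions, not the PR's actual diff):

    # Hypothetical sketch for the case statement near the top of bin/spark-class.
    case "$1" in
      'org.apache.spark.repl.Main')
        # SPARK-4161: the Scala REPL does not use the java classpath by default,
        # so entries from spark.driver.extraClassPath would otherwise be ignored.
        OUR_JAVA_OPTS="$SPARK_JAVA_OPTS $SPARK_REPL_OPTS -Dscala.usejavacp=true"
        ;;
    esac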

@SparkQA

SparkQA commented Dec 5, 2014

Test build #24174 has started for PR 3050 at commit 4f0f572.

  • This patch merges cleanly.

@witgo
Contributor Author

witgo commented Dec 5, 2014

@JoshRosen The code has been updated.

@SparkQA

SparkQA commented Dec 5, 2014

Test build #24174 has finished for PR 3050 at commit 4f0f572.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/24174/

@JoshRosen
Contributor

Ping @andrewor14, do you think this is the right place to put this line?

@@ -109,6 +109,9 @@ else
fi
JAVA_VERSION=$("$RUNNER" -version 2>&1 | grep 'version' | sed 's/.* version "\(.*\)\.\(.*\)\..*"/\1\2/; 1q')

# SPARK-3936: scala does not assume use of the java classpath, so we need to add the "-Dscala.usejavacp=true"
Contributor

Is this the right JIRA? SPARK-3936 seems to refer to a GraphX issue.

Contributor

Ah, good catch; I think this should be SPARK-4161.
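The corrected lines would then plausibly read as follows (a sketch; the variable the flag is appended to is an assumption, since the added line itself is not visible in this excerpt):

    # SPARK-4161: scala does not assume use of the java classpath, so we need
    # to add the "-Dscala.usejavacp=true" flag manually.
    OUR_JAVA_OPTS="$OUR_JAVA_OPTS -Dscala.usejavacp=true"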

@andrewor14
Contributor

@witgo I think this change looks fine, though I don't have sufficient expertise in how Scala sets up its class paths to judge what implications it might have. I actually think we should do this in bin/spark-shell itself, because it is specific to the shell, not to bin/spark-class in general. Could you add it there instead and test whether it works?

I suppose this is better than adding it manually in SparkILoop, because there we would have to worry about the different Scala versions, and that code is not at all trivial to tear apart.
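For illustration, moving the flag into bin/spark-shell might look like this (a sketch; the SPARK_SUBMIT_OPTS variable is an assumption about how the shell script forwards options to spark-submit):

    # In bin/spark-shell:
    # SPARK-4161: the Scala REPL does not pick up the java classpath on its own,
    # so spark.driver.extraClassPath entries set in spark-defaults.conf would
    # otherwise be invisible to the shell.
    SPARK_SUBMIT_OPTS="$SPARK_SUBMIT_OPTS -Dscala.usejavacp=true"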

@witgo
Contributor Author

witgo commented Dec 10, 2014

OK, I'll try. But CoarseMesosSchedulerBackend.scala#L156 also depends on this option.

@SparkQA

SparkQA commented Dec 10, 2014

Test build #24291 has started for PR 3050 at commit abb6fa4.

  • This patch merges cleanly.

@witgo
Contributor Author

witgo commented Dec 10, 2014

In my local test, it works.
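For anyone wanting to verify the fix, a local test along these lines should work (the jar path and class name below are placeholders):

    # Reproduce: make the driver pick up an extra jar from spark-defaults.conf
    # rather than from --jars.
    echo 'spark.driver.extraClassPath /path/to/extra.jar' >> conf/spark-defaults.conf

    # Before the fix, classes in extra.jar could not be loaded from the REPL,
    # because the Scala interpreter ignored the java classpath.
    ./bin/spark-shell
    # scala> Class.forName("com.example.SomeClass")   // should now resolve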

@SparkQA

SparkQA commented Dec 10, 2014

Test build #24291 has finished for PR 3050 at commit abb6fa4.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@AmplabJenkins

Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/24291/

@andrewor14
Contributor

OK, LGTM. I'll merge this into master and backport it into branch-1.2 later.

@asfgit asfgit closed this in 742e709 Dec 10, 2014
@witgo witgo deleted the SPARK-4161 branch December 11, 2014 01:42
asfgit pushed a commit that referenced this pull request Jan 21, 2015
[SPARK-4161] Spark shell class path is not correctly set if "spark.driver.extraClassPath" is set in defaults.conf

Author: GuoQiang Li <[email protected]>

Closes #3050 from witgo/SPARK-4161 and squashes the following commits:

abb6fa4 [GuoQiang Li] move usejavacp opt to spark-shell
89e39e7 [GuoQiang Li] review commit
c2a6f04 [GuoQiang Li] Spark shell class path is not correctly set if "spark.driver.extraClassPath" is set in defaults.conf