
[SPARK-9672][MESOS] Don’t include SPARK_ENV_LOADED when passing env vars #7979


Closed
wants to merge 4 commits

Conversation

pashields
Contributor

This contribution is my original work and I license the work to the project under the project's open source license.

@JoshRosen
Contributor

/cc @tnachen and @andrewor14 for Mesos-related patch triage.

@tnachen
Contributor

tnachen commented Aug 11, 2015

Can we also avoid filtering anything prefixed with MESOS_? Users set MESOS_NATIVE_JAVA_LIBRARY in spark-env.sh, and this could filter it out.

@pashields
Contributor Author

Thanks for the review, @tnachen.

Anything set in spark-env.sh shouldn't be filtered now that SPARK_ENV_LOADED isn't being sent, since spark-env.sh will be re-sourced on the driver side. I added some code to make sure MESOS_* vars are passed through anyway, so they can still be set elsewhere.
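
For reference, a minimal sketch in Scala of the filtering behavior described above. The exact predicate in the patch may differ; the prefix checks and the SPARK_ENV_LOADED exclusion shown here are inferred from this thread rather than copied from the diff.

def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
  // Keep Spark and Mesos variables, but drop SPARK_ENV_LOADED so that
  // spark-env.sh is re-sourced on the driver side.
  env.filter { case (key, _) =>
    (key.startsWith("SPARK_") && key != "SPARK_ENV_LOADED") || key.startsWith("MESOS_")
  }
}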

@andrewor14
Contributor

@tnachen any follow-ups on this?

@tnachen
Contributor

tnachen commented Sep 1, 2015

This LGTM

@andrewor14
Contributor

ok to test

/**
* Filter non-spark environment variables from any environment.
*/
def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
Contributor

please make this private

Contributor Author

There is a test for it, so I'd need to make it visible to the test somehow or remove the test.

I've changed it to protected[rest]. LMK if you have an alternate solution or would like me to make it private and remove the test.

@SparkQA

SparkQA commented Sep 3, 2015

Test build #41959 has finished for PR 7979 at commit 28a34b8.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@pashields
Contributor Author

Thanks for the review @andrewor14.

@SparkQA

SparkQA commented Sep 3, 2015

Test build #41966 has finished for PR 7979 at commit 539988c.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

/**
* Filter non-spark environment variables from any environment.
*/
protected[rest] def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
Contributor

this can be private[rest], I've fixed that on merge.
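
As an aside, a minimal sketch of why a package-qualified modifier keeps the method reachable from the test. The enclosing object, package path, and test usage here are illustrative rather than taken from the patch; only the visibility behavior is the point.

package org.apache.spark.deploy.rest

private[rest] object EnvFilter {
  // private[rest] makes this visible to any code compiled into the rest
  // package, including test suites declared in that same package, while
  // keeping it out of the public API.
  private[rest] def filterSystemEnvironment(env: Map[String, String]): Map[String, String] =
    env.filter { case (key, _) => key.startsWith("SPARK_") && key != "SPARK_ENV_LOADED" }
}

A test suite declared in the same package can call filterSystemEnvironment directly, so a plain private (which would have required removing the test) isn't needed, and protected[rest] only adds subclass access on top of the package access the test actually relies on.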

Contributor Author

Thanks for helping to guide this through. Apologies for my lack of Scala knowledge.

Contributor

no worries, hope to see more patches from you!

@asfgit closed this in e62f4a4 on Sep 3, 2015