Conversation

witgo
Contributor

@witgo witgo commented Jun 28, 2014

No description provided.

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build finished. All automated tests passed.

@AmplabJenkins

All automated tests passed.
Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16228/

@AmplabJenkins

Merged build finished. All automated tests passed.

@AmplabJenkins

All automated tests passed.
Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16229/

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build finished. All automated tests passed.

@AmplabJenkins

All automated tests passed.
Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16267/

}

/** Load any spark.* system properties */
private[spark] def loadSystemProperties(isOverride: Boolean = true) {
Contributor

This is only called in one place, why the argument? Seems like we always want system properties to override other settings anyway.
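For readers following along, the method under discussion copies `spark.*`-prefixed JVM system properties into the configuration. A minimal sketch of that behavior (simplified, not the PR's exact code) might look like:

```scala
import scala.collection.mutable

// Hedged sketch of the behavior under review: copy every JVM system
// property whose key starts with "spark." into the settings map,
// overwriting values that came from other sources.
def loadSparkSystemProperties(settings: mutable.Map[String, String]): Unit = {
  for ((key, value) <- sys.props if key.startsWith("spark.")) {
    settings(key) = value
  }
}

// A system property wins over a value loaded earlier (e.g. from a file):
sys.props("spark.app.name") = "from-system"
val conf = mutable.Map("spark.app.name" -> "from-file")
loadSparkSystemProperties(conf)
```

This is the "system properties override other settings" behavior the reviewer suggests making unconditional.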

@vanzin
Contributor

vanzin commented Jul 8, 2014

I'd like to see better documentation about the semantics of how the different sources for configuration override each other. Also, some tests would be nice.
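One way to make such override semantics concrete — purely an illustration of the layering vanzin asks to have documented, not a statement of Spark's actual order — is to merge the sources so that later ones win on key collisions:

```scala
// Illustration only: a layering in which later sources override earlier
// ones. Whether this matches Spark's real precedence is exactly what the
// review is asking to have documented.
def resolveConf(fileProps: Map[String, String],
                sysProps: Map[String, String],
                explicitSets: Map[String, String]): Map[String, String] =
  fileProps ++ sysProps ++ explicitSets  // rightmost map wins on collisions

val merged = resolveConf(
  fileProps    = Map("spark.master" -> "local", "spark.app.name" -> "file"),
  sysProps     = Map("spark.app.name" -> "sys"),
  explicitSets = Map.empty
)
```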

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

@AmplabJenkins

Merged build finished.

@AmplabJenkins

Refer to this link for build results: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/16465/

@witgo
Contributor Author

witgo commented Jul 9, 2014

Jenkins, retest this please.

@AmplabJenkins

Merged build triggered.

@AmplabJenkins

Merged build started.

val message = "Failed when loading Spark properties"
throw new SparkException(message, e)
} finally {
inReader.close()
Contributor

If you're keeping this, you should document that this method will close the given input stream.
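A sketch of the shape such a method might take (assumed form, not the PR's exact code), with the close-the-stream behavior the comment asks to document made explicit:

```scala
import java.io.{InputStream, InputStreamReader}
import java.util.Properties

// Hedged sketch: load key/value pairs from a stream into an immutable Map.
// Note that the reader — and therefore the given InputStream — is always
// closed, which is the behavior the review comment wants documented.
def getPropertiesFromStream(in: InputStream): Map[String, String] = {
  val reader = new InputStreamReader(in, "UTF-8")
  try {
    val props = new Properties()
    props.load(reader)
    var result = Map.empty[String, String]
    val names = props.stringPropertyNames().iterator()
    while (names.hasNext) {
      val k = names.next()
      result += (k -> props.getProperty(k))
    }
    result
  } finally {
    reader.close()  // closes the underlying InputStream as well
  }
}

// e.g. loading from an in-memory stream:
val loaded = getPropertiesFromStream(
  new java.io.ByteArrayInputStream("spark.master=local[2]\n".getBytes("UTF-8")))
```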

test("loading from file") {
val outFile = File.createTempFile("sparkConf-loading-from-file", "")
outFile.deleteOnExit()
val outStream: FileWriter = new FileWriter(outFile)
Contributor

You could use this method instead.
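For context, the test fragment above writes a temporary properties file and reads it back. An end-to-end sketch of that pattern (property key illustrative) is:

```scala
import java.io.{File, FileInputStream, FileWriter}
import java.util.Properties

// Hedged sketch of the test pattern: write one property to a temp file,
// then load it back with java.util.Properties.
val outFile = File.createTempFile("sparkConf-loading-from-file", "")
outFile.deleteOnExit()
val writer = new FileWriter(outFile)
try writer.write("spark.test.fileNameLoadB true\n") finally writer.close()

val props = new Properties()
val in = new FileInputStream(outFile)
try props.load(in) finally in.close()
```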

@vanzin
Contributor

vanzin commented Jul 30, 2014

Still a few nits left, but looks ok to me.

@SparkQA

SparkQA commented Jul 31, 2014

QA tests have started for PR 1256. This patch merges cleanly.
View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17539/consoleFull

@SparkQA

SparkQA commented Jul 31, 2014

QA results for PR 1256:
- This patch PASSES unit tests.
- This patch merges cleanly
- This patch adds the following public classes (experimental):
class SparkConf(loadDefaults: Boolean, fileName: Option[String])

For more information see test output:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17539/consoleFull

@SparkQA

SparkQA commented Jul 31, 2014

QA tests have started for PR 1256. This patch merges cleanly.
View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17555/consoleFull

@SparkQA

SparkQA commented Jul 31, 2014

QA results for PR 1256:
- This patch FAILED unit tests.
- This patch merges cleanly
- This patch adds the following public classes (experimental):
class SparkConf(loadDefaults: Boolean, fileName: Option[String])

For more information see test output:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17555/consoleFull

@witgo
Contributor Author

witgo commented Jul 31, 2014

Jenkins, retest this please.

@SparkQA

SparkQA commented Jul 31, 2014

QA tests have started for PR 1256. This patch merges cleanly.
View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17564/consoleFull

@SparkQA

SparkQA commented Jul 31, 2014

QA results for PR 1256:
- This patch PASSES unit tests.
- This patch merges cleanly
- This patch adds the following public classes (experimental):
class SparkConf(loadDefaults: Boolean, fileName: Option[String])

For more information see test output:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/17564/consoleFull

@SparkQA

SparkQA commented Aug 11, 2014

QA tests have started for PR 1256. This patch merges cleanly.
View progress: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18302/consoleFull

@SparkQA

SparkQA commented Aug 11, 2014

QA results for PR 1256:
- This patch PASSES unit tests.
- This patch merges cleanly
- This patch adds the following public classes (experimental):
class SparkConf(loadDefaults: Boolean, fileName: Option[String])

For more information see test output:
https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/18302/consoleFull

@vanzin
Contributor

vanzin commented Aug 22, 2014

@pwendell @andrewor14 could you guys take a look at this PR? Thanks!

<td>spark.history.fs.logDirectory</td>
<td>(none)</td>
<td>
Directory where app logs are stored.
Contributor

How about "Directory that contains application event logs to be loaded by the history server"
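For context, this property is typically paired with event logging in `conf/spark-defaults.conf`; the values below are illustrative only:

```
# conf/spark-defaults.conf (illustrative values)
spark.eventLog.enabled           true
spark.history.fs.logDirectory    file:/tmp/spark-events
```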

@andrewor14
Contributor

@witgo I like the idea of having all processes (HistoryServer, Master, Worker etc.) read from the properties file in addition to just SparkSubmit. However, I don't think this scope should include SparkConf, which is more application-specific. It would be good to keep the existing code in SparkSubmitArguments and only move the part where we actually read properties from the file into Utils; then all the other processes can handle --properties-file the same way. Do you understand what I mean?
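The refactoring described here implies each daemon's argument parser picks up `--properties-file` uniformly. A hedged sketch of such a shared scan (names illustrative, not the code eventually merged in #2379):

```scala
// Illustrative helper that any process's argument parser could share:
// scan the raw argument list for --properties-file and return its value.
def findPropertiesFile(args: List[String]): Option[String] = args match {
  case "--properties-file" :: path :: _ => Some(path)
  case _ :: tail                        => findPropertiesFile(tail)
  case Nil                              => None
}

val propsFile = findPropertiesFile(
  List("--host", "example.com", "--properties-file", "conf/spark-defaults.conf"))
```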

@witgo
Contributor Author

witgo commented Sep 12, 2014

OK, I generally understand what you mean. I will re-implement the feature over the weekend.

@witgo witgo changed the title from "SPARK-2098: All Spark processes should support spark-defaults.conf, config file" to "[SPARK-2098] All Spark processes should support spark-defaults.conf, config file" Sep 13, 2014
@andrewor14
Contributor

@witgo would you mind closing this since you opened another one?

@witgo
Contributor Author

witgo commented Sep 13, 2014

OK

@witgo witgo closed this Sep 13, 2014
asfgit pushed a commit that referenced this pull request Oct 15, 2014
…config file

This is another implementation of #1256
cc andrewor14 vanzin

Author: GuoQiang Li <[email protected]>

Closes #2379 from witgo/SPARK-2098-new and squashes the following commits:

4ef1cbd [GuoQiang Li] review commit
49ef70e [GuoQiang Li] Refactor getDefaultPropertiesFile
c45d20c [GuoQiang Li] All Spark processes should support spark-defaults.conf, config file
@witgo witgo deleted the SPARK-2098 branch October 15, 2014 10:22
wangyum pushed a commit that referenced this pull request May 26, 2023
* [CARMEL-6541] Support Query Level SQL Conf leveraging Hint

* fix code style

* fix ut

* Add one more test

Co-authored-by: Chao Sun <[email protected]>