[SPARK-6046] [core] Reorganize deprecated config support in SparkConf. #5514
Conversation
This change tries to follow the chosen way for handling deprecated configs in SparkConf: all values (old and new) are kept in the conf object, and newer names take precedence over older ones when retrieving the value. Warnings are logged when config options are set, which generally happens on the driver node (where the logs are most visible).
In the process, deprecate the name that mentions the units now that we can define the unit in the config value itself. Also tweak a couple of messages.
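The precedence rule described above can be sketched as follows. This is a minimal illustration, not Spark's actual implementation; the `DeprecatedConfSketch` object and its alias table are hypothetical, using the config names from this PR as the example.

```scala
// Sketch of the deprecated-config handling described in the PR: values under
// both the old and the new name stay in the map, and the newer name takes
// precedence on retrieval. The alias table here is hypothetical.
object DeprecatedConfSketch {
  // Hypothetical mapping: new key -> deprecated key it supersedes.
  private val deprecatedAliases = Map(
    "spark.history.fs.update.interval" -> "spark.history.fs.update.interval.seconds")

  def get(settings: Map[String, String], key: String): Option[String] = {
    settings.get(key) // newer name wins if present
      .orElse(deprecatedAliases.get(key).flatMap(settings.get)) // fall back to the old name
  }
}
```

With both keys set, a lookup under the new name returns the new value; with only the old key set, the lookup falls back to it.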
Test build #30282 has finished for PR 5514 at commit
@@ -86,11 +86,12 @@ follows:
</td>
</tr>
<tr>
<td>spark.history.fs.update.interval.seconds</td>
<td>10</td>
<td>spark.history.fs.update.interval</td>
oops, good catch.
Actually, not a catch. I'm adding the new config name (dropping the ".seconds" suffix) since now that we can set the value with units, the suffix looks weird.
good catch in the sense that a previous PR #5236 missed this config. It's bad if we have documented time configs that can be set one way but not the other.
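Setting a time config "with units" (so the `.seconds` suffix in the key becomes redundant) can be sketched as below. This is a hedged illustration of the idea only; Spark has its own time-string utilities, and `parseTimeAsSeconds` is a hypothetical helper, not Spark's parser.

```scala
// Hypothetical helper: parse a duration string with an optional unit suffix,
// treating a bare number as seconds. Illustrates why a key like
// "spark.history.fs.update.interval.seconds" reads oddly once the value
// itself can carry a unit.
def parseTimeAsSeconds(v: String): Long = {
  val s = v.trim.toLowerCase
  if (s.endsWith("ms")) s.dropRight(2).toLong / 1000     // milliseconds, truncated
  else if (s.endsWith("s")) s.dropRight(1).toLong        // seconds
  else if (s.endsWith("m")) s.dropRight(1).toLong * 60   // minutes
  else if (s.endsWith("h")) s.dropRight(1).toLong * 3600 // hours
  else s.toLong                                          // bare number: assume seconds
}
```

Under this scheme, `spark.history.fs.update.interval=10s` and the legacy `spark.history.fs.update.interval.seconds=10` mean the same thing.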
@@ -134,13 +135,16 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
/** Set multiple parameters together */
def setAll(settings: Traversable[(String, String)]): SparkConf = {
settings.foreach { case (k, v) => logDeprecationWarning(k) }
Not your change, but I wonder if this one should just keep calling `set`. This currently doesn't handle null keys or values.
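The reviewer's suggestion, routing `setAll` through `set` so the null checks (and any deprecation warning) live in one place, can be sketched like this. `MiniConf` is a hypothetical stand-in for illustration, not `SparkConf` itself.

```scala
import scala.collection.mutable

// Sketch of the review suggestion: setAll delegates to set, so validation
// and deprecation logging are centralized. MiniConf is hypothetical.
class MiniConf {
  private val settings = mutable.HashMap[String, String]()

  def set(key: String, value: String): MiniConf = {
    require(key != null, "null key")
    require(value != null, s"null value for $key")
    // a logDeprecationWarning(key) call would also belong here
    settings(key) = value
    this
  }

  def setAll(kvs: Iterable[(String, String)]): MiniConf = {
    kvs.foreach { case (k, v) => set(k, v) } // reuse set's validation
    this
  }

  def get(key: String): Option[String] = settings.get(key)
}
```

A `setAll` with a null value then fails fast with the same error a direct `set` would raise, instead of silently storing a null.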
This is great. I think this is mergeable as is aside from one functional comment, though I'd like to see the rest of the code style comments addressed as well.
yeah, let me fix those (especially the seconds vs. ms one).
Test build #30355 has finished for PR 5514 at commit
Test build #30367 has finished for PR 5514 at commit
Test build #30366 has finished for PR 5514 at commit
LGTM