[SPARK-7511] [MLLIB] pyspark ml seed param should be random by default or 42 is quite funny but not very random
Author: Holden Karau <[email protected]>
Closes apache#6139 from holdenk/SPARK-7511-pyspark-ml-seed-param-should-be-random-by-default-or-42-is-quite-funny-but-not-very-random and squashes the following commits:
591f8e5 [Holden Karau] specify old seed for doc tests
2470004 [Holden Karau] Fix a bunch of seeds with default values to have None as the default which will then result in using the hash of the class name
cbad96d [Holden Karau] Add the setParams function that is used in the real code
423b8d7 [Holden Karau] Switch the test code to behave slightly more like production code; also don't check the param map value, only check for key existence
140d25d [Holden Karau] remove extra space
926165a [Holden Karau] Add some missing newlines for pep8 style
8616751 [Holden Karau] merge in master
58532e6 [Holden Karau] it's the __name__ method; also treat None values as not set
56ef24a [Holden Karau] fix test and regenerate base
afdaa5c [Holden Karau] make sure different classes have different results
68eb528 [Holden Karau] switch default seed to hash of type of self
89c4611 [Holden Karau] Merge branch 'master' into SPARK-7511-pyspark-ml-seed-param-should-be-random-by-default-or-42-is-quite-funny-but-not-very-random
31cd96f [Holden Karau] specify the seed to randomforestregressor test
e1b947f [Holden Karau] Style fixes
ce90ec8 [Holden Karau] merge in master
bcdf3c9 [Holden Karau] update docstring seeds to none and some other default seeds from 42
65eba21 [Holden Karau] pep8 fixes
0e3797e [Holden Karau] Make seed default to random in more places
213a543 [Holden Karau] Simplify the generated code to only include set default if there is a default, rather than having "None is not None" in the generated code
1ff17c2 [Holden Karau] Make the seed random for HasSeed in python
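
The gist of the change, per the commits above: HasSeed in pyspark.ml no longer defaults its seed param to 42; the default is derived from a hash of the class name, so each estimator class gets a distinct, class-stable seed. Below is a minimal standalone sketch of that pattern; it only mimics the real pyspark.ml Param/Params machinery and is not the patched code itself.

# Standalone sketch of the seeding idea described in the commits above; the
# classes here only mimic the real pyspark.ml Params API.
class HasSeed(object):
    def __init__(self):
        super(HasSeed, self).__init__()
        # Derive the default seed from the class name so different estimator
        # classes get different defaults instead of a hard-coded 42. Note that
        # str hashing is randomized across interpreter runs in Python 3 unless
        # PYTHONHASHSEED is fixed, so the value is only stable within a run.
        self._defaults = {"seed": hash(type(self).__name__)}

    def getSeed(self):
        return self._defaults["seed"]


class RandomForestRegressorLike(HasSeed):
    pass


class GBTRegressorLike(HasSeed):
    pass


if __name__ == "__main__":
    # Different classes yield different default seeds.
    print(RandomForestRegressorLike().getSeed())
    print(GBTRegressorLike().getSeed())
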
         #: param for Column name for predicted class conditional probabilities. Note: Not all models output well-calibrated probability estimates! These probabilities should be treated as confidences, not precise probabilities.
         self.probabilityCol = Param(self, "probabilityCol", "Column name for predicted class conditional probabilities. Note: Not all models output well-calibrated probability estimates! These probabilities should be treated as confidences, not precise probabilities.")
-        if 'probability' is not None:
-            self._setDefault(probabilityCol='probability')
+        self._setDefault(probabilityCol='probability')

     def setProbabilityCol(self, value):
         """
@@ -206,8 +198,7 @@ def __init__(self):
         super(HasRawPredictionCol, self).__init__()
         #: param for raw prediction (a.k.a. confidence) column name
         #: param for the convergence tolerance for iterative algorithms
         self.tol = Param(self, "tol", "the convergence tolerance for iterative algorithms")
-        if None is not None:
-            self._setDefault(tol=None)

     def setTol(self, value):
         """
@@ -438,8 +416,6 @@ def __init__(self):
         super(HasStepSize, self).__init__()
         #: param for Step size to be used for each iteration of optimization.
         self.stepSize = Param(self, "stepSize", "Step size to be used for each iteration of optimization.")
-        if None is not None:
-            self._setDefault(stepSize=None)

     def setStepSize(self, value):
         """
@@ -467,6 +443,7 @@ class DecisionTreeParams(Params):
     minInfoGain = Param(Params._dummy(), "minInfoGain", "Minimum information gain for a split to be considered at a tree node.")
     maxMemoryInMB = Param(Params._dummy(), "maxMemoryInMB", "Maximum memory in MB allocated to histogram aggregation.")
     cacheNodeIds = Param(Params._dummy(), "cacheNodeIds", "If false, the algorithm will pass trees to executors to match instances with nodes. If true, the algorithm will cache node IDs for each instance. Caching can speed up training of deeper trees.")