
Commit 2a41b9b

Merge pull request #1073 from alan-turing-institute/dev
For a 0.20.2 release
2 parents: 6e45c5d and 4c021a9

File tree

4 files changed: +33 −28 lines

Project.toml

Lines changed: 4 additions & 3 deletions
@@ -1,7 +1,7 @@
 name = "MLJ"
 uuid = "add582a8-e3ab-11e8-2d5e-e98b27df1bc7"
 authors = ["Anthony D. Blaom <[email protected]>"]
-version = "0.20.1"
+version = "0.20.2"
 
 [deps]
 CategoricalArrays = "324d7699-5711-5eae-9e2f-1d82baa6b597"
@@ -31,10 +31,10 @@ Tables = "bd369af6-aec1-5ad0-b16a-f7cc5008161c"
 CategoricalArrays = "0.8,0.9, 0.10"
 ComputationalResources = "0.3"
 Distributions = "0.21,0.22,0.23, 0.24, 0.25"
-MLJBase = "1"
 MLJBalancing = "0.1"
+MLJBase = "1"
 MLJEnsembles = "0.4"
-MLJFlow = "0.2"
+MLJFlow = "0.3"
 MLJIteration = "0.6"
 MLJModels = "0.16"
 MLJTuning = "0.8"
@@ -43,6 +43,7 @@ ProgressMeter = "1.1"
 Reexport = "1.2"
 ScientificTypes = "3"
 StatisticalMeasures = "0.1"
+Statistics = "1"
 StatsBase = "0.32,0.33, 0.34"
 Tables = "0.2,1.0"
 julia = "1.6"

docs/src/adding_models_for_general_use.md

Lines changed: 25 additions & 21 deletions
@@ -1209,35 +1209,39 @@ Your document string must include the following components, in order:
 Unsupervised models implement the MLJ model interface in a very
 similar fashion. The main differences are:
 
-- The `fit` method has only one training argument `X`, as in `MLJModelInterface.fit(model,
-  verbosity, X)`. However, it has the same return value `(fitresult, cache, report)`. An
-  `update` method (e.g., for iterative models) can be optionally implemented in the same
-  way. For models that subtype `Static <: Unsupervised` (see also [Static
-  transformers](@ref) `fit` has no training arguments but does not need to be implemented
-  as a fallback returns `(nothing, nothing, nothing)`.
-
-- A `transform` method is compulsory and has the same signature as
-  `predict`, as in `MLJModelInterface.transform(model, fitresult, Xnew)`.
+- The `fit` method, which still returns `(fitresult, cache, report)`, will typically have
+  only one training argument `X`, as in `MLJModelInterface.fit(model, verbosity, X)`,
+  although this is not a hard requirement. For example, a feature selection tool (wrapping
+  some supervised model) might also include a target `y` as input. Furthermore, in the
+  case of models that subtype `Static <: Unsupervised` (see also [Static
+  transformers](@ref)) `fit` has no training arguments at all, but does not need to be
+  implemented, as a fallback returns `(nothing, nothing, nothing)`.
+
+- A `transform` and/or `predict` method is implemented, and has the same signature as
+  `predict` does in the supervised case, as in `MLJModelInterface.transform(model,
+  fitresult, Xnew)`. However, it may only have one data argument `Xnew`, unless `model <:
+  Static`, in which case there is no restriction. A use-case for `predict` is K-means
+  clustering that `predict`s labels and `transform`s input features into a space of
+  lower dimension. See [Transformers that also predict](@ref) for an example.
 
-- Instead of defining the `target_scitype` trait, one declares an
-  `output_scitype` trait (see above for the meaning).
+- The `target_scitype` trait continues to refer to the output of `predict`, if
+  implemented, while a trait, `output_scitype`, is for the output of `transform`.
 
 - An `inverse_transform` can be optionally implemented. The signature
   is the same as `transform`, as in
   `MLJModelInterface.inverse_transform(model, fitresult, Xout)`, which:
-
   - must make sense for any `Xout` for which `scitype(Xout) <:
-    output_scitype(SomeSupervisedModel)` (see below); and
-
+    output_scitype(SomeSupervisedModel)` (see below); and
   - must return an object `Xin` satisfying `scitype(Xin) <:
-    input_scitype(SomeSupervisedModel)`.
+    input_scitype(SomeSupervisedModel)`.
+
+  For sample implementations, see MLJ's [built-in
+  transformers](https://github.com/JuliaAI/MLJModels.jl/blob/dev/src/builtins/Transformers.jl)
+  and the clustering models at
+  [MLJClusteringInterface.jl](https://github.com/jbrea/MLJClusteringInterface.jl).
 
-- A `predict` method may be optionally implemented, and has the same
-  signature as for supervised models, as in
-  `MLJModelInterface.predict(model, fitresult, Xnew)`. A use-case is
-  clustering algorithms that `predict` labels and `transform` new
-  input features into a space of lower dimension. See [Transformers
-  that also predict](@ref) for an example.
 
 ## Static models (models that do not generalize)
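
The revised documentation above can be illustrated with a minimal sketch of an unsupervised model, assuming the `MLJModelInterface` API it describes. `MaxAbsScaler` is a hypothetical model invented for illustration (it is not part of MLJ); it rescales each column of a table by its maximum absolute value.

```julia
# Minimal unsupervised model sketch, assuming the MLJModelInterface API
# described above. `MaxAbsScaler` is hypothetical, not part of MLJ.
import MLJModelInterface
const MMI = MLJModelInterface

mutable struct MaxAbsScaler <: MMI.Unsupervised end

# As documented: `fit` typically has a single training argument `X`
# and returns `(fitresult, cache, report)`:
function MMI.fit(::MaxAbsScaler, verbosity, X)
    Xm = MMI.matrix(X)   # tabular input -> matrix
    # per-column scale, guarding against all-zero columns:
    scales = [max(maximum(abs, col), eps()) for col in eachcol(Xm)]
    return scales, nothing, nothing
end

# `transform` has the signature `predict` has in the supervised case:
function MMI.transform(::MaxAbsScaler, fitresult, Xnew)
    Xm = MMI.matrix(Xnew)
    return MMI.table(Xm ./ permutedims(fitresult))
end

# the trait analogous to `target_scitype`, but for `transform` output:
MMI.input_scitype(::Type{<:MaxAbsScaler})  = MMI.Table(MMI.Continuous)
MMI.output_scitype(::Type{<:MaxAbsScaler}) = MMI.Table(MMI.Continuous)
```

Note that `MMI.matrix` and `MMI.table` only become operational once a package implementing the full interface (in practice, MLJBase) is loaded; model-providing packages depend only on the lightweight MLJModelInterface.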

docs/src/tuning_models.md

Lines changed: 3 additions & 3 deletions
@@ -6,13 +6,13 @@ advanced keyword options.
 
 tuning strategy | notes |package to import | package providing the core algorithm
 ----------------|-------|------------------|----------------------------------
-[`Grid`](@ref)`(goal=nothing, resolution=10)` | shuffled by default; `goal` is upper bound for number of grid points | MLJ.jl or MLJTuning.jl | [MLJTuning.jl](https://github.com/FluxML/model-zoo)
-[`RandomSearch`](@ref)`(rng=GLOBAL_RNG)` | with customizable priors |MLJ.jl or MLJTuning.jl | [MLJTuning.jl](https://github.com/FluxML/model-zoo)
+[`Grid`](@ref)`(goal=nothing, resolution=10)` | shuffled by default; `goal` is upper bound for number of grid points | MLJ.jl or MLJTuning.jl | [MLJTuning.jl](https://github.com/JuliaAI/MLJTuning.jl)
+[`RandomSearch`](@ref)`(rng=GLOBAL_RNG)` | with customizable priors |MLJ.jl or MLJTuning.jl | [MLJTuning.jl](https://github.com/JuliaAI/MLJTuning.jl)
 [`LatinHypercube`](@ref)`(rng=GLOBAL_RNG)` | with discrete parameter support | MLJ.jl or MLJTuning.jl | [LatinHypercubeSampling](https://github.com/MrUrq/LatinHypercubeSampling.jl)
 `MLJTreeParzenTuning()` | See this [example](https://github.com/IQVIA-ML/TreeParzen.jl/blob/master/docs/examples/simple_mlj_demo/simple_mlj_demo.md) for usage | TreeParzen.jl | [TreeParzen.jl](https://github.com/IQVIA-ML/TreeParzen.jl) (port to Julia of [hyperopt](http://hyperopt.github.io/hyperopt/))
 `ParticleSwarm(n_particles=3, rng=GLOBAL_RNG)` | Standard Kennedy-Eberhart algorithm, plus discrete parameter support | MLJParticleSwarmOptimization.jl | [MLJParticleSwarmOptimization.jl](https://github.com/JuliaAI/MLJParticleSwarmOptimization.jl/)
 `AdaptiveParticleSwarm(n_particles=3, rng=GLOBAL_RNG)` | Zhan et al. variant with automated swarm coefficient updates, plus discrete parameter support | MLJParticleSwarmOptimization.jl | [MLJParticleSwarmOptimization.jl](https://github.com/JuliaAI/MLJParticleSwarmOptimization.jl/)
-`Explicit()` | For an [explicit list](@ref explicit) of models of varying type | MLJ.jl or MLJTuning.jl | [MLJTuning.jl](https://github.com/FluxML/model-zoo)
+`Explicit()` | For an [explicit list](@ref explicit) of models of varying type | MLJ.jl or MLJTuning.jl | [MLJTuning.jl](https://github.com/JuliaAI/MLJTuning.jl)
 
 Below we illustrate hyperparameter optimization using the
 [`Grid`](@ref), [`RandomSearch`](@ref), [`LatinHypercube`](@ref) and
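
As a quick illustration of the corrected table entries, here is a sketch of `Grid` tuning wrapped in a `TunedModel`. It assumes MLJ and the DecisionTree.jl glue package are installed; the model and the `max_depth` range are arbitrary choices for demonstration.

```julia
# Sketch of Grid tuning per the table above. Assumes MLJ and
# DecisionTree.jl are installed; the model choice is illustrative only.
using MLJ

Tree = @load DecisionTreeClassifier pkg=DecisionTree verbosity=0
tree = Tree()

# a one-dimensional numeric range for the grid to sweep:
r = range(tree, :max_depth, lower=1, upper=5)

tuned = TunedModel(
    model=tree,
    tuning=Grid(resolution=5),   # defaults are goal=nothing, resolution=10
    resampling=CV(nfolds=3),
    range=r,
    measure=log_loss,
)

X, y = @load_iris
mach = machine(tuned, X, y)
fit!(mach, verbosity=0)
best = fitted_params(mach).best_model   # model with the best max_depth
```

The same `TunedModel` wrapper accepts any strategy from the table, e.g. `tuning=RandomSearch()`, without other changes.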

test/exported_names.jl

Lines changed: 1 addition & 1 deletion
@@ -35,7 +35,7 @@ Save()
 
 # MLJFlow
 
-MLFlowLogger
+MLJFlow.Logger
 
 # StatisticalMeasures
4141
