
Commit 02170ed: Fix advanced usage guide (#1785)
1 parent: 25a32ef

1 file changed: docs/src/using-turing/advanced.md (+21 −32 lines)
@@ -20,8 +20,7 @@ First, define a type of the distribution, as a subtype of a corresponding distri

 ```julia
-struct CustomUniform <: ContinuousUnivariateDistribution
-end
+struct CustomUniform <: ContinuousUnivariateDistribution end
 ```

 ### 2. Implement Sampling and Evaluation of the log-pdf
@@ -31,8 +30,11 @@ Second, define `rand` and `logpdf`, which will be used to run the model.

 ```julia
-Distributions.rand(rng::AbstractRNG, d::CustomUniform) = rand(rng) # sample in [0, 1]
-Distributions.logpdf(d::CustomUniform, x::Real) = zero(x) # p(x) = 1 → logp(x) = 0
+# sample in [0, 1]
+Distributions.rand(rng::AbstractRNG, d::CustomUniform) = rand(rng)
+
+# p(x) = 1 → logp(x) = 0
+Distributions.logpdf(d::CustomUniform, x::Real) = zero(x)
 ```

 ### 3. Define Helper Functions
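For reference (not part of the commit): the snippet as it reads after this change can be exercised as below, assuming the `CustomUniform` type from step 1.

```julia
using Distributions
using Random

struct CustomUniform <: ContinuousUnivariateDistribution end

# sample in [0, 1]
Distributions.rand(rng::AbstractRNG, d::CustomUniform) = rand(rng)

# p(x) = 1 → logp(x) = 0
Distributions.logpdf(d::CustomUniform, x::Real) = zero(x)

# Draw a sample and evaluate its log density.
d = CustomUniform()
x = rand(Random.default_rng(), d)
lp = logpdf(d, x)
```

`rand` delegates to the standard uniform sampler, so the draw always lies in `[0, 1]`, and `logpdf` is identically zero there.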
@@ -72,20 +74,10 @@ and then we can call `rand` and `logpdf` as usual, where

 To read more about Bijectors.jl, check out [the project README](https://github.com/TuringLang/Bijectors.jl).

-#### 3.2 Vectorization Support
-
-
-The vectorization syntax follows `rv ~ [distribution]`, which requires `rand` and `logpdf` to be called on multiple data points at once. An appropriate implementation for `Flat` is shown below.
-
-
-```julia
-Distributions.logpdf(d::Flat, x::AbstractVector{<:Real}) = zero(x)
-```
-
 ## Update the accumulated log probability in the model definition

 Turing accumulates log probabilities internally in an internal data structure that is accessible through
-the internal variable `_varinfo` inside of the model definition (see below for more details about model internals).
+the internal variable `__varinfo__` inside of the model definition (see below for more details about model internals).
 However, since users should not have to deal with internal data structures, a macro `Turing.@addlogprob!` is provided
 that increases the accumulated log probability. For instance, this allows you to
 [include arbitrary terms in the likelihood](https://github.com/TuringLang/Turing.jl/issues/1332)
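As a hedged illustration of the macro described above (a sketch, not part of this diff), a model that adds a custom likelihood term with `Turing.@addlogprob!` looks like:

```julia
using Turing

@model function demo(x)
    μ ~ Normal()
    # Add an arbitrary term to the accumulated log probability;
    # here, the log likelihood of the observations under Normal(μ, 1).
    Turing.@addlogprob! loglikelihood(Normal(μ, 1), x)
end

model = demo([0.0, 1.0])
```

The term is added unconditionally on every model evaluation, which is exactly why the context check discussed next matters.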
@@ -123,19 +115,21 @@ end
 Note that `@addlogprob!` always increases the accumulated log probability, regardless of the provided
 sampling context. For instance, if you do not want to apply `Turing.@addlogprob!` when evaluating the
 prior of your model but only when computing the log likelihood and the log joint probability, then you
-should [check the type of the internal variable `_context`](https://github.com/TuringLang/DynamicPPL.jl/issues/154)
+should [check the type of the internal variable `__context__`](https://github.com/TuringLang/DynamicPPL.jl/issues/154)
 such as

 ```julia
-if !isa(_context, Turing.PriorContext)
+if DynamicPPL.leafcontext(__context__) !== Turing.PriorContext()
     Turing.@addlogprob! myloglikelihood(x, μ)
 end
 ```

 ## Model Internals


-The `@model` macro accepts a function definition and rewrites it such that call of the function generates a `Model` struct for use by the sampler. Models can be constructed by hand without the use of a macro. Taking the `gdemo` model as an example, the macro-based definition
+The `@model` macro accepts a function definition and rewrites it such that call of the function generates a `Model` struct for use by the sampler.
+Models can be constructed by hand without the use of a macro.
+Taking the `gdemo` model as an example, the macro-based definition

 ```julia
 using Turing
@@ -152,41 +146,36 @@ end
 model = gdemo([1.5, 2.0])
 ```

-is equivalent to the macro-free version
+can be implemented also (a bit less generally) with the macro-free version

 ```julia
 using Turing

 # Create the model function.
-function modelf(rng, model, varinfo, sampler, context, x)
-    # Assume s has an InverseGamma distribution.
-    s² = Turing.DynamicPPL.tilde_assume(
-        rng,
+function gdemo(model, varinfo, context, x)
+    # Assume s² has an InverseGamma distribution.
+    s², varinfo = DynamicPPL.tilde_assume!!(
         context,
-        sampler,
         InverseGamma(2, 3),
-        Turing.@varname(s),
-        (),
+        Turing.@varname(s²),
         varinfo,
     )

     # Assume m has a Normal distribution.
-    m = Turing.DynamicPPL.tilde_assume(
-        rng,
+    m, varinfo = DynamicPPL.tilde_assume!!(
         context,
-        sampler,
         Normal(0, sqrt(s²)),
         Turing.@varname(m),
-        (),
         varinfo,
     )

     # Observe each value of x[i] according to a Normal distribution.
-    Turing.DynamicPPL.dot_tilde_observe(context, sampler, Normal(m, sqrt(s²)), x, varinfo)
+    DynamicPPL.dot_tilde_observe!!(context, Normal(m, sqrt(s²)), x, Turing.@varname(x), varinfo)
 end
+gdemo(x) = Turing.Model(:gdemo, gdemo, (; x))

 # Instantiate a Model object with our data variables.
-model = Turing.Model(modelf, (x = [1.5, 2.0],))
+model = gdemo([1.5, 2.0])
 ```

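One Julia detail in the new constructor line `gdemo(x) = Turing.Model(:gdemo, gdemo, (; x))` (a side note, not part of the commit): `(; x)` is shorthand for a one-field `NamedTuple` whose field name is taken from the variable, equivalent to the `(x = [1.5, 2.0],)` form used in the removed line. A minimal sketch in plain Julia:

```julia
# `(; x)` builds a NamedTuple with a field named `x` holding the value of `x`.
x = [1.5, 2.0]
nt = (; x)  # same as (x = [1.5, 2.0],)
```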
## Task Copying
