
[Bug] Log transform not applied when computing posterior in heteroskedastic GP model #861

Closed
@btlorch

Description

The heteroskedastic GP model uses two GPs internally: the first models the observation noise, and the second estimates the function value. As far as I understand, the noise model's output lives in the log domain, so that exponentiating it yields a strictly positive noise value.
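
For context, this is roughly how such a model is constructed (an illustrative sketch with made-up toy data, assuming the HeteroskedasticSingleTaskGP API of the BoTorch version listed below):

import torch
from botorch.models import HeteroskedasticSingleTaskGP

# Toy data: per-point observation noise variances alongside the targets.
train_X = torch.rand(20, 1)
train_Y = torch.sin(6.0 * train_X)
train_Yvar = 0.01 + 0.1 * train_X

# Internally this fits a second GP to the noise levels
# (in the log domain, per the description above).
model = HeteroskedasticSingleTaskGP(train_X, train_Y, train_Yvar)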

Therefore, I expected to find something like torch.exp that undoes the log transformation before the noise estimate is combined with the second GP. The only transformation I found, however, comes from the HeteroskedasticNoise noise constraint GreaterThan(1e-4), which applies a softplus transform. For negative inputs (below about -2), softplus closely approximates exp, but for larger inputs the two diverge significantly: softplus grows roughly linearly while exp grows exponentially. As a result, the noise is underestimated.
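
For illustration, a quick numerical comparison of the two transforms:

import torch
import torch.nn.functional as F

x = torch.tensor([-5.0, -2.0, 0.0, 2.0, 5.0])  # candidate log-domain noise values
print(F.softplus(x))  # tensor([0.0067, 0.1269, 0.6931, 2.1269, 5.0067])
print(torch.exp(x))   # tensor([6.7379e-03, 1.3534e-01, 1.0000e+00, 7.3891e+00, 1.4841e+02])

At x = 5, softplus yields about 5 while exp yields about 148, i.e. the noise would be understated by roughly a factor of 30.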

Have I missed the piece of code that maps the noise back from the log domain, or is the softplus in the noise constraint the only transformation applied? If it is the latter, shouldn't the noise constraint use an exp transform instead of softplus?

If this is the case, I would propose changing this line to:

# Proposed: replace the default softplus transform with exp/log so that
# the noise model's log-domain output is mapped back to the linear domain.
heteroskedastic_noise = HeteroskedasticNoise(
    noise_model=noise_model,
    noise_constraint=GreaterThan(
        MIN_INFERRED_NOISE_LEVEL, transform=torch.exp, inv_transform=torch.log
    ),
)
likelihood = _GaussianLikelihoodBase(heteroskedastic_noise)
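
For comparison, the effect of the two constraints can be checked directly (a small sketch; as far as I can tell, GreaterThan applies its transform and then adds the lower bound):

import torch
from gpytorch.constraints import GreaterThan

raw = torch.tensor([-2.0, 0.0, 2.0])  # example log-domain noise outputs
c_default = GreaterThan(1e-4)  # default softplus transform
c_exp = GreaterThan(1e-4, transform=torch.exp, inv_transform=torch.log)
print(c_default.transform(raw))  # roughly softplus(raw) + 1e-4
print(c_exp.transform(raw))      # roughly exp(raw) + 1e-4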

System Info

  • BoTorch Version 0.4.0
  • GPyTorch Version 1.4.2
  • PyTorch Version 1.7.1
  • Ubuntu 20.04.2 LTS
