
Memory concerns with large l_max values #670

@kenteross

Description


The Issue:

I'm running a PyMC model that applies a logistic saturation transformation followed by a delayed adstock transformation. I use a burn-in period of 30 days, and the issue appears when I add code to slice the burn-in period out of the transformed series. After adding the slice, the model runs out of RAM even with very few samples. Reversing the order of the transformations (i.e., adstock before saturation) seems to work without running into RAM issues.

Reproducible code example
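The original snippet was not captured here, so below is a minimal NumPy sketch of the reported setup (saturation, then adstock, then slicing off the burn-in). The function names, parameter values, and formulas are my assumptions loosely mirroring pymc-marketing's transformers, not the author's actual code:

```python
import numpy as np


def logistic_saturation(x, lam=1.0):
    # Logistic saturation: maps spend to [0, 1) with diminishing returns.
    return (1 - np.exp(-lam * x)) / (1 + np.exp(-lam * x))


def delayed_adstock(x, alpha=0.5, theta=2, l_max=12):
    # Delayed adstock: convolve x with a delayed-geometric kernel
    # of length l_max, peaking theta steps after the spend.
    w = alpha ** ((np.arange(l_max) - theta) ** 2)
    w = w / w.sum()
    return np.convolve(x, w)[: len(x)]


burn_in = 30
spend = np.random.default_rng(0).uniform(0, 1, size=120)

# Saturation first, then adstock, then slice out the burn-in period —
# the ordering the issue reports as running out of RAM in the PyMC graph.
y = delayed_adstock(logistic_saturation(spend, lam=2.0), alpha=0.6, theta=3, l_max=12)
y_sliced = y[burn_in:]
```

In plain NumPy this is cheap either way; the reported blow-up happens only when the same ordering is expressed as a PyTensor graph inside a PyMC model.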

Error Message:

KernelInterrupted: Execution interrupted by the Jupyter kernel.

PyMC version information:

The version of PyMC I'm using is 5.13.1.

Context for the issue:

I think the issue is coming from the convolution in the adstock transformation, but I'm not sure why my code works when I do the adstock transformation before the saturation transformation.
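One way the convolution can dominate memory: an adstock of length l_max can be viewed (and is often implemented) as stacking l_max shifted copies of the input before taking a weighted sum, so intermediate storage grows roughly linearly with l_max. This NumPy sketch is my illustration of that scaling, not the library's actual implementation:

```python
import numpy as np


def adstock_design_matrix(x, l_max):
    # View the adstock convolution as l_max shifted copies of x;
    # the intermediate array is l_max times the size of the input.
    n = len(x)
    shifted = np.zeros((l_max, n))
    for lag in range(l_max):
        shifted[lag, lag:] = x[: n - lag]
    return shifted


x = np.ones(10_000)
small = adstock_design_matrix(x, l_max=10)
large = adstock_design_matrix(x, l_max=100)
# Intermediate memory scales linearly with l_max:
# large uses 10x the bytes of small.
```

If PyTensor keeps such an intermediate alive (e.g. because a later slice prevents it from fusing or rewriting the graph), a modest-looking l_max can translate into a large allocation per sample.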
