
[Feature Request] Make torchopt.optim.Optimizer compatible with pytorch lightning #203

@SamDuffield

Description



Motivation

Currently, the torchopt.optim classes aren't compatible with Lightning's configure_optimizers.

This is because Lightning doesn't recognize them as Optimizable:

import torch
import torchopt
from lightning.fabric.utilities.types import Optimizable

model = torch.nn.Linear(2, 1)  # any nn.Module; defined here so the snippet runs standalone
optimizer = torchopt.Adam(model.parameters(), lr=1e-3)

isinstance(optimizer, Optimizable)
# False

For an optimizer to count as Optimizable, it must expose defaults and state attributes.
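
Optimizable is a runtime-checkable Protocol, so the isinstance check only verifies that the required attributes and methods are present. An illustrative analogue of how such a check behaves (not Lightning's actual definition):

from typing import Protocol, runtime_checkable

# Illustration only: isinstance() against a runtime-checkable Protocol
# passes for any object that simply has the listed attributes.
@runtime_checkable
class HasOptimizerAttrs(Protocol):
    defaults: dict
    state: dict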

If you simply do

from collections import defaultdict

optimizer.defaults = {}
optimizer.state = defaultdict(dict)  # defaultdict(dict) mirrors torch.optim.Optimizer's state

then isinstance(optimizer, Optimizable) passes and torchopt <> Lightning works like a charm 😍
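
For reference, here is a minimal end-to-end sketch of the workaround inside configure_optimizers (LitModel, the layer size, and the loss are made up for illustration):

import torch
import torchopt
from collections import defaultdict
import lightning as L

class LitModel(L.LightningModule):
    def __init__(self):
        super().__init__()
        self.model = torch.nn.Linear(2, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.model(x), y)

    def configure_optimizers(self):
        optimizer = torchopt.Adam(self.model.parameters(), lr=1e-3)
        # Workaround: add the attributes Lightning's Optimizable check expects.
        optimizer.defaults = {}
        optimizer.state = defaultdict(dict)
        return optimizer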

Solution

Can we add defaults and state attributes to the torchopt.optim.Optimizer class?
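
A rough sketch of what that could look like, assuming nothing about the existing constructor (the class name is made up, and the empty defaults is a placeholder that torchopt might instead populate with the optimizer's hyperparameters):

from collections import defaultdict

import torchopt

class LightningCompatibleOptimizer(torchopt.optim.Optimizer):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Set at construction time so isinstance(self, Optimizable) passes.
        self.defaults = {}              # placeholder; could hold hyperparameters
        self.state = defaultdict(dict)  # matches torch.optim.Optimizer's state type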

Alternatives

No response

Additional context

No response
