
Fix deprecated torch.cuda.amp.GradScaler usage in trainer #3682


Open
hsleonis wants to merge 2 commits into master

Conversation

@hsleonis commented on Aug 6, 2025

Replace the deprecated torch.cuda.amp.GradScaler with the new torch.amp.GradScaler API to resolve the FutureWarning raised by PyTorch 2.4+.

Changes:
  • Updated trainer.py line 545 to use torch.amp.GradScaler('cuda', ...) instead of torch.cuda.amp.GradScaler(...) (illustrated in the sketch below)
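
For illustration, a minimal sketch of the migration. The surrounding context in trainer.py (variable names, any extra keyword arguments such as the enabled flag) is an assumption here, not copied from the Flair source:

```python
import torch

# Old call (deprecated since PyTorch 2.4, emits the FutureWarning quoted below):
# scaler = torch.cuda.amp.GradScaler(enabled=use_amp)

# New call: the target device is passed as the first argument.
# `use_amp` is a placeholder flag, not necessarily the variable used in trainer.py.
use_amp = torch.cuda.is_available()
scaler = torch.amp.GradScaler("cuda", enabled=use_amp)
```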

Issue:
Fixes the following deprecation warning:

FutureWarning: `torch.cuda.amp.GradScaler(args...)` is deprecated. 
Please use `torch.amp.GradScaler('cuda', args...)` instead.

Compatibility:

  • Backward compatible with PyTorch releases that already expose the device-agnostic torch.amp.GradScaler; older releases only provide torch.cuda.amp.GradScaler
  • No functional changes - purely API migration
  • Tested with existing training workflows

This fixes the deprecation warning in Flair's trainer code. The underlying issue is that Flair uses the old PyTorch AMP (Automatic Mixed Precision) API.
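
For context, a minimal, self-contained sketch (placeholder model, optimizer, and data, not Flair's actual training loop) showing that everything around the scaler stays the same after the migration; only the constructor call differs from the old torch.cuda.amp.GradScaler usage:

```python
import torch
import torch.nn as nn

# Placeholder setup for illustration only.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Linear(16, 2).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# New constructor: device string first, remaining arguments unchanged.
scaler = torch.amp.GradScaler(device, enabled=(device == "cuda"))

inputs = torch.randn(8, 16, device=device)
targets = torch.randint(0, 2, (8,), device=device)

with torch.autocast(device_type=device, enabled=(device == "cuda")):
    loss = nn.functional.cross_entropy(model(inputs), targets)

# The scale/step/update calls are identical to the old torch.cuda.amp.GradScaler usage.
scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()
```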