
Conversation

@CharlelieLrt CharlelieLrt commented Dec 5, 2025

PhysicsNeMo Pull Request

Description

Restructure Diffusion Subpackage

This PR establishes the foundational structure for the physicsnemo.diffusion subpackage, organizing diffusion-related functionality into a clean, layered architecture.

Summary

  • Created physicsnemo.diffusion package with dedicated submodules:

    • generate/ - Higher-level generation utilities and pipelines
    • samplers/ - Samplers based on ODE and SDE solvers
    • metrics/ - Loss functions and diffusion-specific metrics (FID, etc.)
    • preconditioners/ - Preconditioning model wrappers
    • multi_diffusion/ - Patch-based diffusion utilities
    • denoisers/ - Placeholder for future denoiser implementations; to be used in samplers
    • noise_schedulers/ - Placeholder for future noise scheduler implementations
    • utils/ - Shared utilities
  • Added a warning system for incremental migration (a sketch follows this list):

    • FutureFeatureWarning for placeholder modules that are not yet implemented
    • LegacyFeatureWarning for existing code that will be deprecated in future releases
  • Reorganized UNet models:

    • Moved SongUNet models and application-specific UNet wrappers to physicsnemo.models.diffusion_unets
  • Extracted reusable layers into physicsnemo.nn, including for example (a sketch of the FP32 GroupNorm variant follows this list):

    • group_norm.py - GroupNorm variants with optional FP32 conversion
    • unet_layers.py - UNet encoder/decoder blocks and attention blocks
    • embedding_layers.py - Positional and Fourier embeddings
    • ...and others
  • Cleaned up legacy code:

    • Deleted unused utility functions from old physicsnemo.models.diffusion/
    • Moved example-specific utilities directly into examples/ directory
  • Added import-linter contracts for the diffusion package (an illustrative contract follows this list):

    • Enforces the layered dependency structure: generate → samplers/metrics → preconditioners/multi_diffusion/denoisers/noise_schedulers → utils
    • Prevents circular imports and maintains a clean architecture
  • Updated all examples and tests to use the new package structure
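
A minimal sketch of what the migration warnings described above could look like; the class names come from this PR, but the bodies and the helper below are assumptions, not the actual implementation:

import warnings

class FutureFeatureWarning(Warning):
    """Emitted when importing a placeholder module that is not yet implemented."""

class LegacyFeatureWarning(Warning):
    """Emitted by existing code that will be deprecated in a future release."""

def warn_legacy(old_path: str, new_path: str) -> None:
    # Hypothetical helper: point users of a legacy import at its new home.
    warnings.warn(
        f"{old_path} is scheduled for deprecation; use {new_path} instead.",
        LegacyFeatureWarning,
        stacklevel=2,
    )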
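
Similarly, a sketch of the kind of "GroupNorm with optional FP32 conversion" that group_norm.py provides; the class name, the force_fp32 flag, and the implementation here are assumptions:

import torch

class GroupNormFP32(torch.nn.GroupNorm):
    # Hypothetical variant: optionally run the normalization in FP32
    # (e.g., for numerical stability under autocast), then cast back.
    def __init__(self, num_groups: int, num_channels: int, force_fp32: bool = True, **kwargs):
        super().__init__(num_groups, num_channels, **kwargs)
        self.force_fp32 = force_fp32

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.force_fp32 and x.dtype != torch.float32:
            return super().forward(x.float()).to(x.dtype)
        return super().forward(x)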
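
And an illustrative import-linter layers contract matching the structure above; the exact file layout and syntax are assumptions (pipe-separated siblings in one layer require a reasonably recent import-linter version), shown here in pyproject.toml style:

[tool.importlinter]
root_package = "physicsnemo"

[[tool.importlinter.contracts]]
name = "Diffusion layered architecture"
type = "layers"
layers = [
    "physicsnemo.diffusion.generate",
    "physicsnemo.diffusion.samplers | physicsnemo.diffusion.metrics",
    "physicsnemo.diffusion.preconditioners | physicsnemo.diffusion.multi_diffusion | physicsnemo.diffusion.denoisers | physicsnemo.diffusion.noise_schedulers",
    "physicsnemo.diffusion.utils",
]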

  • I am familiar with the Contributing Guidelines.

  • New or existing tests cover these changes.

  • The documentation is up to date with these changes.

  • The CHANGELOG.md is up to date with these changes.

  • An issue is linked to this pull request.

Dependencies

Review Process

All PRs are reviewed by the PhysicsNeMo team before merging.

Depending on which files are changed, GitHub may automatically assign a maintainer for review.

We are also testing AI-based code review tools (e.g., Greptile), which may add automated comments with a confidence score.
This score reflects the AI's assessment of merge readiness; it is not a qualitative judgment of your work, nor an indication that the PR will be accepted or rejected.

AI-generated feedback should be reviewed critically for usefulness.
You are not required to respond to every AI comment, but they are intended to help both authors and reviewers.
Please react to Greptile comments with 👍 or 👎 to provide feedback on their accuracy.

@CharlelieLrt CharlelieLrt added 3 - Ready for Review Ready for review by team and removed 5 - DO NOT MERGE Hold off on merging; see PR for details 5 - Merge After Dependencies Depends on another PR: do not merge out of order labels Dec 19, 2025
@CharlelieLrt CharlelieLrt changed the title [DRAFT-DO NOT REVIEW] Restructure diffusion subpackage Restructure diffusion subpackage Dec 19, 2025
Signed-off-by: Charlelie Laurent <[email protected]>
# from .YParams import YParams
# from .dataset import Era5Dataset, CWBDataset, CWBERA5DatasetV2, ZarrDataset

# ----------------------------------------------------------------------------

Collaborator

LOL ☝️

Collaborator

Did this somehow get removed from Corey's v2.0 PR, and are you reintroducing it here?

Collaborator Author

The git merge was an absolute mess, so it's very possible I made a mistake...
But after merging with Corey's v2.0 PR, what I got here was an empty file with an unused import os. I assume Corey just forgot to remove it? So I deleted the unused import.
@coreyjadams for visibility

Collaborator

Well, I don't see a physicsnemo.compat in the main branch at the moment, so I think it was somehow dropped from the big v2.0 PR?

Collaborator

Yeah, my rebase onto main after the merge was also a disaster. I think squashing commits makes it impossible for git to actually merge properly, and we had too many squashed commits.

I had removed compat from v2.0 - I thought it was breaking doc tests (I no longer think this), but I also thought, "is this really useful?". Anyway, it sounds like there is more will to reintroduce it, so I won't stand in the way.

Collaborator

I think it's definitely useful, as it'll be another tool in our belt for helping people migrate.

Collaborator

OK, I'll bring it back in a standalone PR. @CharlelieLrt is innocent in this one!

class Attention(torch.nn.Module):
    """
    Self-attention block used in U-Net-style architectures, such as DDPM++, NCSN++, and ADM.
    Applies GroupNorm followed by multi-head self-attention and a projection layer.
    """

Collaborator

I don't love having the module named nn.attention_layers.Attention be this relatively model-specific U-Net version -- seems like such a module should be relatively general. Thoughts on renaming it?
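
For context, a minimal hypothetical sketch of the pattern that docstring describes (GroupNorm, then multi-head self-attention over the spatial positions, then a projection), under a more general name; all shapes and hyperparameters here are assumptions, not the actual PhysicsNeMo implementation:

import torch

class SpatialSelfAttention(torch.nn.Module):
    # Hypothetical sketch: GroupNorm -> multi-head self-attention -> projection,
    # with a residual connection. num_groups=32 assumes channels divisible by 32.
    def __init__(self, channels: int, num_heads: int = 1):
        super().__init__()
        self.norm = torch.nn.GroupNorm(32, channels)
        self.attn = torch.nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.proj = torch.nn.Linear(channels, channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W) feature map; attention runs over the H*W positions.
        n, c, h, w = x.shape
        y = self.norm(x).flatten(2).transpose(1, 2)  # (N, H*W, C)
        y, _ = self.attn(y, y, y, need_weights=False)
        y = self.proj(y).transpose(1, 2).reshape(n, c, h, w)
        return x + y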

I still have mixed feelings about the "spell out any number less than 10" rule, but it is the style guide rule that I am contractually required to apply. The only way around it is for the number to be in code font...
I want to init-cap all the "utils" in the headings... but they could be considered a code thing, so I am leaving them.
@megnvidia
Collaborator

I had small changes to the two .rst files. If those merge cleanly, I approve.
