
Conversation

samanklesaria
Collaborator

Now that the deprecated functionality has been removed, we can lift the restrictions applied when running the test suite during CI. This PR depends on the functionality in the branch pseudo_main.


pytorch-bot bot commented Aug 20, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/audio/4061

Note: Links to docs will display an error until the docs builds have been completed.

❗ 1 Active SEV

There is 1 currently active SEV. If your PR is affected, please view it below:

❌ 8 New Failures

As of commit 88515c1 with merge base 7d4c0bf:

NEW FAILURES - The following jobs have failed:

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed label Aug 20, 2025
@samanklesaria
Collaborator Author

samanklesaria commented Aug 21, 2025

The TorchScript consistency tests don't seem to work with manual autograd in Python (torch.autograd.Function overloads). Pretty much all of the TorchScript consistency checks have this issue. The only ways forward are to either:
A) Investigate further whether it's possible to get torch.autograd.Function working with TorchScript, or
B) Delete the TorchScript consistency checks.
As TorchScript is essentially deprecated anyway, I propose option (B).
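
For context, "manual autograd in Python" means subclassing torch.autograd.Function with static forward/backward overloads. Below is a minimal, self-contained sketch (a toy Exp function, not part of torchaudio) of the kind of code at issue; it runs fine in eager mode, but torch.jit.script generally cannot compile calls to its .apply method:

```python
import torch

class Exp(torch.autograd.Function):
    """Manual autograd: forward computes exp(x) and saves the output,
    which backward reuses, since d/dx exp(x) = exp(x)."""

    @staticmethod
    def forward(ctx, x):
        y = torch.exp(x)
        ctx.save_for_backward(y)
        return y

    @staticmethod
    def backward(ctx, grad_out):
        (y,) = ctx.saved_tensors
        return grad_out * y

x = torch.tensor([0.0, 1.0], requires_grad=True)
y = Exp.apply(x)
y.sum().backward()
# The gradient of exp is exp itself.
assert torch.allclose(x.grad, torch.exp(x.detach()))
```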

@pearu
Collaborator

pearu commented Aug 28, 2025

It is possible to fix the TorchScript consistency tests by avoiding autograd functions under jit-scripting. For instance, the rnnt_loss tests pass when applying the following patch:

diff --git a/src/torchaudio/functional/functional.py b/src/torchaudio/functional/functional.py
index 5d4284d5..a72c6091 100644
--- a/src/torchaudio/functional/functional.py
+++ b/src/torchaudio/functional/functional.py
@@ -1821,7 +1821,13 @@ def _rnnt_loss(
     if blank < 0:  # reinterpret blank index if blank < 0.
         blank = logits.shape[-1] + blank
 
-    costs = RnntLoss.apply(
+    if torch.jit.is_scripting():
+        # TorchScript does not support autograd functions
+        func = torch.ops.torchaudio.rnnt_loss_forward
+    else:
+        func = RnntLoss.apply
+
+    costs = func(
         logits,
         targets,
         logit_lengths,
@@ -1831,6 +1837,9 @@ def _rnnt_loss(
         fused_log_softmax
     )
 
+    if torch.jit.is_scripting():
+        costs = costs[0]
+
     if reduction == "mean":
         return costs.mean()
     elif reduction == "sum":

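The dispatch in the patch above can be sketched generically. The following is a self-contained illustration (a toy Square function; the names are hypothetical, not torchaudio APIs) of the documented torch.jit.is_scripting() / @torch.jit.unused pattern: the autograd.Function stays on the eager path, while TorchScript gets an equivalent scriptable fallback:

```python
import torch

class Square(torch.autograd.Function):
    """Toy manual-autograd function: forward is x**2, backward is 2*x*grad."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return 2.0 * x * grad_out

@torch.jit.unused
def _square_eager(x: torch.Tensor) -> torch.Tensor:
    # Eager path: uses the autograd.Function, which TorchScript can't compile.
    return Square.apply(x)

def square(x: torch.Tensor) -> torch.Tensor:
    if torch.jit.is_scripting():
        # Scripted path: plain tensor ops; builtin autograd covers the backward.
        return x * x
    return _square_eager(x)

scripted = torch.jit.script(square)
x = torch.tensor([3.0])
# Eager and scripted paths agree on the forward result.
assert torch.equal(square(x), scripted(x))
```

The torchaudio patch plays the same trick, with torch.ops.torchaudio.rnnt_loss_forward serving as the scriptable fallback for RnntLoss.apply.
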
@samanklesaria samanklesaria changed the base branch from pseudo_main to main September 2, 2025 15:27
@pearu pearu added this to the 2.9 milestone Sep 2, 2025
@pearu pearu left a comment

LGTM! Thanks, @samanklesaria!

@samanklesaria samanklesaria marked this pull request as ready for review September 2, 2025 18:27
@samanklesaria samanklesaria requested a review from a team as a code owner September 2, 2025 18:27
@samanklesaria samanklesaria merged commit 0757bbb into main Sep 2, 2025
42 of 43 checks passed
@samanklesaria samanklesaria deleted the enable_all_tests branch September 2, 2025 18:28
3 participants