
Initial version of async attribution with torch.futures #1295


Closed · wants to merge 1 commit

Conversation

@yucu (Contributor) commented Jun 6, 2024

Differential Revision: D56764316

@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D56764316


yucu added a commit to yucu/captum that referenced this pull request Jun 11, 2024
Summary: Pull Request resolved: pytorch#1295

Differential Revision: D56764316
@yucu force-pushed the export-D56764316 branch from bd7fe5d to 1158171 on June 11, 2024 00:02
@yucu force-pushed the export-D56764316 branch from 1158171 to 6833703 on June 11, 2024 05:46
@yucu force-pushed the export-D56764316 branch from 6833703 to f33b04f on June 11, 2024 06:17
@yucu force-pushed the export-D56764316 branch from f33b04f to 3ca02ff on June 11, 2024 07:14
@yucu force-pushed the export-D56764316 branch from 3ca02ff to c284661 on June 11, 2024 19:07
@yucu force-pushed the export-D56764316 branch from c284661 to a3d5bc9 on June 11, 2024 20:40

yucu added a commit to yucu/captum that referenced this pull request Jun 12, 2024
Summary:
Pull Request resolved: pytorch#1295

Captum does not currently support asynchronous forward functions. The Ads R&P team would like this feature so they can replace their custom variant (D56655643) of Feature Ablation with Captum while maintaining similar performance.

TODO: Extend FeatureAttributor to support `torch.futures`

Differential Revision: D56764316
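For context, an "async forward function" in this sense is one that returns a `torch.futures.Future` instead of a plain Tensor. A minimal sketch of such a function (the forward logic here is a hypothetical placeholder, not code from this PR):

```python
import torch
from torch.futures import Future


def async_forward(inputs: torch.Tensor) -> Future:
    # Hypothetical async forward function: wraps a synchronous
    # computation in a Future, standing in for e.g. a remote/RPC
    # model evaluation that completes later.
    fut: Future = Future()
    fut.set_result(inputs.sum(dim=1))  # placeholder "model" output
    return fut


out = async_forward(torch.ones(2, 3))
print(out.wait())  # tensor([3., 3.])
```

Callers obtain the result via `Future.wait()` (or chain work with `Future.then()`), which is what an attribution method must accommodate instead of consuming a Tensor directly.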
@yucu force-pushed the export-D56764316 branch from a3d5bc9 to 6eced7a on June 12, 2024 21:15
@yucu force-pushed the export-D56764316 branch from 6eced7a to fd84663 on June 13, 2024 22:19
@yucu force-pushed the export-D56764316 branch from fd84663 to a2aacd0 on June 14, 2024 06:54
@yucu force-pushed the export-D56764316 branch from a2aacd0 to 2a4e173 on June 14, 2024 07:01

@yucu force-pushed the export-D56764316 branch from 2a4e173 to 34d8bff on June 29, 2024 00:33

yucu added a commit to yucu/captum that referenced this pull request Jun 29, 2024
Summary:
Pull Request resolved: pytorch#1295

Captum does not currently support asynchronous forward functions. The Ads R&P team would like this feature so they can replace their custom variant (D56655643) of Feature Ablation with Captum while maintaining similar performance.

PyTorch provides a Future abstraction ([link](https://pytorch.org/docs/stable/futures.html)), so we can adopt it in feature_ablation.py as a first step.

Details:
- The initial evaluation returns a future; save it.
- Each evaluation of each feature for each input returns an attribution result (plus a corresponding weight, if applicable). Save all of these results separately, since futures cannot be added up directly.
- When all of the futures above are done, add up the evaluation results into the final outcome, one Tensor per input.
- Since common._run_forward is used by other attribution methods, some type hacking is needed there. If users attempt to use those methods asynchronously before Captum supports async for them, they will fail.

TODO: Extend FeatureAttributor to support `torch.futures`

Differential Revision: D56764316
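The steps in the Details above can be sketched with `torch.futures` primitives. This is a simplified illustration of the collect-then-sum pattern, not the PR's actual FeatureAblation code; the per-feature evaluation function is a hypothetical stand-in:

```python
import torch
from torch.futures import Future, collect_all


def eval_feature(x: torch.Tensor, feature_idx: int) -> Future:
    # Hypothetical per-feature evaluation: ablate (zero out) one
    # feature and return the resulting "model" output as a Future.
    ablated = x.clone()
    ablated[:, feature_idx] = 0
    fut: Future = Future()
    fut.set_result(ablated.sum(dim=1))
    return fut


x = torch.ones(2, 3)

# One future per feature; the results are kept separate because
# Futures cannot be summed directly.
futs = [eval_feature(x, i) for i in range(x.shape[1])]

# Once every future is done, combine the per-feature results into a
# single Tensor per input.
combined = collect_all(futs).then(
    lambda done: torch.stack([f.wait() for f in done.wait()]).sum(dim=0)
)
print(combined.wait())  # tensor([6., 6.])
```

`collect_all` yields a Future that completes when all input futures do, which mirrors the "save results separately, then add them up when everything is done" flow described above.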
@yucu force-pushed the export-D56764316 branch from 34d8bff to 1507ca4 on June 29, 2024 02:14

@yucu force-pushed the export-D56764316 branch from 1507ca4 to 4dc5de4 on July 1, 2024 17:10
@yucu force-pushed the export-D56764316 branch from 4dc5de4 to dfd889e on July 1, 2024 19:03

@yucu force-pushed the export-D56764316 branch from dfd889e to 7e0b198 on July 1, 2024 19:22

@yucu force-pushed the export-D56764316 branch from 7e0b198 to 66c71f7 on July 1, 2024 19:38

@facebook-github-bot (Contributor)

This pull request has been merged in 3543414.
