Support Cache Class for Even Newer Versions of Transformers Library #1343


Closed
wants to merge 2 commits

Conversation

craymichael
Contributor

Summary: Adds support for multiple and newer versions of the `transformers` library. Also adds the `packaging` dependency for more robust package-version checks.

Differential Revision: D62468332
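The version-gating described in the summary could be sketched roughly as below. This is an illustrative sketch, not Captum's actual implementation: the helper name, the threshold constant, and the chosen cutoff version are assumptions.

```python
# Hypothetical sketch of using `packaging` to gate version-specific
# transformers logic. Names and the cutoff version are illustrative.
from packaging.version import Version

# Assumed threshold: a release that provides the new Cache classes.
_CACHE_API_MIN_VERSION = Version("4.36.0")


def supports_new_cache_api(transformers_version: str) -> bool:
    """Return True if this transformers version is new enough to
    provide the new Cache API (per the assumed threshold above)."""
    return Version(transformers_version) >= _CACHE_API_MIN_VERSION
```

`packaging.version.Version` handles pre-releases and multi-digit components correctly (e.g. `"4.10.0" > "4.9.0"`), which naive string comparison gets wrong; that is the usual motivation for adding `packaging` as a dependency.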

…#1341)

Summary:
Pull Request resolved: pytorch#1341

Fixes D62210529 (now reverted by D62262760). The `transformers` library is now an optional dependency: Captum does not depend on it directly, but it contains some logic specific to `transformers` models. The library is imported only if the model's environment already provides it. This TARGETS configuration prevents `transformers` version conflicts, which e.g. caused T200877742.
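An optional-dependency guard of this kind can be sketched as follows. The function names are hypothetical, not Captum's actual code; the point is that `transformers` is only imported when it is already installed.

```python
# Illustrative sketch of an optional-dependency guard.
import importlib.util


def transformers_available() -> bool:
    """Check whether transformers is installed without importing it."""
    return importlib.util.find_spec("transformers") is not None


def maybe_import_transformers():
    """Import transformers only when the environment provides it;
    otherwise return None so callers can fall back gracefully."""
    if not transformers_available():
        return None
    import transformers

    return transformers
```

Using `importlib.util.find_spec` avoids triggering the (potentially heavy) import just to test for availability.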

Add support for the new transformers Cache objects. This may need changes in the future, as LLMs handle caching differently: some models manage caching themselves, some do not, and some do not support Cache objects yet. Llama models expose a `_supports_cache_class` flag that indicates whether the new Cache object is supported. If the flag is absent or false, we assume the model takes the legacy format (a tuple of past key values). Multiple checks were added to ensure compatibility.

(minor) Also changed the defaults for LLM generation to suppress warnings (does not change generation behavior).

Differential Revision: D62408520
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D62468332


craymichael added a commit to craymichael/captum that referenced this pull request Sep 10, 2024
…ytorch#1343)

Summary:
Pull Request resolved: pytorch#1343

Adds support for multiple and newer versions of the `transformers` library. Also adds the `packaging` dependency for more robust package-version checks.

Differential Revision: D62468332

@facebook-github-bot
Contributor

This pull request has been merged in 5839d52.
