Support Cache Class for Even Newer Versions of Transformers Library #1343
Conversation
…#1341)

Summary: Pull Request resolved: pytorch#1341. Fixes D62210529 (now reverted by D62262760).

The transformers library is now an optional dependency: we do not depend on it directly, but we do have some logic for `transformers` models here. The library is only imported if the model's environment already provides it. This TARGETS configuration prevents transformers version conflicts, which e.g. caused T200877742.

Adds support for the new transformers Cache objects. This may need changes in the future, since LLMs appear to handle caching differently: some handle caching themselves, some do not, and some do not support Cache objects yet. Llama models expose a `_supports_cache_class` flag indicating whether the new Cache object is supported. If a model is not marked as supporting it, we assume it takes the legacy format (a tuple of past values). Multiple checks were added to ensure compatibility.

(minor) Also changed the defaults for LLM generation to dismiss warnings (this does not change generation behavior).

Differential Revision: D62408520
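As a rough illustration of the compatibility logic this describes, here is a minimal sketch. The helper name is hypothetical; `_supports_cache_class` and `DynamicCache.from_legacy_cache` are the transformers attributes the summary refers to, and the sketch assumes a transformers version recent enough to provide `DynamicCache`:

```python
from typing import Any


def prepare_past_key_values(model: Any, past_key_values: Any) -> Any:
    """Return the cache in whichever format the given model supports."""
    # Llama-style models advertise support for the new Cache objects
    # via the `_supports_cache_class` flag mentioned above.
    if getattr(model, "_supports_cache_class", False):
        # Imported lazily: transformers is an optional dependency.
        from transformers import DynamicCache

        if isinstance(past_key_values, tuple):
            # Convert the legacy tuple of past values into a Cache object.
            return DynamicCache.from_legacy_cache(past_key_values)
    # Model is not marked as supporting Cache objects: assume it takes
    # the legacy format unchanged.
    return past_key_values
```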
This pull request was exported from Phabricator. Differential Revision: D62468332
…ytorch#1343)

Summary: Pull Request resolved: pytorch#1343. Supports multiple and newer versions of the transformers library. Also adds the `packaging` dependency to check package versions more robustly.

Differential Revision: D62468332
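For illustration, a minimal sketch of the kind of version check that `packaging` enables. The helper name and the 4.36.0 cutoff are assumptions (4.36.0 is the release that introduced the Cache classes); the actual cutoff used by the PR may differ:

```python
from packaging import version


def transformers_supports_cache_class() -> bool:
    """Check whether the installed transformers exposes the Cache classes."""
    try:
        import transformers
    except ImportError:
        # transformers stays optional; its absence is not an error.
        return False
    # 4.36.0 is an assumed threshold, not necessarily the PR's actual value.
    return version.parse(transformers.__version__) >= version.parse("4.36.0")
```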
This pull request has been merged in 5839d52.