Refactor model URL constants #430
Conversation
This is what I meant about changing the version tag. Right now we have:
const VERSION_TAG = 'resolve/v0.4.0';
const NEXT_DEV_VERSION_TAG = 'resolve/v0.5.0';
And every time a new version is released, we need to change the version of
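One possible direction (a sketch only, not what this PR implements) would be to derive the tag from the library's own version instead of hardcoding it. This assumes the Hugging Face tag always matches the package.json version and that JSON imports are enabled in the TypeScript config:

```ts
// versionTag.ts — hypothetical helper; assumes the tag format stays `resolve/v<version>`
// and that package.json's version matches the released Hugging Face tag.
import { version } from '../package.json';

export const VERSION_TAG = `resolve/v${version}`;
```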
This is a good suggestion; we should probably complete it alongside #359.
@pweglik I don't know the list of models from #359 that require such presets. If you could send me some instructions, I can implement this, e.g. a preset for an LLM would have a model, tokenizer, and config, while a text embeddings preset would have a model, tokenizer, and mean pooling parameter. Otherwise, feel free to contribute to this PR.
You can try to match them by name; that should be fairly straightforward. For speech-to-text it is already defined, so that's that. You can add a mean pooling flag alongside that. Regarding how to do it, I think a simple JSON covering all the models on our Hugging Face would be fine as a starting point. I am not sure how to generalize it beyond our models.
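As a rough illustration of the kind of preset map discussed above, here is a sketch keyed by model name. All field names, preset keys, and URLs below are placeholders, not the project's actual API:

```ts
// modelPresets.ts — hypothetical preset map; names and URLs are placeholders.
interface ModelPreset {
  modelSource: string;
  tokenizerSource?: string;
  tokenizerConfigSource?: string;
  meanPooling?: boolean; // only relevant for text-embedding models
}

export const PRESETS: Record<string, ModelPreset> = {
  'llama-3.2-1b-qlora': {
    modelSource: 'https://huggingface.co/<org>/<repo>/resolve/v0.4.0/llama-3.2-1b-qlora.pte',
    tokenizerSource: 'https://huggingface.co/<org>/<repo>/resolve/v0.4.0/tokenizer.json',
    tokenizerConfigSource: 'https://huggingface.co/<org>/<repo>/resolve/v0.4.0/tokenizer_config.json',
  },
  'all-minilm-l6-v2': {
    modelSource: 'https://huggingface.co/<org>/<repo>/resolve/v0.4.0/all-minilm-l6-v2.pte',
    tokenizerSource: 'https://huggingface.co/<org>/<repo>/resolve/v0.4.0/tokenizer.json',
    meanPooling: true,
  },
};
```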
Co-authored-by: Norbert Klockiewicz <[email protected]>
Currently, the whole codebase must be revised because there are places with the following usage:
await ImageSegmentationModule.load(DEEPLAB_V3_RESNET50);
and this one:
const model = useImageSegmentation({
  modelSource: DEEPLAB_V3_RESNET50,
});
Also, onDownloadProgressCallback is used in a way that we currently do not support in this issue:
await LLMModule.load({
  modelSource: LLAMA3_2_1B_QLORA,
  tokenizerSource: LLAMA3_2_TOKENIZER,
  tokenizerConfigSource: LLAMA3_2_TOKENIZER_CONFIG,
  onDownloadProgressCallback: printDownloadProgress,
});
Ideally, we need to change every single usage in the codebase to use the presets that are exposed to end users.
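For reference, a rough sketch of what preset-based calls might look like after the refactor. The preset names and the exact API shape below are assumptions, not the final design:

```ts
// Hypothetical preset-based usage; actual preset names and API may differ.
const model = useImageSegmentation({
  model: DEEPLAB_V3_RESNET50_PRESET, // bundles modelSource and any related assets
});

await LLMModule.load({
  ...LLAMA3_2_1B_QLORA_PRESET, // expands to modelSource, tokenizerSource, tokenizerConfigSource
  onDownloadProgressCallback: printDownloadProgress,
});
```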
@NorbertKlockiewicz, do you have any idea how to deal with this?
Regarding this:
I think we can simply change it to:
Regarding the other issue - yeah, I think we need to go through every place in the docs and example apps and change it.
Description
Same as #382, but I no longer have permission to push directly to a separate branch on the repo, so I'm changing this to a PR from a fork.