Add oobabooga/text-generation-webui support as a llm #5997
Conversation
hwchase17 left a comment:
let's add an example notebook as well
Thanks for the review. I've fixed the Black formatting and added an example notebook.

Can a maintainer kick off the workflows? I believe I've fixed the previous errors.
@hwchase17, can you help kick off the workflows for this PR? I believe I've fixed all of the errors.
hwchase17 left a comment:
looks reasonable, thanks!
Add oobabooga/text-generation-webui support as an LLM. Currently supports text-generation-webui's non-streaming API interface. This allows users who already have text-generation-webui running to use the same models with LangChain.
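For reference, the non-streaming interface is an HTTP endpoint served by text-generation-webui's API extension. The following is a rough sketch of the kind of request such a wrapper would make; the endpoint path, port, and payload fields are assumptions for illustration and are not taken from this PR:

```python
import requests

# Assumed base URL of a locally running text-generation-webui instance
# with its API extension enabled; the path and field names below are
# assumptions for illustration, not confirmed by this PR.
MODEL_URL = "http://localhost:5000"

response = requests.post(
    f"{MODEL_URL}/api/v1/generate",
    json={
        "prompt": "Tell me a joke about tokenizers.",
        "max_new_tokens": 128,
        "temperature": 0.7,
    },
    timeout=60,
)
response.raise_for_status()
# The generated completion is returned in the response body.
print(response.json()["results"][0]["text"])
```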
Before submitting
Simple usage, similar to other supported LLMs:
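A minimal sketch, assuming the wrapper is exposed as `TextGen` and takes a `model_url` parameter pointing at a running text-generation-webui instance (both names are assumptions here, not confirmed by this excerpt):

```python
from langchain.llms import TextGen

# Point the wrapper at a locally running text-generation-webui server.
# `TextGen` and `model_url` are assumed names for illustration.
llm = TextGen(model_url="http://localhost:5000")

# Use it like any other LangChain LLM: pass a prompt, get a completion.
print(llm("Write a two-sentence summary of what an LLM is."))
```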
Who can review?
@hwchase17 - project lead