
Conversation

@janpawellek
Contributor

LLM configurations can be loaded from a Python dict (or from a JSON file deserialized into a dict) using the load_llm_from_config function.

However, for some LLM classes the type string registered in the type_to_cls_dict lookup dict differs from the type string defined by the class itself. As a result, those LLM objects can be saved, but not loaded again, because the two type strings do not match.
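For context, here is a minimal sketch of the save/load round trip this describes, assuming the langchain API around the time of this PR (June 2023). The OpenAI class and the placeholder API key are used only for illustration; the affected classes are those whose key in type_to_cls_dict differs from their _llm_type string.

```python
from langchain.llms import OpenAI
from langchain.llms.loading import load_llm_from_config

llm = OpenAI(temperature=0, openai_api_key="...")  # placeholder key

# Saving serializes the LLM to a dict; its "_type" entry is taken from
# the class's _llm_type property.
config = llm.dict()  # e.g. {"temperature": 0.0, ..., "_type": "openai"}

# Loading pops "_type" and looks it up in type_to_cls_dict. For the
# affected classes that key differs from _llm_type, so the lookup fails
# with "Loading ... LLM not supported" even though saving worked.
restored = load_llm_from_config(config)
```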

Contributor

@hwchase17 hwchase17 left a comment


thanks

@hwchase17 hwchase17 added the lgtm label Jun 19, 2023
@hwchase17 hwchase17 merged commit 3e3ed8c into langchain-ai:master Jun 19, 2023
