
Conversation

@kapunga (Contributor) commented Dec 7, 2024

When using the library with a locally running Ollama instance, passing a custom model causes a DeserializationOpenAIException to be thrown.

This PR deserializes to a custom model where applicable, avoiding the exception.
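The idea behind the fix can be sketched as follows. This is a hypothetical illustration, not the library's actual code: the type and method names (`Model`, `Custom`, `fromString`) and the model identifiers are assumptions. The point is that an unrecognized model string (e.g. one served by Ollama) falls back to a custom-model case instead of failing deserialization.

```scala
// Hypothetical sketch of tolerant model deserialization:
// known model names map to predefined cases, anything else
// becomes a Custom model rather than throwing an exception.
sealed trait Model

object Model {
  case object Gpt4 extends Model
  final case class Custom(value: String) extends Model

  def fromString(s: String): Model = s match {
    case "gpt-4" => Gpt4
    case other   => Custom(other) // e.g. "llama3" from a local Ollama
  }
}
```

With this fallback in place, a response naming an unknown model deserializes to `Model.Custom("llama3")` instead of raising an error.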

@adamw adamw merged commit ca4b48f into softwaremill:master Dec 7, 2024
5 checks passed
@adamw (Member) commented Dec 7, 2024

Thanks!
