This repository was archived by the owner on Jun 5, 2025. It is now read-only.

Muxing rule errors with Ollama #1178

Closed
stacklok/codegate-ui#360

Description

@danbarr

Describe the issue

When trying to save the muxing rules for a workspace that had previously been set to an Ollama model, I get an error in v0.1.25:

Model qwen2.5-coder does not exist for provider 6bb9d2d0-57df-4925-9878-8b4287f45a2a

I've tried deleting and re-adding the Ollama provider. The available models load successfully in the drop-down, but when I select one and save, I get the above error. It appears the :version part of the model name is being dropped; for example, I don't see the :1.5b suffix in the input box after selecting the model, and it isn't present in the error message either.

Other providers are working (I tested OpenAI, OpenRouter, and LM Studio).
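
For illustration only (this is not the actual CodeGate code): a minimal sketch of how this kind of failure could arise if the model identifier is split on ":" somewhere before the lookup, so only the base name is compared against the provider's model list. The function name and model list below are hypothetical.

```python
# Hypothetical sketch: Ollama model IDs carry a tag after a colon
# (e.g. "qwen2.5-coder:1.5b"). If the ID is split on ":" before the
# lookup, only "qwen2.5-coder" is compared against the provider's
# models and the save fails with a "does not exist" error.

def selected_model_name(model_id: str) -> str:
    # Suspected buggy behavior: the ":tag" suffix is dropped.
    return model_id.split(":")[0]

# Hypothetical set of models returned by the Ollama provider.
provider_models = {"qwen2.5-coder:1.5b", "qwen2.5-coder:7b"}

chosen = "qwen2.5-coder:1.5b"
print(selected_model_name(chosen) in provider_models)  # False -> save error
print(chosen in provider_models)                       # True  -> expected result
```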

Steps to Reproduce

1. Configure an Ollama provider.
2. In a workspace muxing rule, select one of the models from that provider.
3. Try to save the muxing rules.

Operating System

macOS (Arm)

IDE and Version

NA

Extension and Version

NA

Provider

Ollama

Model

All

Codegate version

v0.1.25

Logs

No response

Additional Context

No response
