
fix(llm): use generic model fields in frontend instead of OllamaModels#1093

Merged

Alanxtl merged 1 commit into apache:main from Harry33t:fix/1072-llm-frontend-non-ollama-panic on May 10, 2026

Conversation

@Harry33t (Contributor) commented May 4, 2026

Fix: non-Ollama providers cause a panic on the frontend index page

Problem

When LLM_PROVIDER is not ollama (for example, using openai or gemini through an OpenAI-compatible endpoint), opening the frontend index page triggers a panic:

runtime error: index out of range [0] with length 0

The stack trace points to llm/go-client/frontend/main.go:

"DefaultModel": cfg.OllamaModels[0],

Root cause

config.go only populates OllamaModels when the LLM provider is ollama:

if config.LLMProvider == "ollama" {
    config.OllamaModels = modelsList
}

For every other provider the slice stays empty, so indexing [0] panics.

The CLI already uses the generic cfg.LLMModelsList / cfg.ModelName fields; the frontend was never updated to match.
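The failure mode can be sketched in a few lines of Go. The Config struct and defaultModel helper below are hypothetical stand-ins for illustration, not the project's actual code; the point is that indexing an empty slice panics, while a length guard turns the misconfiguration into a recoverable error.

```go
package main

import "fmt"

// Config mimics the relevant shape of the real config (hypothetical):
// OllamaModels is only filled when the provider is "ollama".
type Config struct {
	LLMProvider  string
	OllamaModels []string
}

// defaultModel guards the slice access instead of indexing [0] blindly.
func defaultModel(cfg Config) (string, error) {
	if len(cfg.OllamaModels) == 0 {
		return "", fmt.Errorf("no models configured for provider %q", cfg.LLMProvider)
	}
	return cfg.OllamaModels[0], nil
}

func main() {
	cfg := Config{LLMProvider: "openai"} // OllamaModels left empty, as in the bug report
	if _, err := defaultModel(cfg); err != nil {
		// with the unguarded cfg.OllamaModels[0] this would instead panic:
		// runtime error: index out of range [0] with length 0
		fmt.Println("guarded:", err)
	}
}
```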

Fix

  • main.go: pass cfg.LLMModelsList to the template as Models and cfg.ModelName as DefaultModel
  • index.html: iterate over .Models and mark the configured MODEL_NAME as selected, so the initial dropdown matches the user's .env configuration

Verification

Tested locally with the reproduction configuration from #1072:

LLM_PROVIDER = openai
LLM_MODELS = gemini-2.5-flash
LLM_BASE_URL = https://generativelanguage.googleapis.com/v1beta/openai/
MODEL_NAME = gemini-2.5-flash

  • GET / returns HTTP 200 with no panic
  • The dropdown renders correctly: <option value="gemini-2.5-flash" selected>gemini-2.5-flash</option>

The Ollama path is unaffected: LLMModelsList and ModelName are populated in config.go for every provider, so when the provider is ollama the rendered output is identical to what OllamaModels produced before; no regression.

Closes #1072

@Harry33t (Contributor, Author) commented May 4, 2026

The CI lint failure is in book-flight-ai-agent/go-server/tools/tool_base.go:64, where reflect.Ptr should be reflect.Pointer (a govet finding).
This is a pre-existing issue in the repository, introduced by commit 6c49cca (2025-05-15), and is entirely unrelated to the changes in this PR.

@Alanxtl (Contributor) commented May 5, 2026

merge to main to fix ci fail

The frontend index page indexed cfg.OllamaModels[0] directly, which
panics with "index out of range [0] with length 0" when LLM_PROVIDER
is not ollama (config.go only populates OllamaModels for the ollama
provider). The CLI already uses cfg.LLMModelsList / cfg.ModelName,
so align the frontend with the same generic fields.

Also mark the configured MODEL_NAME as the selected option in the
dropdown so the initial display matches the user's .env choice.

Closes apache#1072
@Harry33t Harry33t force-pushed the fix/1072-llm-frontend-non-ollama-panic branch from 981c60a to 1bfe9b3 on May 9, 2026 at 14:23
@Alanxtl Alanxtl merged commit 4bd4b83 into apache:main May 10, 2026
2 checks passed


Development

Successfully merging this pull request may close these issues.

The llm frontend panics on the index page when the provider is not Ollama
