WebUI bug: I can't choose a Model Engine, so I can't download LLM models #1928

@zzlTim

Description
System Info

cuda 12.1
python 3.10.1

Running Xinference with Docker?

  • docker
  • pip install
  • installation from source

Version info

xinference, version 0.13.0

The command used to start Xinference

Start the service:

xinference-local --host 0.0.0.0 --port 9997

Reproduction

Just open http://localhost:9997/ui/#/launch_model/llm; the Model Engine selector cannot be chosen.
[Screenshot (1)]
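While the WebUI selector is broken, a possible workaround (a sketch only, not verified against 0.13.0) is to launch a model through the Xinference CLI instead of the launch page. The model name, format, and size below are placeholder values, not ones taken from this report; substitute the model you actually want.

```shell
# Hypothetical workaround: launch a model from the CLI against the local server
# started with `xinference-local --host 0.0.0.0 --port 9997`.
# "qwen-chat", "pytorch", and "7" are placeholder values -- replace them as needed.
xinference launch --model-name qwen-chat --model-format pytorch --size-in-billions 7

# List the models currently running on the local endpoint to confirm the launch.
xinference list
```

If the CLI launch succeeds, the problem is likely limited to the WebUI's engine dropdown rather than the engine discovery itself.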

Expected behavior

Fix the bug so that LLMs can be downloaded from the WebUI.
