This repository was archived by the owner on Jul 4, 2025. It is now read-only.

bug: some models fail to load if multiple GPUs are selected #1458

Open
@thonore75

Description


Will be fixed by


Original Bug report:

Jan version

0.5.3

Describe the Bug

I imported many models, and some of them fail to load when I select both of my graphics cards (two RTX 3060 12 GB). If I unselect one of them, the model loads.

It would be great if the models list could indicate whether a model supports multi-GPU.
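
Jan runs GGUF models through a llama.cpp-based engine, where multi-GPU loading comes down to how the model's layers are split across devices. As a minimal sketch of that mechanism (not Jan's actual code path; the model path and the even split ratio are assumptions), using the llama-cpp-python bindings:

```python
# Minimal sketch, assuming llama-cpp-python built with CUDA support.
# The model path and the 50/50 split ratio are hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="Meta-Llama-3.1-8B-Instruct-128k-Q4_0.gguf",
    n_gpu_layers=-1,          # offload all layers to the GPUs
    tensor_split=[0.5, 0.5],  # divide the weights evenly across two cards
)

print(llm("Hello", max_tokens=8)["choices"][0]["text"])
```

If a model loads on one card but fails with two, the failure is usually in this splitting step rather than in the model file itself.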

Steps to Reproduce

  1. Go to Settings -> Advanced Settings
  2. In "Choose device(s)", select both GPUs
  3. Go to "My Models"
  4. Select "Meta-Llama-3.1-8B-Instruct-128k-Q4_0" and start it -> NOT loaded!
  5. Go back to Advanced Settings
  6. Unselect one GPU in "Choose device(s)"
  7. Go to "My Models"
  8. Select "Meta-Llama-3.1-8B-Instruct-128k-Q4_0" and start it -> loaded!
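
When reproducing, it can help to first confirm that both GPUs are visible to the system and have free VRAM; a small diagnostic sketch (assuming the pynvml / nvidia-ml-py package, which Jan itself does not require):

```python
# Diagnostic sketch, assuming pynvml is installed (pip install nvidia-ml-py).
# Prints each visible NVIDIA GPU and its free VRAM.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):  # older pynvml versions return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"GPU {i}: {name}, free VRAM {mem.free / 2**30:.1f} GiB")
finally:
    pynvml.nvmlShutdown()
```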

Screenshots / Logs

No response

What is your OS?

  • Windows

Metadata

Assignees

Labels

  • category: model running (Inference ux, handling context/parameters, runtime)
  • needs info (Needs more logs, steps to help reproduce)
  • os: Windows
  • type: bug (Something isn't working)

Type

No type

Projects

Status

Completed

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
