Closed
Labels: theia-ai (issues related to Theia AI)
Description
Feature Description:
AI agents need configurable settings for LLM parameters (e.g., temperature, thinking time), but it's unclear where to manage them—globally, per model, per agent, or per request.
Currently, the Agent class can set them programmatically.
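
For illustration, a programmatic override today might look roughly like the sketch below; all names (`LlmRequestSettings`, `MyAgent`, `getRequestSettings`) are made up for this example and are not the actual Theia AI API:

```ts
// Hypothetical shape of the LLM parameters an agent could set in code today.
interface LlmRequestSettings {
    temperature?: number;       // sampling temperature
    maxThinkingTimeMs?: number; // "thinking time" budget
}

class MyAgent {
    // Hard-coding the values inside the agent is the only option right now;
    // users cannot adjust them without changing the code.
    protected getRequestSettings(): LlmRequestSettings {
        return { temperature: 0.2, maxThinkingTimeMs: 5000 };
    }
}
```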
Possible Solutions (Briefly):
Hierarchical Settings: Global → Model → Agent → Chat → Request (see the sketch after this list).
Model-Specific Properties: The model should expose user-modifiable parameters, which can then be set per Agent in the Configuration.
Chat UI Controls: A settings icon in chat to modify LLM settings at request time.
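
A minimal sketch of how the hierarchical resolution could work, assuming each layer is a partial settings object and more specific layers override broader ones; the names (`LlmSettings`, `resolveSettings`) are hypothetical and not part of Theia AI:

```ts
// Hypothetical partial settings object; keys left unset fall through from broader layers.
interface LlmSettings {
    temperature?: number;
    maxThinkingTimeMs?: number;
}

// Merge the layers from least to most specific: Global → Model → Agent → Chat → Request.
function resolveSettings(
    global: LlmSettings,
    model: LlmSettings,
    agent: LlmSettings,
    chat: LlmSettings,
    request: LlmSettings
): LlmSettings {
    return { ...global, ...model, ...agent, ...chat, ...request };
}

// Example: the agent raises the temperature, the user lowers it again for one request.
const effective = resolveSettings(
    { temperature: 0.7 },          // global default
    { maxThinkingTimeMs: 10_000 }, // model-specific default
    { temperature: 0.9 },          // agent override
    {},                            // nothing set on the chat
    { temperature: 0.1 }           // per-request override from the chat UI
);
// effective => { temperature: 0.1, maxThinkingTimeMs: 10000 }
```

A plain object merge like this keeps the precedence rules easy to reason about, and the same mechanism would work whether the per-request layer comes from the Agent class or from a settings icon in the chat UI.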