
Description
LlamaSharp config from appsettings.json:

```json
"LlamaSharp": {
  "Interactive": true,
  "ModelDir": "C:\\Models\\TheBloke\\CodeLlama-7B-GGUF",
  "DefaultModel": "codellama-7b.Q5_K_M.gguf",
  "MaxContextLength": 1024,
  "NumberOfGpuLayer": 20
},
```
LLM provider entry for LlamaSharp:

```json
{
  "Provider": "llama-sharp",
  "Models": [
    {
      "Name": "codellama-7b.Q5_K_M.gguf",
      "Type": "chat"
    }
  ]
}
```
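For anyone reproducing this, both fragments above sit in the host project's appsettings.json. A minimal sketch of how they fit together; the surrounding `LlmProviders` key name is my assumption based on typical BotSharp configuration, not copied from my actual file:

```json
{
  "LlamaSharp": {
    "Interactive": true,
    "ModelDir": "C:\\Models\\TheBloke\\CodeLlama-7B-GGUF",
    "DefaultModel": "codellama-7b.Q5_K_M.gguf",
    "MaxContextLength": 1024,
    "NumberOfGpuLayer": 20
  },
  "LlmProviders": [
    {
      "Provider": "llama-sharp",
      "Models": [
        { "Name": "codellama-7b.Q5_K_M.gguf", "Type": "chat" }
      ]
    }
  ]
}
```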
Agent configuration:

```json
{
  "id": "ddf46fa4-5686-408a-a574-b3da43f3ed99",
  "name": "test-agent",
  "description": "generic agent to test chat functionality using locally installed LLM",
  "instruction": "",
  "templates": [],
  "functions": [],
  "responses": [],
  "samples": [],
  "is_public": false,
  "is_router": false,
  "allow_routing": false,
  "disabled": false,
  "icon_url": null,
  "profiles": [],
  "routing_rules": [],
  "llm_config": {
    "is_inherit": true,
    "provider": "llama-sharp",
    "model": "codellama-7b.Q5_K_M.gguf"
  },
  "plugin": {
    "id": "00000000-0000-0000-0000-000000000000",
    "name": "BotSharp",
    "description": null,
    "assembly": "BotSharp.Core",
    "icon_url": null,
    "agent_ids": [
      "ddf46fa4-5686-408a-a574-b3da43f3ed99"
    ],
    "enabled": true,
    "menus": null
  },
  "created_datetime": "2024-01-15T17:19:45.2410911Z",
  "updated_datetime": "2024-01-15T17:19:45.2410912Z"
}
```
Message sent:

```json
{
  "text": "hello world"
}
```
To allow CPU initialization, the 'LlamaSharp.Backend.Cpu' package is referenced.
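For completeness, this is roughly how the backend reference looks in the project file. This is a sketch, not my exact csproj: the version numbers are placeholders, and the NuGet package id is spelled `LLamaSharp.Backend.Cpu` (double L) on nuget.org:

```xml
<!-- Illustrative fragment: backend package version should match the LLamaSharp package version -->
<ItemGroup>
  <PackageReference Include="LLamaSharp" Version="x.y.z" />
  <PackageReference Include="LLamaSharp.Backend.Cpu" Version="x.y.z" />
</ItemGroup>
```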
Response:
你好,我是小芭。您可以对我说“小芭,帮我做什么”,我会尽力帮助您。
Rough translation:
Hello, I am Xiaoba. You can say to me "Xiaoba, what can I do for you?" and I will try my best to help you.
This same response comes back for every message I send. The model has been tested in LM Studio and doesn't appear to have any affinity for Chinese. What am I doing wrong?