
UPSTREAM PR #18478: chat: make tool description and parameters optional per OpenAI spec #756

Open

loci-dev wants to merge 1 commit into main from upstream-PR18478-branch_Anri-Lombard-optional-tool-parameters

Conversation

@loci-dev

Mirrored from ggml-org/llama.cpp#18478

Summary

  • Makes description and parameters fields optional in tool function definitions
  • Per the OpenAI API specification, these fields are optional
  • Previously, the parser threw an exception when either field was missing

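The change described above can be illustrated with a short sketch. This is not the actual llama.cpp C++ parser; it is a hypothetical Python equivalent showing how a parser can treat `description` and `parameters` as optional while still requiring `name`. The field names follow the OpenAI tool schema; the default values chosen here are assumptions for illustration.

```python
def parse_tool_function(fn: dict) -> dict:
    """Parse an OpenAI-style tool function definition.

    Only "name" is required; "description" and "parameters" are
    optional per the OpenAI API specification.
    """
    if "name" not in fn:
        raise ValueError("tool function requires a 'name' field")
    return {
        "name": fn["name"],
        # Optional per the spec: fall back to an empty description
        "description": fn.get("description", ""),
        # Optional per the spec: fall back to an empty object schema
        "parameters": fn.get("parameters",
                             {"type": "object", "properties": {}}),
    }

# A minimal valid tool (name only) no longer raises:
minimal = parse_tool_function({"name": "get_time"})
```

With the old behavior, the `minimal` call above would have thrown; with the fix, missing fields simply receive defaults.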
Test plan

  • Added test for tools without parameters field
  • Added test for tools without description field
  • Added test for tools with only name (minimal valid tool)
  • Existing tests still pass
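The three tool shapes the test plan covers can be sketched as follows. The `is_valid_tool` helper is hypothetical (not from the llama.cpp codebase); it only checks what the OpenAI spec requires, so all three shapes validate.

```python
def is_valid_tool(tool: dict) -> bool:
    """Minimal validity check: type must be "function" and the
    function must have a string "name"; everything else is optional."""
    fn = tool.get("function", {})
    return tool.get("type") == "function" and isinstance(fn.get("name"), str)

# Tool without a "parameters" field
no_params = {"type": "function",
             "function": {"name": "get_weather",
                          "description": "Look up the weather"}}
# Tool without a "description" field
no_desc = {"type": "function",
           "function": {"name": "get_weather",
                        "parameters": {"type": "object", "properties": {}}}}
# Minimal valid tool: name only
name_only = {"type": "function", "function": {"name": "get_weather"}}

assert all(is_valid_tool(t) for t in (no_params, no_desc, name_only))
```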

Attempts to fix #17667

Per the OpenAI API specification, both 'description' and 'parameters'
fields in tool function definitions are optional. Previously, the parser
would throw an exception if these fields were missing.

Attempts to fix #17667
@loci-review

loci-review Bot commented Dec 30, 2025

Explore the complete analysis inside the Version Insights

Perfect! I've generated a comprehensive summary report for your project. Here are the key highlights:

🎯 Key Findings:

Overall Performance: The analysis shows significant improvements across all measured functions, with response time increases ranging from 40% to 216% and throughput gains from 74% to 289%.

Top Performer: The begin function in llama-cvector-generator showed the most dramatic improvement with a 215.92% response time increase and 289.34% throughput increase.

Affected Areas: All top 10 functions are STL (Standard Template Library) operations, primarily:

  • Vector operations (begin, end, back, empty)
  • Iterator operations
  • Memory management functions

Binaries Impacted:

  • llama-cvector-generator (4 functions)
  • llama-run (4 functions)
  • llama-tts (3 functions)

The report indicates this is a very positive change with no detected performance regressions. The improvements are consistent across multiple binaries, suggesting core library or compiler optimization enhancements.

@loci-dev loci-dev force-pushed the main branch 27 times, most recently from 86bf5db to 07aff19 Compare January 2, 2026 17:07
@loci-dev loci-dev force-pushed the main branch 30 times, most recently from 2517152 to 2365455 Compare January 8, 2026 15:11