
Conversation

@mrubens (Collaborator) commented on Jul 15, 2025

Important

Centralizes max-token calculation logic using getModelMaxOutputTokens and updates default maxTokens to 8192 across models and tests.

  • Behavior:
    • Updates maxTokens to 8192 for models in groq.ts.
    • Uses getModelMaxOutputTokens inside getModelParams (model-params.ts) so the maxTokens fallback logic lives in one place (a sketch follows this list).
    • Adjusts the default maxTokens to 8192 in calculateTokenDistribution (model-utils.ts).
  • Tests:
    • Updates tests in api.spec.ts, ContextWindowProgressLogic.spec.ts, and model-utils.spec.ts to reflect the new default maxTokens value of 8192.
    • Ensures tests handle edge cases like negative inputs and zero context window correctly.
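A minimal TypeScript sketch of what the centralization might look like is below. The ModelInfo shape, the DEFAULT_MAX_TOKENS constant, and the exact signatures are assumptions for illustration only; the function names (getModelMaxOutputTokens, getModelParams) and the 8192 default come from the PR, everything else is hypothetical.

```typescript
// Assumed, simplified model shape; the real types in the repository differ.
interface ModelInfo {
	maxTokens?: number
	contextWindow: number
}

const DEFAULT_MAX_TOKENS = 8192 // shared default referenced by this PR

// Shared helper: prefer the model's declared output limit, fall back to the default.
function getModelMaxOutputTokens(model: ModelInfo): number {
	return model.maxTokens ?? DEFAULT_MAX_TOKENS
}

// getModelParams no longer re-implements its own fallback; it defers to the helper.
function getModelParams(model: ModelInfo): { maxTokens: number; contextWindow: number } {
	return {
		maxTokens: getModelMaxOutputTokens(model),
		contextWindow: model.contextWindow,
	}
}

// Example: a model entry with no explicit limit picks up the 8192 default.
console.log(getModelParams({ contextWindow: 131072 })) // { maxTokens: 8192, contextWindow: 131072 }
```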

This description was created by Ellipsis for 918b5ca.

@mrubens requested review from cte and jr as code owners on Jul 15, 2025, 05:41
@dosubot (bot) added the size:M (This PR changes 30-99 lines, ignoring generated files.) label on Jul 15, 2025

delve-auditor (bot) commented on Jul 15, 2025

No security or compliance issues detected. Reviewed everything up to 918b5ca.

Security Overview
  • 🔎 Scanned files: 7 changed file(s)
Detected Code Changes
  • Change type: Enhancement
    • groq.ts: update maxTokens values to 8192 for all Groq models
    • model-params.ts: refactor to use centralized maxTokens logic
    • api.ts: update token calculation logic
    • model-utils.ts: update default token distribution calculation (sketched after this list)
    • Various test files updated to reflect the new logic
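A rough sketch of what the model-utils.ts change might amount to, assuming a simplified signature for calculateTokenDistribution; the 8192 default and the negative-input and zero-context-window edge cases come from the PR description and tests, while the argument list and return shape here are illustrative.

```typescript
// Illustrative only; the real calculateTokenDistribution in model-utils.ts takes different arguments.
interface TokenDistribution {
	reservedForOutput: number
	availableForContext: number
}

function calculateTokenDistribution(contextWindow: number, maxTokens?: number): TokenDistribution {
	// Clamp negative or missing inputs so a bad context window never drops below zero.
	const safeWindow = Math.max(0, contextWindow)
	// Fall back to the shared 8192 default when no per-model limit is provided.
	const reservedForOutput = maxTokens && maxTokens > 0 ? maxTokens : 8192
	return {
		reservedForOutput,
		availableForContext: Math.max(0, safeWindow - reservedForOutput),
	}
}

console.log(calculateTokenDistribution(0)) // { reservedForOutput: 8192, availableForContext: 0 }
```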

Reply to this PR with @delve-auditor followed by a description of what change you want and we'll auto-submit a change to this PR to implement it.

@dosubot (bot) added the lgtm (This PR has been approved by a maintainer) label on Jul 15, 2025
@hannesrudolph added the Issue/PR - Triage (New issue. Needs quick review to confirm validity and assign labels.) label on Jul 15, 2025
@mrubens force-pushed the dry_up_max_token_calcs branch 2 times, most recently from 44a6830 to be3f174, on Jul 15, 2025, 06:01
@mrubens force-pushed the dry_up_max_token_calcs branch from be3f174 to 6a91e9f on Jul 15, 2025, 06:06
@dosubot (bot) added the size:L (This PR changes 100-499 lines, ignoring generated files.) label and removed the size:M label on Jul 15, 2025
@mrubens force-pushed the dry_up_max_token_calcs branch from 6a91e9f to 918b5ca on Jul 15, 2025, 06:09
@mrubens merged commit 8a3dcfb into main on Jul 15, 2025
11 checks passed
@mrubens deleted the dry_up_max_token_calcs branch on Jul 15, 2025, 06:20
@github-project-automation (bot) moved this from Triage to Done in Roo Code Roadmap on Jul 15, 2025
@github-project-automation (bot) moved this from New to Done in Roo Code Roadmap on Jul 15, 2025
fxcl added a commit to tameslabs/Roo-Cline that referenced this pull request on Jul 16, 2025
* main:
  fix: Resolve confusing auto-approve checkbox states (RooCodeInc#5602)
  fix: prevent empty mode names from being saved (RooCodeInc#5766) (RooCodeInc#5794)
  Format time in ISO 8601 (RooCodeInc#5793)
  fix: resolve DirectoryScanner memory leak and improve file limit handling (RooCodeInc#5785)
  Fix settings dirty check (RooCodeInc#5779)
  feat: increase Ollama API timeout values and extract as constants (RooCodeInc#5778)
  fix: Exclude Terraform and Terragrunt cache directories from checkpoints (RooCodeInc#4601) (RooCodeInc#5750)
  Move less commonly used provider settings into an advanced dropdown (RooCodeInc#5762)
  feat: Add configurable error & repetition limit with unified control (RooCodeInc#5654) (RooCodeInc#5752)
  list-files must include at least the first-level directory contents (RooCodeInc#5303)
  Update evals repo link (RooCodeInc#5758)
  Feature/vertex ai model name conversion (RooCodeInc#5728)
  fix(litellm): handle baseurl with paths correctly (RooCodeInc#5697)
  Add telemetry for todos (RooCodeInc#5746)
  feat: add undo functionality for enhance prompt feature (fixes RooCodeInc#5741) (RooCodeInc#5742)
  Fix max_tokens limit for moonshotai/kimi-k2-instruct on Groq (RooCodeInc#5740)
  Changeset version bump (RooCodeInc#5735)
  Add changeset for v3.23.12 patch release (RooCodeInc#5734)
  Update the max-token calculation in model-params to use the shared logic (RooCodeInc#5720)
  Changeset version bump (RooCodeInc#5719)
  chore: add changeset for v3.23.11 patch release (RooCodeInc#5718)
  Add Kimi K2 model and better support (RooCodeInc#5717)
  Fix: Remove invalid skip-checkout parameter from GitHub Actions workflows (RooCodeInc#5676)
  feat: add Cmd+Shift+. keyboard shortcut for previous mode switching (RooCodeInc#5695)
  Changeset version bump (RooCodeInc#5708)
  chore: add changeset for v3.23.10 patch release (RooCodeInc#5707)
  Add padding to the index model options (RooCodeInc#5706)
  fix: prioritize built-in model dimensions over custom dimensions (RooCodeInc#5705)
  Update CHANGELOG.md
  Changeset version bump (RooCodeInc#5702)
  chore: add changeset for v3.23.9 patch release (RooCodeInc#5701)
  Tweaks to command timeout error (RooCodeInc#5700)
  Update contributors list (RooCodeInc#5639)
  feat: enable Claude Code provider to run natively on Windows (RooCodeInc#5615)
  feat: Add configurable timeout for command execution (RooCodeInc#5668)
  feat: add gemini-embedding-001 model to code-index service (RooCodeInc#5698)
  fix: resolve vector dimension mismatch error when switching embedding models (RooCodeInc#5616) (RooCodeInc#5617)
  fix: [5424] return the cwd in the exec tool's response so that the model is not lost after subsequent calls (RooCodeInc#5667)
  Changeset version bump (RooCodeInc#5670)
  chore: add changeset for v3.23.8 patch release (RooCodeInc#5669)