
fix(runner): correct per-test duration for concurrent tests #10072

Open

cyphercodes wants to merge 1 commit into vitest-dev:main from cyphercodes:fix-concurrent-test-duration

Conversation

@cyphercodes

Description

Fixes incorrect test duration reporting when tests run concurrently with maxConcurrency.

Problem

When tests run concurrently (e.g., with describe.concurrent and maxConcurrency: 1), the reported per-test duration includes time spent waiting for a concurrency slot, not just the actual test execution time. This causes tests that take ~10s each to be reported as taking ~20s each.

Solution

Capture the start time right before the test function executes (after acquiring the concurrency slot) instead of at the beginning of runTest. This ensures the duration reflects actual test execution time.

Changes

  • Added an executionStart variable to track when the test actually starts executing
  • Set executionStart right before the limitMaxConcurrency call
  • Use executionStart instead of start for the duration calculation
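The idea behind the change can be sketched as follows. This is not the actual Vitest source: `limitConcurrency`, `runTest`, and the result field names here are illustrative stand-ins for Vitest's internal concurrency limiter and runner. The key point is that `executionStart` is captured once the task actually begins running, so queue wait time is excluded from the reported duration.

```typescript
type Task<T> = () => Promise<T>;

// Hypothetical p-limit-style gate: at most `max` tasks run at once,
// the rest wait in a FIFO queue for a slot.
function limitConcurrency(max: number) {
  let active = 0;
  const queue: Array<() => void> = [];
  const release = () => {
    active--;
    const next = queue.shift();
    if (next) next();
  };
  return function run<T>(task: Task<T>): Promise<T> {
    return new Promise<T>((resolve, reject) => {
      const start = () => {
        active++;
        task().then(resolve, reject).finally(release);
      };
      if (active < max) start();
      else queue.push(start);
    });
  };
}

const limit = limitConcurrency(1); // like maxConcurrency: 1

interface TestResult {
  name: string;
  waitedMs: number;   // time spent queued for a concurrency slot
  durationMs: number; // actual execution time (what the fix reports)
}

async function runTest(name: string, fn: () => Promise<void>): Promise<TestResult> {
  const start = Date.now();   // before the fix: duration was measured from here
  let executionStart = start;
  await limit(async () => {
    executionStart = Date.now(); // the fix: capture after the slot is acquired
    await fn();
  });
  const end = Date.now();
  return { name, waitedMs: executionStart - start, durationMs: end - executionStart };
}
```

With two ~50 ms tests under a concurrency of 1, the second test previously reported ~100 ms (its queue wait plus its run); measuring from `executionStart` brings it back to ~50 ms.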

Fixes #10069

Fixes incorrect test duration reporting when tests run concurrently
with maxConcurrency. The duration now measures actual test execution
time instead of including time spent waiting for concurrency slots.

Fixes vitest-dev#10069
@netlify

netlify bot commented Apr 5, 2026

Deploy Preview for vitest-dev ready!

Built without sensitive environment variables

Name Link
🔨 Latest commit 7110e84
🔍 Latest deploy log https://app.netlify.com/projects/vitest-dev/deploys/69d285c2fce4e5000857d790
😎 Deploy Preview https://deploy-preview-10072--vitest-dev.netlify.app

@sheremet-va sheremet-va added the maybe automated User is likely an AI agent, or the content was generated by an AI assistant without user control label Apr 5, 2026


Development

Successfully merging this pull request may close these issues.

Reporting regression in V4: incorrect per-test duration with concurrent
