
Conversation

@RyanCavanaugh (Member) commented Mar 11, 2024

This adds automatic baselines similar to what you get from --extendedDiagnostics whenever the harness detects an "interesting" amount of something happening. To reduce baseline noise from future changes, these values are rounded to metric-specific intervals, so e.g. if we go from 513 to 517 symbols, that's not going to register a baseline diff (it will print back as "500").
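For intuition, here is a minimal sketch (not the actual harness code) of how rounding a raw count to a per-metric interval suppresses small fluctuations; the function name and the interval sizes are illustrative assumptions, and the real rounding rule may differ:

```ts
// Sketch only: round a raw metric down to a metric-specific interval so that
// small run-to-run fluctuations do not change the baselined value.
function roundToInterval(value: number, interval: number): number {
    return Math.floor(value / interval) * interval;
}

// Hypothetical per-metric intervals; the PR chooses its own values.
const intervals = { symbols: 500, types: 500, instantiations: 2500 };

console.log(roundToInterval(513, intervals.symbols)); // 500
console.log(roundToInterval(517, intervals.symbols)); // 500
// Both counts print back as "500", so going from 513 to 517 produces no baseline diff.
```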

This should let us easily create "don't blow stuff up" tests when things like #57710 happen.
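Purely as a hypothetical illustration (the actual baseline format is defined by the harness, and these numbers are made up), such a baseline could record rounded counts for the kinds of metrics --extendedDiagnostics already reports:

```
Symbols:        50000
Types:          25000
Instantiations: 100000
```

A change that blows up one of these counts past an interval boundary would then surface as a baseline diff, which is the regression signal these tests are meant to provide.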

I want to make this opt-in/opt-out on a per-test basis, but the harness code is not well-suited to threading that information through. I plan a follow-up refactor that cleans up the baseliner so it has access to the per-test options.

@typescript-bot added the Author: Team and For Uncommitted Bug (PR for untriaged, rejected, closed or missing bug) labels on Mar 11, 2024

@DanielRosenwasser (Member) commented Mar 15, 2024

What happens if a test uses // @noTypeAndSymbolsBaselines or whatever the flag is called? I guess we just don't care usually?

@RyanCavanaugh merged commit 6011176 into microsoft:main on Mar 15, 2024
@RyanCavanaugh deleted the perf-baseline-stats branch on Mar 15, 2024 at 20:31