Description
I was randomly seeing benchmarks report 0 ns/iter and had no idea why until @alexcrichton clued me in on the 1ms limit per iter. I'd like to add a warning when the initial iteration takes longer than 1ms so the user knows why the benchmark is "failing" instead of just reporting 0 ns/iter.
I've got a question on the intent of part of the code before I continue trying to make this change.
https://github.com/mozilla/rust/blob/master/src/libextra/test.rs#L1060
```rust
if self.ns_per_iter() == 0 {
    n = 1_000_000;
} else {
    n = 1_000_000 / self.ns_per_iter().max(&1);
}
```
The `.max(&1)` seems needless because `self.ns_per_iter()` can only return non-negative integers and the zero case was just checked, so the divisor is already at least 1. I'm wondering if the line should actually be:
```rust
n = (1_000_000 / self.ns_per_iter()).max(&1);
```
so that at least one iteration is guaranteed to run. If zero iterations is fine, then I'd like to add a check for that case and print a warning instead.
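For illustration, here is a minimal standalone sketch of the arithmetic (written in current Rust syntax rather than the libextra code above). The `ns_per_iter` value is hypothetical and stands in for `self.ns_per_iter()`; the sketch only shows how the current placement of `.max` truncates to zero iterations when one iteration exceeds 1ms, and how moving it clamps `n` to at least 1.

```rust
fn main() {
    // Suppose a single iteration takes 5 ms = 5_000_000 ns (hypothetical value).
    let ns_per_iter: u64 = 5_000_000;

    // Current placement (mirrors the else branch above): integer division
    // truncates to 0, so no iterations would run and 0 ns/iter gets reported.
    let n_current = 1_000_000 / ns_per_iter.max(1);
    assert_eq!(n_current, 0);

    // Proposed placement: clamp the quotient instead, guaranteeing n >= 1.
    let n_proposed = (1_000_000 / ns_per_iter).max(1);
    assert_eq!(n_proposed, 1);

    // If zero iterations is intentional, a warning here would at least
    // explain the 0 ns/iter output instead of leaving the user guessing.
    if n_current == 0 {
        eprintln!(
            "warning: one iteration took {} ns (> 1 ms), so ns/iter is reported as 0",
            ns_per_iter
        );
    }
}
```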