Extend test timeout for allocation*.js and array_slice.js which can run longer on busy or low-resource test VMs. #1421

Merged: 1 commit merged into chakra-core:master on Aug 11, 2016

Conversation

@dilijev (Contributor) commented Aug 11, 2016

We've observed them run for around 120 seconds; increasing the timeout to 300 seconds (5 minutes) should more than account for this.

Fixes #1406
Fixes #1417

@dilijev (Contributor, Author) commented Aug 11, 2016

FYI @ianwjhalliday

Commit: Extend test timeout for allocation*.js and array_slice.js which can run longer on busy or low-resource test VMs.

We've observed them run for around 120 seconds; increasing the timeout to 300 seconds (5 minutes) should more than account for this.

Fixes chakra-core#1406 and chakra-core#1417
@dilijev changed the title from "Extend timeout for test/typedarray/allocation*.js tests which can timeout on busy or low-resource test VMs." to "Extend test timeout for allocation*.js and array_slice.js which can run longer on busy or low-resource test VMs." on Aug 11, 2016
@dilijev (Contributor, Author) commented Aug 11, 2016

@dotnet-bot test Windows x86_test please

@obastemur (Collaborator) commented:
@dilijev

> We've observed them run for around 120 seconds; increasing the timeout to 300 seconds (5 minutes) should more than account for this.

I'm confused by this and your previous comments on #1413.

If we're going up to 300, why don't we mark them as slow-running tests instead?

@dilijev (Contributor, Author) commented Aug 11, 2016

@obastemur primarily because there is a lack of consensus on moving them to slow-running tests. We need to fix what's broken. If we decide to move them to slow, we can do that later.

Also, even if we mark them as slow, they still need a large enough timeout. Slow is just a test-run configuration; it doesn't exempt the tests from the timeout.
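
For illustration, here is a rough sketch of what a single rlexe.xml test entry could look like with both a slow-style tag and an explicit timeout; the file name and tag value below are assumptions made for the sketch, and only the <timeout> element corresponds to what this change adds:

    <test>
      <default>
        <files>allocation.js</files>   <!-- illustrative file name, assumed for this sketch -->
        <tags>Slow</tags>              <!-- assumed tag value; tagging a test as slow does not remove the timeout -->
        <timeout>300</timeout>         <!-- the extended timeout, in seconds -->
      </default>
    </test>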

I also picked 300 because it's long enough to solve the problem and it matches the other non-default timeouts already set for long-running tests:

git grep "<timeout>" *.xml | grep -v rltimeout
test/Array/rlexe.xml:284:      <timeout>300</timeout>
test/DynamicCode/rlexe.xml:15:      <timeout>300</timeout>
test/Error/rlexe.xml:131:      <timeout>300</timeout>
test/JSON/rlexe.xml:31:      <timeout>300</timeout>
test/Operators/rlexe.xml:139:      <timeout>300</timeout>
test/Optimizer/rlexe.xml:1203:      <timeout>300</timeout>
test/Optimizer/rlexe.xml:1212:      <timeout>300</timeout>
test/Optimizer/rlexe.xml:1221:      <timeout>300</timeout>
test/Optimizer/rlexe.xml:1230:      <timeout>300</timeout>
test/TaggedFloats/rlexe.xml:8:      <timeout>300</timeout>
test/es5/rlexe.xml:142:      <timeout>900</timeout>
test/es6/rlexe.xml:644:      <timeout>300</timeout>
test/inlining/rlexe.xml:43:      <timeout>300</timeout>
test/typedarray/rlexe.xml:210:      <timeout>300</timeout>
test/typedarray/rlexe.xml:217:      <timeout>300</timeout>

@obastemur (Collaborator) commented:
LGTM

@dilijev (Contributor, Author) commented Aug 11, 2016

Thanks @obastemur. I'm merging this in; hopefully it will unblock all of our PRs and the other affected build configurations.

@chakrabot merged commit b6d7f55 into chakra-core:master on Aug 11, 2016
chakrabot pushed a commit that referenced this pull request Aug 11, 2016
Extend test timeout for allocation*.js and array_slice.js which can run longer on busy or low-resource test VMs.

Merge pull request #1421 from dilijev:timeout

We've observed them run for around 120 seconds; increasing the timeout to 300 seconds (5 minutes) should more than account for this.

Fixes #1406
Fixes #1417
@dilijev deleted the timeout branch on August 11, 2016 at 21:14