gh-116622: Enable test_doctest on platforms that don't support subprocesses #116758
Conversation
@sobolevn: You've done some work in this area recently; are you able to review this PR? There's one test failure, but I don't think it's related.
Thanks, I like the idea; no need to skip the whole module just because one test requires the `subprocess` module.
```
Traceback (most recent call last):
  ...
FileNotFoundError: [Errno ...] ...nosuchfile...

if support.has_subprocess_support:
```
What I don't like is that the diff is huge. Maybe we can do something like:

```python
def doctest_requires_subprocess(func):
    if not support.has_subprocess_support:
        func.__doc__ = None  # skip the doctest by setting it to `None`
    return func

@doctest_requires_subprocess
def test_CLI():
    # as before, no changes
    ...
```
I've tried this locally with `if True`:

```
» ./python.exe -m test test_doctest -m test_CLI
Using random seed: 4265888589
0:00:00 load avg: 2.23 Run 1 test sequentially
0:00:00 load avg: 2.23 [1/1] test_doctest
test_doctest ran no tests
== Tests result: NO TESTS RAN ==
1 test run no tests: test_doctest
Total duration: 59 ms
Total tests: run=0 (filtered)
Total test files: run=1/1 (filtered) run_no_tests=1
Result: NO TESTS RAN
```
Good idea; I've generalized it slightly so it can accept any skip condition.
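A minimal sketch of what such a generalization could look like (the helper name and exact form here are illustrative, not necessarily what was committed):

```python
from test import support

def skip_doctest_if(condition):
    """Remove a function's doctest when *condition* is true.

    doctest only collects examples from docstrings, so setting
    ``__doc__`` to ``None`` makes the test invisible to the runner.
    """
    def decorator(func):
        if condition:
            func.__doc__ = None
        return func
    return decorator

@skip_doctest_if(not support.has_subprocess_support)
def test_CLI():
    """
    >>> print('this example would shell out')  # hypothetical example
    this example would shell out
    """
```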
I want to double check our idea, maybe with @serhiy-storchaka?
There is one minor problem -- a skipped test just disappears from the output. It is not counted as skipped.
Yes, but I don't think that there's an easy way to fix that. Since the whole module is counted as a single test when it's skipped entirely, only 1 is reported as skipped, while in reality all doctests in the module were skipped. So not reporting a single doctest is in line with that.
It is far from ideal, but we can set a docstring with a skipped doctest instead of None.
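One way to read that suggestion (a hypothetical sketch, not code from the PR): replace the docstring with a doctest whose only example carries the `+SKIP` directive, so the runner still sees something rather than nothing:

```python
def doctest_requires_subprocess(func):
    if not support.has_subprocess_support:
        # Hypothetical: keep a placeholder doctest instead of None.
        func.__doc__ = ">>> pass  # doctest: +SKIP\n"
    return func
```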
In the current state a skipped doctest still isn't counted as skipped by unittest; it just shows as a passed test, even in the verbose log. So I've made a small change to `DocTestCase`.
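For illustration, a hypothetical sketch of that kind of change (not the actual patch; `_dt_test` is an internal attribute of `doctest.DocTestCase`):

```python
import doctest
import unittest

class SkipReportingDocTestCase(doctest.DocTestCase):
    def runTest(self):
        examples = self._dt_test.examples
        # If every example carries +SKIP, surface it as a unittest
        # skip instead of a silent pass.
        if examples and all(ex.options.get(doctest.SKIP)
                            for ex in examples):
            raise unittest.SkipTest("all doctest examples were skipped")
        super().runTest()
```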
I think that you are fixing this in the wrong PR :) And you can keep this PR focused on just the subprocess skips.
serhiy-storchaka left a comment (edited)
LGTM.
Although I agree that the change in `DocTestCase` deserves a separate issue or at least a separate PR (you can also make it a part of the larger issue #108885). It is larger and more interesting than your initial PR. But anyway, both changes LGTM.
I also think that we can add `@doctest_skip` and `@doctest_skip_if` decorators to the stdlib. They might be useful for others. This should be a new issue as well :)
OK, I've split it out into #117297.
I looked into allowing doctests to use the existing unittest skip mechanisms. Anyway, I've already got way deeper into this area than I was planning, so I'll leave that to someone else. 😄
I will take it from here, thank you for this PR! 👍
Thanks! Just one comment.
Co-authored-by: Nikita Sobolev <mail@sobolevn.me>
@sobolevn: Are you able to merge this PR?
Yes, sure! Thanks a lot for working on this 👍
What do you think about a 3.12 backport?
On 3.12 the only platforms that don't support subprocesses are Emscripten and WASI, so a backport would slightly improve test coverage for them. However, actually reporting the test as skipped would require #117297, which shouldn't be backported because it's a behavior change.
gh-116622: Enable test_doctest on platforms that don't support subprocesses (python#116758)
Co-authored-by: Nikita Sobolev <mail@sobolevn.me>
When testing on Android I noticed that test_doctest and test_zipimport_support (which calls test_doctest) were skipping entire modules because of missing subprocess support, even though only one test method actually required subprocesses. This PR makes the skip more selective.
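To make the effect concrete, a hedged before/after illustration (the helper name is a placeholder from the earlier sketch, not the exact code in the diff):

```python
import unittest
from test import support

def skip_doctest_if(condition):  # repeated from the sketch above
    def decorator(func):
        if condition:
            func.__doc__ = None
        return func
    return decorator

# Before: a module-level guard like this skips every doctest in the
# file, even ones that never spawn a process:
#     if not support.has_subprocess_support:
#         raise unittest.SkipTest("test module requires subprocess")

# After: only the test that actually shells out loses its doctest.
@skip_doctest_if(not support.has_subprocess_support)
def test_CLI():
    """
    >>> print('would run a subprocess here')  # hypothetical example
    would run a subprocess here
    """
```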