Fix hatch linewidth in PGF #24263
Conversation
Not obvious why this kills all the Linux tests, but none of the Windows or Mac tests.
This looks extremely fragile. After a rebase I get even more errors... In particular, the conversion from PDF to PNG seems to be affected for some unknown reason. As an example, the expected and actual comparison images look quite the same, IMHO.
I cannot run the whole test suite locally in one go, as I get an illegal-instruction crash. However, I noted an error in the test log, so it may indeed be related to what was discussed during the dev call: something messes up the ghostscript conversion process, and that propagates among the tests.
Those tend to come from binaries compiled with CPU instruction sets that your CPU does not have... I have mostly seen them with MKL or other "go fast" numpy extensions.
force-pushed from 1c7cd9f to 4b2dc96

I'd be happy to finish it. It's just that I do not really know what remains to be done and/or what causes the weird, inconsistent test images. But I'll try to rebase it and see what the current status is and whether anything has changed.
```python
@pytest.mark.backend('pgf')
@image_comparison(['hatch_linewidth'], extensions=['pdf'])
def test_pgf_hatch_linewidth():
    mpl.backend_bases.register_backend('pdf', FigureCanvasPgf)
```
Considering how many things are failing, this must be a global setting that isn't reset.
Ahh, good observation! That is indeed a good first thing to fix. I'll give it a try.
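For illustration, a minimal sketch (not code from the PR) of why that observation matters: register_backend writes to a process-wide registry, so the override from the diff above outlives the test that made it.

```python
import matplotlib as mpl
from matplotlib.backends.backend_pgf import FigureCanvasPgf

# Hedged sketch: this call mutates a global format-to-canvas registry.
mpl.backend_bases.register_backend('pdf', FigureCanvasPgf)
# From here on, every fig.savefig('...pdf') in the same pytest process is
# rendered by the pgf writer, until the original PDF canvas is re-registered.
```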
Not sure this works as expected (I cannot run matplotlib from source locally for unknown reasons). The main problem was how to change the backend back after the test. One may consider adding an argument to the image_comparison decorator that takes a lambda and evaluates it after the comparison is done. I also considered using check_figures_equal and mocking two classes that have a savefig method that sets the backend, but I didn't understand how to deal with the close call later on.
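As a hedged alternative sketch (the after-comparison lambda above is a proposed parameter, not an existing one): a pytest fixture can run the restore step after the test body finishes, even on failure. The fixture name pgf_pdf_backend is invented for this example; the two canvas classes are real matplotlib ones.

```python
import pytest
import matplotlib as mpl
from matplotlib.backends.backend_pdf import FigureCanvasPdf
from matplotlib.backends.backend_pgf import FigureCanvasPgf

@pytest.fixture
def pgf_pdf_backend():
    # Route PDF saves through pgf for the duration of one test.
    mpl.backend_bases.register_backend('pdf', FigureCanvasPgf)
    yield
    # Teardown runs even if the test body raises or the comparison fails.
    mpl.backend_bases.register_backend('pdf', FigureCanvasPdf)
```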
Looking at it again, the tests in this file use pytest.mark.backend('pgf') with image_comparison(['foo.pdf']); do we really need the extra registration, or are the other tests not working as expected?
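For reference, a sketch of the pattern being pointed to, with a placeholder test body (the hatched triangle is invented for this example, not the PR's actual figure): the backend marker plus a baseline name carrying an explicit .pdf extension, and no manual canvas registration.

```python
import pytest
import matplotlib.pyplot as plt
from matplotlib.testing.decorators import image_comparison

@pytest.mark.backend('pgf')  # marker provided by matplotlib's test conftest
@image_comparison(['hatch_linewidth.pdf'])
def test_pgf_hatch_linewidth():
    fig, ax = plt.subplots()
    # Any hatched artist exercises the hatch stroke width.
    ax.fill([0, 1, 1], [0, 0, 1], hatch='//', facecolor='none')
```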
force-pushed from 3fa3d67 to d631b01

I think the test is correct now. I added a try-except block to make sure that the backend is always restored.
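A minimal sketch of that restore pattern (try/finally gives the same always-runs guarantee; the drawing and comparison steps are elided):

```python
import matplotlib as mpl
from matplotlib.backends.backend_pdf import FigureCanvasPdf
from matplotlib.backends.backend_pgf import FigureCanvasPgf

def test_pgf_hatch_linewidth():
    mpl.backend_bases.register_backend('pdf', FigureCanvasPgf)
    try:
        ...  # draw the hatched figure and run the image comparison
    finally:
        # Always restore the stock PDF canvas so later tests are unaffected.
        mpl.backend_bases.register_backend('pdf', FigureCanvasPdf)
```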
Co-authored-by: Antony Lee <anntzer.lee@gmail.com>
PR Summary
Closes #15491
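For context, a hedged reproduction sketch of the linked issue (the figure content is assumed for illustration, not taken verbatim from the report): before this fix, hatch strokes in PGF output did not follow rcParams['hatch.linewidth'].

```python
import matplotlib as mpl
mpl.use('pgf')  # requires a working LaTeX installation
import matplotlib.pyplot as plt

mpl.rcParams['hatch.linewidth'] = 4  # clearly thicker than the default
fig, ax = plt.subplots()
ax.add_patch(plt.Rectangle((0.1, 0.1), 0.8, 0.8, hatch='//', fill=False))
fig.savefig('hatch.pdf')  # before the fix, hatch lines kept the default width
```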
PR Checklist

Tests and Styling
- Has pytest style unit tests (and pytest passes).
- Is Flake 8 compliant (install flake8-docstrings and run flake8 --docstring-convention=all).

Documentation
- New features have an entry in doc/users/next_whats_new/ (follow instructions in README.rst there).
- API changes documented in doc/api/next_api_changes/ (follow instructions in README.rst there).