Tweak coverage #8036
Conversation
Oh goody, we had two tests with the same name, so only the second ran (luckily it now succeeds here, and it's only a smoke test). Anyone know a good way to check for this within our test suite, besides checking files with seemingly low test coverage %?
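The shadowing described here is plain Python semantics, not a test-runner quirk; a minimal illustration (the names are hypothetical, not from this PR):

```python
# When two functions share a name, the second `def` silently rebinds
# the name, so the test collector only ever sees one of them.
def test_example():
    return 'first'

def test_example():  # noqa: F811 -- silently shadows the first
    return 'second'

# Only the second definition survives at import time.
assert test_example() == 'second'
```

Linters flag this (flake8's F811 redefinition warning), which is one automated way to catch it.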
dopplershift commented Feb 7, 2017 (edited)
More fun:
QuLogic commented Feb 7, 2017 (edited)
For top-level functions, this is relatively easy:
For class methods, maybe a short Python script:

```python
import sys
from collections import Counter

for name in sys.argv[1:]:
    methods = Counter()
    class_name = None
    with open(name, 'r') as f:
        for line in f:
            if line.startswith('class'):
                class_name = line.split()[1].split('(')[0].strip()
            elif line.startswith('def'):
                class_name = None
            elif class_name and line.startswith('    def'):
                method_name = line.split()[1].split('(')[0].strip()
                methods[class_name + '.' + method_name] += 1
    duplicates = [name for name, count in methods.most_common() if count != 1]
    if duplicates:
        print(name, duplicates)
```
QuLogic commented Feb 7, 2017 (edited)
For top-level classes, too:
I am 👍 on removing the pickle debugging helpers. I have 0 pity on people depending on helper functions from within our tests (not in
```diff
@@ -158,7 +158,7 @@ def test_contour_manual_labels():
 @image_comparison(baseline_images=['contour_labels_size_color'],
                   extensions=['png'], remove_text=True)
-def test_contour_manual_labels():
+def test_contour_labels_size_color():
```
Travis is failing on this image comparison. I'm guessing this just needs an updated image, since this test hasn't actually run in a while.
Updated. Surprisingly, that's the only test that had that problem. The other newly activated tests pass, thankfully.
I'm happy to wait on this until the other is decided upon.
Cool. I think everything else is A+, 👍 so far.
Trying to ensure we don't backslide.
This was causing the first test to not run.
Regenerate test images now that this test is actually run.
This was an exact duplicate of the code above it.
This was deflating our coverage numbers for the tests. If deemed necessary, this code should be resurrected elsewhere (outside tests). Also cleaned up the code a bit.
Force-pushed from 2655c93 to 3227ed1: Lack of saving the animation caused it to never run the callbacks.
It's been a good exercise. All of the test files with significant uncovered lines were caused by either dead code or tests that weren't running. I noticed that a significant number of the remaining uncovered lines are caused by
Force-pushed from 9bc2d92 to af56c13: Remove unused code and restructure to eliminate uncovered lines.
Since we're here making changes anyways.
dopplershift commented Feb 7, 2017 (edited)
Another question: Currently, our
I think both solutions are fine, with a slight preference for keeping it.
```diff
@@ -1464,7 +1488,8 @@ def _as_mpl_axes(self):
     ax_via_gca = plt.gca(projection=prj2)
     assert ax_via_gca is not ax
     assert ax.get_theta_offset() == 0, ax.get_theta_offset()
-    assert ax_via_gca.get_theta_offset() == np.pi, ax_via_gca.get_theta_offset()
+    assert ax_via_gca.get_theta_offset() == np.pi, \
```
Do we need the second part of the assert with pytest?
Oh, I guess not.
```diff
-    assert xax._major_tick_kw['tick2On'] == True
-    assert xax._minor_tick_kw['tick1On'] == False
-    assert xax._minor_tick_kw['tick2On'] == True
+    assert not xax._major_tick_kw['tick1On']
```
These are slightly different tests, as `assert 'aardvark'` passes but `assert 'aardvark' == True` does not.

Do we care about this change?
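For reference, the difference boils down to truthiness versus equality:

```python
# A non-empty string is truthy, so a bare assert passes...
assert 'aardvark'

# ...but it is not *equal* to True, so comparing with == would fail.
assert ('aardvark' == True) is False  # noqa: E712

# The stricter == form also pins the exact value, not just truthiness:
# 1 == True, but 2 != True, even though both are truthy.
assert (1 == True) and (2 != True) and bool(2)  # noqa: E712
```

So `assert x == True` is a strictly stronger check than `assert x`; whether that strength matters depends on whether the tick flags are guaranteed to be real booleans.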
Anyone have opinions on whether we should be ignoring
I don't think we should skip
I think a lot of these are stubs in interfaces in the tests. I'm leaning towards turning them into some variety of
Looks good.
Re: the new test image, it would/should have been regenerated a while ago, so adding it even if it'll be changed soon seems fine to me.
Mainly to turn on coverage for all testing on AppVeyor, as well as tighten our expected threshold for the test lines.
WIP because I want to investigate some of the test files that aren't at 100%.