MAINT: Unify calculation of normal vectors from polygons #12136


Merged

timhoffm merged 2 commits into matplotlib:master from eric-wieser:voxels-shading on Dec 6, 2018

Conversation

@eric-wieser (Contributor) commented Sep 16, 2018 (edited)

This combines `get_normals` and `_generate_normals`, and eliminates all other calls to `np.cross`. This came up in #9990, but I wanted to postpone it to (this) later PR.

`get_normals` and `_generate_normals` were profiled, and it was found that vectorizing `np.cross` as in `get_normals` was faster:

```python
import numpy as np

def get_normals(polygons):
    v1 = np.empty((len(polygons), 3))
    v2 = np.empty((len(polygons), 3))
    for poly_i, ps in enumerate(polygons):
        # pick three points around the polygon at which to find the normal;
        # this doesn't vectorize because polygons is jagged
        i1, i2, i3 = 0, len(ps) // 3, 2 * len(ps) // 3
        v1[poly_i, :] = ps[i1, :] - ps[i2, :]
        v2[poly_i, :] = ps[i2, :] - ps[i3, :]
    return np.cross(v1, v2)

def _generate_normals(polygons):
    normals = []
    for verts in polygons:
        v1 = np.array(verts[0]) - np.array(verts[1])
        v2 = np.array(verts[2]) - np.array(verts[0])
        normals.append(np.cross(v1, v2))
    return np.array(normals)

def _generate_normals_new(polygons):
    if isinstance(polygons, np.ndarray):
        # optimization: polygons all have the same number of points, so can
        # vectorize
        n = polygons.shape[-2]
        i1, i2, i3 = 0, n // 3, 2 * n // 3
        v1 = polygons[..., i1, :] - polygons[..., i2, :]
        v2 = polygons[..., i2, :] - polygons[..., i3, :]
    else:
        # The subtraction doesn't vectorize because polygons is jagged.
        v1 = np.empty((len(polygons), 3))
        v2 = np.empty((len(polygons), 3))
        for poly_i, ps in enumerate(polygons):
            n = len(ps)
            i1, i2, i3 = 0, n // 3, 2 * n // 3
            v1[poly_i, :] = ps[i1, :] - ps[i2, :]
            v2[poly_i, :] = ps[i2, :] - ps[i3, :]
    return np.cross(v1, v2)

polygons_jagged = [np.random.rand(np.random.randint(10, 1000), 3)
                   for i in range(100)]
polygons_uniform = np.random.rand(100, 100, 3)
```
```
In [10]: %timeit _generate_normals(polygons_jagged)
8.43 ms ± 148 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)

In [11]: %timeit get_normals(polygons_jagged)
583 µs ± 12.9 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)

In [12]: %timeit _generate_normals_new(polygons_jagged)
593 µs ± 26 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)

In [14]: %timeit _generate_normals(polygons_uniform)
8.45 ms ± 156 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)

In [15]: %timeit get_normals(polygons_uniform)
599 µs ± 6.13 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)

In [13]: %timeit _generate_normals_new(polygons_uniform)
110 µs ± 10.1 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)
```
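As a sanity check, a small script (my own sketch, not part of the PR; `normals_loop` and `normals_vectorized` are hypothetical names mirroring the functions above) can confirm that the vectorized fast path and the per-polygon loop compute identical normals for uniform input:

```python
import numpy as np

def normals_loop(polygons):
    # per-polygon three-point normal, as in get_normals above
    out = np.empty((len(polygons), 3))
    for k, ps in enumerate(polygons):
        n = len(ps)
        i1, i2, i3 = 0, n // 3, 2 * n // 3
        out[k] = np.cross(ps[i1] - ps[i2], ps[i2] - ps[i3])
    return out

def normals_vectorized(polygons):
    # the uniform-array fast path from _generate_normals_new above
    n = polygons.shape[-2]
    i1, i2, i3 = 0, n // 3, 2 * n // 3
    v1 = polygons[..., i1, :] - polygons[..., i2, :]
    v2 = polygons[..., i2, :] - polygons[..., i3, :]
    return np.cross(v1, v2)

rng = np.random.default_rng(0)
polys = rng.random((100, 100, 3))  # 100 polygons, 100 vertices each

# both paths agree elementwise, so the speedup comes for free
assert np.allclose(normals_loop(polys), normals_vectorized(polys))
```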

PR Checklist

Do not apply, since this is just a refactor:

  • Has Pytest style unit tests
  • New features are documented, with examples if plot related
  • Documentation is sphinx and numpydoc compliant
  • Added an entry to doc/users/next_whats_new/ if major new feature (follow instructions in README.rst there)
  • Documented in doc/api/api_changes.rst if API changed in a backward-incompatible way

@eric-wieser (Contributor, Author) commented Sep 16, 2018 (edited)

Branch name hints that my goal here is to add shading to `ax3d.voxels` in a later PR.

@eric-wieser (Contributor, Author) commented:

An alternate implementation, with less duplication but also less clarity, could look like:

```python
def _inplane_vectors(polygons, v1_out=None, v2_out=None):
    n = polygons.shape[-2]
    i1, i2, i3 = 0, n // 3, 2 * n // 3
    v1 = np.subtract(polygons[..., i1, :], polygons[..., i2, :], out=v1_out)
    v2 = np.subtract(polygons[..., i2, :], polygons[..., i3, :], out=v2_out)
    return v1, v2

def _generate_normals(polygons):
    if isinstance(polygons, np.ndarray):
        # optimization: polygons all have the same number of points, so can
        # vectorize
        v1, v2 = _inplane_vectors(polygons)
    else:
        # The subtraction doesn't vectorize because polygons is jagged.
        v1 = np.empty((len(polygons), 3))
        v2 = np.empty((len(polygons), 3))
        for poly_i, ps in enumerate(polygons):
            _inplane_vectors(ps, v1_out=v1[poly_i, :], v2_out=v2[poly_i, :])
    return np.cross(v1, v2)
```

Let me know which is preferable.

@eric-wieser (Contributor, Author) commented Sep 17, 2018 (edited)

This fails because it changes the sense of the polygons. It seems that some functions use a right-hand rule for the "front" face of a polygon, while others use a left-hand rule. Related: #12138
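To illustrate the handedness issue (a minimal sketch of my own, not code from the PR): reversing a polygon's vertex order flips the normal produced by the three-point rule used above, which is exactly what a right-hand versus left-hand winding convention amounts to.

```python
import numpy as np

def three_point_normal(ps):
    # same three-point rule as in the PR snippets
    n = len(ps)
    i1, i2, i3 = 0, n // 3, 2 * n // 3
    return np.cross(ps[i1] - ps[i2], ps[i2] - ps[i3])

# a unit square in the z=0 plane, counter-clockwise when viewed from +z
ccw = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=float)
cw = ccw[::-1]  # same square, opposite winding

# opposite windings yield opposite normals
assert np.allclose(three_point_normal(ccw), -three_point_normal(cw))
```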

@eric-wieser (Contributor, Author) commented:

Just to give an update: this is now waiting on #12259.

@eric-wieser (Contributor, Author) commented:

Rebased and ready to go

@timhoffm (Member) left a comment:

Looks good otherwise. Please rebase.

```
Parameters
----------
polygons: list of (M_i, 3) array_like, or (..., M, 3) array_like
```
@timhoffm (Member) commented Nov 27, 2018 (edited):

Why the `i` index?

@eric-wieser (Contributor, Author) replied Nov 27, 2018 (edited):

Because the value of M can change between list items: `polygons[0]` has shape `(M_0, 3)`, `polygons[1]` has shape `(M_1, 3)`, and it is likely that `M_1 != M_0`.
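A hypothetical illustration (mine, not from the PR) of the `(M_i, 3)` notation: each polygon in the list may have its own vertex count M_i, so a jagged list cannot take the uniform-array fast path and falls into the per-polygon branch instead.

```python
import numpy as np

jagged = [
    np.zeros((3, 3)),  # a triangle: M_0 = 3
    np.zeros((4, 3)),  # a quad:     M_1 = 4
    np.zeros((7, 3)),  # M_2 = 7
]
uniform = np.zeros((3, 4, 3))  # three quads, shape (..., M, 3)

# the isinstance check from _generate_normals_new picks the branch:
# a jagged list is not an ndarray, a uniform stack is
assert not isinstance(jagged, np.ndarray)
assert isinstance(uniform, np.ndarray)
```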

timhoffm reacted with thumbs up emoji
@eric-wieser (Contributor, Author) commented:

@timhoffm: merged using the web UI, since it was doc-only. Feel free to squash to elide the merge commit

```python
n = polygons.shape[-2]
i1, i2, i3 = 0, n // 3, 2 * n // 3
v1 = polygons[..., i1, :] - polygons[..., i2, :]
v2 = polygons[..., i2, :] - polygons[..., i3, :]
```
@timhoffm (Member) commented:

Is the sign change here intended? This is the convention of `get_normals`; `_generate_normals` had it the other way round. I know that there have been some orientation issues, but I don't know the current state. Just want to make sure the sign change is not slipping in unintentionally.

@eric-wieser (Contributor, Author) replied:

The sign change is only in `v1` and `v2`; it cancels out in the cross product, so it doesn't affect the return value.

I picked the convention from `get_normals` because that was easiest. If you want me to flip both subtractions here, I can, but it won't make any difference.
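A quick check (mine, not from the PR) confirms the cancellation claim: negating both factors leaves the cross product unchanged, whereas swapping the operands would flip the sign.

```python
import numpy as np

rng = np.random.default_rng(42)
a, b = rng.random(3), rng.random(3)

# (-a) x (-b) == a x b, so flipping both subtractions changes nothing
assert np.allclose(np.cross(-a, -b), np.cross(a, b))
# ...whereas swapping the operands flips the sign (antisymmetry)
assert np.allclose(np.cross(b, a), -np.cross(a, b))
```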

timhoffm reacted with thumbs up emoji
timhoffm merged commit 3e07fd4 into matplotlib:master on Dec 6, 2018
QuLogic added this to the v3.1 milestone on Dec 6, 2018
Stephen-Chilcote pushed a commit to Stephen-Chilcote/matplotlib that referenced this pull request on Dec 11, 2018
raamana added a commit to raamana/matplotlib that referenced this pull request on Dec 13, 2018

Reviewers

@anntzer approved these changes
@timhoffm approved these changes

Milestone: v3.1.0

Participants: @eric-wieser, @anntzer, @timhoffm, @QuLogic, @dstansby
