BUG: Fix einsum optimize logic for singleton dimensions #10352


Merged

charris merged 4 commits into numpy:master from larsoner:fix-einsum on Feb 5, 2018

Conversation

@larsoner (Contributor)

Closes #10343.

I don't know if it's the correct fix because these functions have a lot of logic going on that I have never looked at before (and haven't looked at too closely). Feel free to close if the right solution is a bit more involved.

@mhvk added this to the 1.15.0 release milestone on Jan 10, 2018
@mhvk (Contributor)

@larsoner - thanks for the PR. It took me a long time to see what the heck is going on in the parsing, but now I understand that, basically, the optimize path did not include broadcasting -- I must add it surprised me that broadcasting was even allowed on identical indices (rather than just on the ..., but that seems to have been the case before too).

Anyway, I think this PR is correct. @dgasmith - could you have a look?
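
For illustration, a minimal sketch of the failure mode being described: a size-1 dimension is broadcast over an identical index label, which the plain einsum path already handled and the optimize path did not. The shapes and subscripts below are assumptions chosen for clarity, not taken from gh-10343.

import numpy as np

# Assumed shapes: 'i' has size 1 in `a` and size 4 in `b`, so the plain
# C einsum path broadcasts it over the shared label.
a = np.ones((1, 5))
b = np.ones((4, 5))

ref = np.einsum('ij,ij->j', a, b, optimize=False)  # plain path: broadcasts fine
opt = np.einsum('ij,ij->j', a, b, optimize=True)   # optimized path: should agree
assert np.allclose(ref, opt)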

@dgasmith (Contributor)

I need to think on it a bit; that parsing code is fairly complex, as folks want quite a variety of input possibilities. I would consider adding more tests that cover the "double broadcasting" cases where 1-dims are mixed with ellipses, like in the original issue.
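
A sketch of what such a "double broadcasting" test might look like; the shapes below are hypothetical, mixing a size-1 labeled dimension with an ellipsis:

import numpy as np

# Hypothetical shapes: the ellipsis covers a matching batch dimension while
# the labeled 'i' dimension is 1 in one operand and 4 in the other.
a = np.ones((2, 1, 5))   # '...ij' with i = 1
b = np.ones((2, 4, 5))   # '...ij' with i = 4

ref = np.einsum('...ij,...ij->...j', a, b, optimize=False)
opt = np.einsum('...ij,...ij->...j', a, b, optimize=True)
assert ref.shape == (2, 5) and np.allclose(ref, opt)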

@dgasmith (Contributor)

This might hit issues in the BLAS calls. Please try the following two cases:

import numpy as np

a = np.random.rand(1, 5)
b = np.random.rand(5, 5)
np.einsum("...ij,...jk->...ik", a, a)
np.einsum("...ij,...jk->...ik", a, b)

I suspect we may have to go in and change the BLAS logic to bounce broadcasting cases as well.

@larsoner (Contributor, Author)

Rebased due to conflict, and test added. I'm not on my dev machine now, so I can't easily run the new tests; I'll let the CIs do the work :)

@larsoner (Contributor, Author)

@dgasmith @mhvk tests are indeed a bit broken now. It might take me a while to fix this, so feel free to take over, or I can find time in the coming weeks to work on it.

@dgasmith (Contributor)

Yup, I'll need to go in and really take a look at what's going on. I'll get to it in the next few days.

@larsoner changed the title from "FIX: Fix einsum optimize logic for singleton dimensions" to "MRG: Fix einsum optimize logic for singleton dimensions" on Jan 18, 2018
@larsoner (Contributor, Author)

Okay @dgasmith, I deduplicated the code a bit; feel free to take a look and see if you're happy.

CIs are happy, ready for review/merge from my end.

@dgasmith (Contributor) left a comment


Looks good to me; the NumPy maintainers might have a few convention thoughts.

@eric-wieser (Member)

Can you loop me in on what the MRG prefix means? I've seen it used a bunch in matplotlib too.

left_dims = {dim: size for dim, size in
             zip(input_left, tmp_operands[0].shape)}
right_dims = {dim: size for dim, size in
              zip(input_right, tmp_operands[1].shape)}
@eric-wieser (Member)


Why not use idx_rm here? The other entries in this dict are being ignored.

@larsoner (Contributor, Author)


Here input_left is e.g. 'ik', so this seems like a straightforward / quick way to get the indices to line up. Are you thinking something like this should be done instead?

if any(tmp_operands[0].shape[input_left.index(ind)] !=
       tmp_operands[1].shape[input_right.index(ind)]
       for ind in idx_rm)

@eric-wieser (Member) commented on Jan 18, 2018 (edited)


You're right; what you had was better, and probably faster too.
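
To make the resolution concrete, here is a small self-contained sketch of the check that those left_dims / right_dims comprehensions feed: when any contracted index has mismatched sizes between the two operands, the pair is being broadcast and the BLAS/tensordot path should be skipped. The helper name and signature are assumptions for illustration, not the PR's actual code.

def blas_is_safe(input_left, input_right, left_shape, right_shape, idx_rm):
    # Hypothetical helper: map each subscript label to its size per operand,
    # then report whether every contracted label has the same size on both
    # sides (i.e. no broadcasting over a contracted dimension).
    left_dims = {dim: size for dim, size in zip(input_left, left_shape)}
    right_dims = {dim: size for dim, size in zip(input_right, right_shape)}
    return not any(left_dims[ind] != right_dims[ind] for ind in idx_rm)

# Contracting over 'j' with matching sizes is BLAS-safe...
assert blas_is_safe('ij', 'jk', (4, 5), (5, 3), {'j'})
# ...but a broadcast 'j' (1 vs 5) is not, so einsum must handle it directly.
assert not blas_is_safe('ij', 'jk', (4, 1), (5, 3), {'j'})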

@larsoner (Contributor, Author)

Can you loop me in on what the MRG prefix means? I've seen it used a bunch in matplotlib too

It's a way for me (contributor) to signal to you (reviewers) that the PR is ready for review / merge from my perspective. In other projects I work on, maintainers historically have then changed the title as they approve to e.g. MRG+1, MRG+2 etc., though GitHub recently came out with the "approve" / "request changes" stuff, so now that mechanism gets used more than additional title changes from the reviewer end.


@eric-wieser (Member)


To reiterate my now-hidden comment, I think what you had before was better - sorry!

@larsoner (Contributor, Author)


That's fine, easy enough to remove the commit.

@charris changed the title from "MRG: Fix einsum optimize logic for singleton dimensions" to "BUG: Fix einsum optimize logic for singleton dimensions" on Jan 28, 2018
@charris merged commit a113731 into numpy:master on Feb 5, 2018
@charris (Member)

Merged, thanks @larsoner.

@larsoner deleted the fix-einsum branch on February 5, 2018, 17:43
@charris removed this from the 1.15.0 release milestone on Feb 9, 2018
@charris removed the "09 - Backport-Candidate" (PRs tagged should be backported) label on Feb 9, 2018

Reviewers

@eric-wieser left review comments
@dgasmith approved these changes


Development

Successfully merging this pull request may close these issues:

einsum broadcast regression (with optimize=True)

5 participants

@larsoner @mhvk @dgasmith @eric-wieser @charris
