PYTHON-1752 bulk_write should be able to accept a generator #2262


Draft
sleepyStick wants to merge 13 commits into mongodb:master from sleepyStick:PYTHON-1752
base: master

Conversation

sleepyStick (Contributor)

Notes:

  • I only did this for collection.bulk_write. If we like how it was done here, I'd be happy to replicate the change for client.bulk_write, but I didn't want to make that change until I knew we liked this approach.
  • If a user passes in a generator and doesn't care about ordering, I currently still consume the whole generator so that the requests can be grouped by type (a user who passes in a generator and insists on the original ordering would not have it consumed up front). I considered consuming only part of the generator at a time, e.g. up to max_batch_size instead of the whole thing, but that seemed to require creating an _AsyncBulk class for each batch, which felt wrong, so I was probably going about it incorrectly. Not sure. A rough sketch of why grouping forces full consumption follows this list.
  • The test I picked to duplicate for the generator check was a bit arbitrary, but it felt like a good one. If we want more, I'm happy to add them and am open to suggestions for which tests specifically.
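
A minimal sketch (not code from this PR) of what "grouping the requests by type" implies for a generator input: grouping needs to see every operation before any group is complete, so a generator argument ends up fully consumed, whereas processing in the provided order can stream operations one at a time. The helper name and typing here are illustrative only.

# Illustrative only -- not code from this PR.
from collections import defaultdict
from typing import Any, Iterable


def group_by_type(requests: Iterable[Any]) -> dict[type, list[Any]]:
    """Group write operations by their class (e.g. InsertOne, UpdateOne,
    DeleteOne). Every operation must be seen before the groups are complete,
    so a generator argument is consumed in full; ordered processing, by
    contrast, could pull operations from the generator one at a time."""
    groups: dict[type, list[Any]] = defaultdict(list)
    for op in requests:
        groups[type(op)].append(op)
    return groups
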

"""Generate batches of operations, batched by type of
operation, in the order **provided**.
"""
run = None
for idx, (op_type, operation) in enumerate(self.ops):
for idx, request in enumerate(requests):
ShaneHarvey (Member)

I think the goal of this ticket is to avoid inflating the whole generator upfront and only iterate requests as they are needed at the encoding step. For example:

coll.bulk_write((InsertOne({'x': 'large' * 1024 * 1024}) for _ in range(1_000_000)))

If we inflate all at once like we do here, then that code will need to allocate all 1 million documents at once.
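
A minimal sketch of the lazy behavior this comment is asking for, under the assumption that requests are pulled from the generator one batch at a time at the encoding step. The helper name, the 100,000 batch size, and the encode_and_send step are illustrative, not the driver's actual implementation.

# Illustrative only -- not the driver's implementation.
from itertools import islice


def iter_request_batches(requests, max_batch_size=100_000):
    """Pull at most max_batch_size operations from the generator at a time,
    so peak memory is one batch rather than all one million documents."""
    it = iter(requests)
    while batch := list(islice(it, max_batch_size)):
        yield batch


# Usage with the example above: only one batch of InsertOne objects
# exists at any moment.
# for batch in iter_request_batches(
#     InsertOne({'x': 'large' * 1024 * 1024}) for _ in range(1_000_000)
# ):
#     encode_and_send(batch)  # hypothetical encoding/send step
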

@Copilot (Copilot AI) left a comment


Copilot reviewed 7 out of 7 changed files in this pull request and generated no comments.

Comments suppressed due to low confidence (2)

pymongo/synchronous/bulk.py:745

  • [nitpick] Consider renaming the 'generator' parameter to 'requests' for clarity, since it represents the source of write operations and to align with the naming in bulk_write.
generator: Generator[_WriteOp[_DocumentType]],

pymongo/asynchronous/bulk.py:747

  • [nitpick] Consider renaming the 'generator' parameter to 'requests' to better reflect its purpose and to maintain consistency with other bulk_write method signatures.
generator: Generator[_WriteOp[_DocumentType]],
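
For context on the annotated parameter in these two suggestions, here is a small self-contained sketch (not pymongo code) showing that a parameter typed this way receives the operations lazily, and that renaming it from generator to requests is purely cosmetic; _FakeOp stands in for _WriteOp.

# Illustrative only -- _FakeOp is a stand-in, not a pymongo type.
from typing import Generator


class _FakeOp:
    def __init__(self, doc: dict) -> None:
        self.doc = doc


def execute(requests: Generator[_FakeOp, None, None]) -> int:
    """Consume operations lazily; a real driver would encode each one here."""
    return sum(1 for _ in requests)


print(execute(_FakeOp({"x": i}) for i in range(5)))  # prints 5
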

Reviewers

@ShaneHarvey left review comments

Copilot (code review) left review comments

At least 1 approving review is required to merge this pull request.

2 participants
@sleepyStick, @ShaneHarvey
