Fix kontext finetune issue when batch size >1 #11921


Merged

asomoza merged 8 commits into huggingface:main from mymusise:main on Jul 18, 2025

Conversation

mymusise (Contributor)

What does this PR do?

Problem

Training fails with a shape mismatch when using custom instance prompts and batch_size > 1, due to partial batches from the dataloader.

Solution

Set drop_last=True in BucketBatchSampler to ensure consistent batch sizes during training. This prevents shape mismatch errors when the last batch is smaller than the specified batch size.
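The effect of this change can be illustrated with a minimal pure-Python sketch. `make_batches` below is a simplified stand-in for the batching logic, not the actual diffusers BucketBatchSampler; the sample counts are illustrative.

```python
# Minimal sketch (not the actual diffusers BucketBatchSampler) of how
# drop_last changes the batches a sampler yields.

def make_batches(num_samples, batch_size, drop_last):
    """Group sample indices into batches, optionally dropping a partial tail."""
    indices = list(range(num_samples))
    batches = [indices[i:i + batch_size] for i in range(0, num_samples, batch_size)]
    if drop_last and batches and len(batches[-1]) < batch_size:
        batches.pop()  # discard the partial final batch
    return batches

# 10 custom instance prompts, batch_size=4:
print([len(b) for b in make_batches(10, 4, drop_last=False)])  # [4, 4, 2] -> shape mismatch risk
print([len(b) for b in make_batches(10, 4, drop_last=True)])   # [4, 4]   -> consistent shapes
```

With drop_last=False the final batch of 2 breaks any code that assumes a fixed batch dimension; with drop_last=True every batch has exactly batch_size samples.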

Testing

Verified the fix resolves the shape mismatch error by running training with custom instance prompts and batch_size > 1. No shape mismatch occurs after this change.

Fixes # (issue)


Before submitting

  • This PR fixes a bug in the training script.
  • Did you read the contributor guideline?
  • Did you read our philosophy doc?
  • Was this discussed/approved via a GitHub issue or the forum? (N/A if not discussed)
  • Did you make sure to update the documentation with your changes? (N/A for code-only bugfix)
  • Did you write any new necessary tests? (Manual test performed)

Who can review?

Anyone in the community is free to review the PR once the tests have passed.

For this example script and dataloader logic, relevant reviewers could be:

Signed-off-by: mymusise <mymusise1@gmail.com>
@mymusise changed the title from "Fix kontext finetune issue where batch size >1" to "Fix kontext finetune issue when batch size >1" on Jul 14, 2025
@asomoza (Member)

cc: @linoytsaban


@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@linoytsaban (Collaborator) left a comment


thanks @mymusise!
And thanks @asomoza for the tag. I think I had it initially set to False so as not to waste samples with small datasets, but didn't make the needed adjustments to support batches of varying sizes. Better to have it as True, as this PR suggests.
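The trade-off mentioned here can be quantified with a small illustrative helper (hypothetical, not part of the PR): with drop_last=True, up to batch_size - 1 samples are skipped each epoch, which matters for tiny datasets but is negligible at scale.

```python
def wasted_samples(num_samples, batch_size):
    """Samples discarded per epoch when the partial final batch is dropped."""
    return num_samples % batch_size

print(wasted_samples(10, 4))    # 2 of 10 samples skipped -- noticeable for a tiny dataset
print(wasted_samples(1000, 4))  # 0 -- divides evenly; at most batch_size - 1 in general
```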

@asomoza (Member)

thanks! failing test is not related to this PR

@asomoza merged commit cde02b0 into huggingface:main on Jul 18, 2025
24 of 25 checks passed
Reviewers

@sayakpaul approved these changes

@linoytsaban approved these changes

Assignees
No one assigned
Labels
None yet
Projects
None yet
Milestone
No milestone
Development

Successfully merging this pull request may close these issues.

5 participants
@mymusise @asomoza @HuggingFaceDocBuilderDev @sayakpaul @linoytsaban
