Olmo3 long context training support #125


Open

tyler-romero wants to merge 21 commits into main from tyler/lc
Conversation

@tyler-romero (Contributor) commented Jul 1, 2025 (edited):
Support Olmo3 model arch + tools for long context training.

Provides four example configs (which still need to be tuned with actual long-context hyperparameters and data mixes):

  1. 65k CL -> 9.3k TPS/device
  2. 135k CL -> 8.4k TPS/device
  3. 262k CL -> 6.4k TPS/device
  4. 0.5M CL -> 5k TPS/device

For reference, our best 8k CL pretraining config runs at 12.9k TPS/device.
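For context, the reported numbers can be restated as a slowdown factor relative to the 8k-CL baseline. A back-of-the-envelope sketch using only the figures quoted above (the dictionary keys and variable names are illustrative, not from the repo):

```python
# Reported throughput at each context length (CL), in tokens/sec/device,
# taken from the PR description above.
BASELINE_TPS = 12_900  # best 8k-CL pretraining config, for reference

long_context_tps = {
    "65k": 9_300,
    "135k": 8_400,
    "262k": 6_400,
    "0.5M": 5_000,
}

for cl, tps in long_context_tps.items():
    # Slowdown factor relative to the 8k-CL baseline throughput.
    slowdown = BASELINE_TPS / tps
    print(f"{cl:>5} CL: {tps:>6} TPS/device ({slowdown:.2f}x slower than 8k)")
```

So even at 0.5M context length, per-device throughput drops only to about 2.6x below the 8k baseline.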

tyler-romero marked this pull request as ready for review on July 2, 2025 23:48.
@soldni (Member) left a comment:

left one comment about dependencies... otherwise LGTM. We probably shouldn't merge into main until the swafix is in?

```diff
 ]
 all = [
-    "ai2-olmo-core @ git+https://github.com/allenai/OLMo-core.git@c779ca546cc3194e73e7491aaefcdffbed042c65",
+    "ai2-olmo-core @ git+https://github.com/allenai/OLMo-core.git@tylerr/olmo3-scripts-swafix-foreachopt",
```
Member commented:

@undfined how have you been handling this for olmo3?

@tyler-romero do you think this is going to get merged to main soon?

Member commented:

if not, we can make a new set of optional dependencies called "olmo3-lc-temp" or something like that.
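Such a group would be an ordinary optional-dependency table in pyproject.toml. A hypothetical sketch (the group name "olmo3-lc-temp" is only the one proposed above; the branch pin is the one from the diff):

```toml
[project.optional-dependencies]
olmo3-lc-temp = [
    "ai2-olmo-core @ git+https://github.com/allenai/OLMo-core.git@tylerr/olmo3-scripts-swafix-foreachopt",
]
```

It could then be installed with e.g. `pip install -e ".[olmo3-lc-temp]"` without touching the main `all` extra.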

Collaborator commented:

We're using the same olmo-core branch in the olmo3-anneals base branch for mid-training.

Member commented:

ok, so we should keep this as a branch for now, @undfined, right?

Collaborator commented:

Yep, that will be easiest in case we want to merge main into our feature branches.

Reviewers

@soldni left review comments

@undfined Awaiting requested review

@drschwenk Awaiting requested review

@abertsch72 Awaiting requested review

At least 1 approving review is required to merge this pull request.

Assignees

No one assigned

Labels

None yet

Projects

None yet

Milestone

No milestone


4 participants

@tyler-romero @undfined @soldni
