
ring-attention

Here are 4 public repositories matching this topic...

USP: Unified (a.k.a. Hybrid, 2D) Sequence Parallel Attention for Long-Context Transformer Model Training and Inference

  • Updated Mar 20, 2025
  • Python

InternEvo is an open-sourced lightweight training framework that aims to support model pre-training without the need for extensive dependencies.

  • Updated Mar 20, 2025
  • Python

[CVPR 2025] The official CLIP training codebase of Inf-CL: "Breaking the Memory Barrier: Near Infinite Batch Size Scaling for Contrastive Loss". A highly memory-efficient CLIP training scheme (a rough sketch of the tiling idea follows below).

  • Updated Jan 16, 2025
  • Python
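
Inf-CL's central idea is to compute the contrastive (InfoNCE) loss tile by tile, accumulating the log-sum-exp over negatives online so the full batch × batch similarity matrix is never materialized. As a rough illustration only, here is a minimal single-device JAX sketch of that tiling, assuming L2-normalized embeddings as in CLIP; the function name, tile size, and temperature are hypothetical, and the actual Inf-CL codebase is considerably more involved (it distributes the tiles across GPUs).

```python
import jax
import jax.numpy as jnp

def tiled_infonce_loss(img, txt, tile=64, temperature=0.07):
    # Image->text InfoNCE, computed tile by tile: the log-sum-exp over all
    # negatives is accumulated online, so the full B x B similarity matrix
    # is never materialized. Assumes L2-normalized embeddings (as in CLIP).
    b = img.shape[0]
    logits_pos = jnp.sum(img * txt, axis=-1) / temperature  # matched pairs

    m = jnp.full((b,), -jnp.inf)  # running max of logits, per image
    s = jnp.zeros((b,))           # running sum of exp(logit - m)
    for start in range(0, b, tile):
        block = (img @ txt[start:start + tile].T) / temperature  # [b, tile]
        m_new = jnp.maximum(m, block.max(axis=-1))
        s = s * jnp.exp(m - m_new) + jnp.exp(block - m_new[:, None]).sum(axis=-1)
        m = m_new
    lse = m + jnp.log(s)               # log of the softmax denominator
    return jnp.mean(lse - logits_pos)  # mean of -log p(correct text | image)

key_i, key_t = jax.random.split(jax.random.PRNGKey(0))
img = jax.random.normal(key_i, (256, 128))
txt = jax.random.normal(key_t, (256, 128))
img = img / jnp.linalg.norm(img, axis=-1, keepdims=True)
txt = txt / jnp.linalg.norm(txt, axis=-1, keepdims=True)
print(tiled_infonce_loss(img, txt))
```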

Packaged Ring Attention with Blockwise Transformers for Near-Infinite Context, implemented in Jax + Flax (a minimal sketch of the idea follows below).

  • Updated May 10, 2024
  • Python
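
Ring attention splits the sequence across devices arranged in a ring: each device keeps its block of queries and rotates key/value blocks around the ring, folding each incoming block into its output with an online softmax so the full attention matrix is never formed. Below is a minimal single-host JAX sketch of that blockwise accumulation, with the ring steps simulated by a plain loop over chunks; the function name is hypothetical, no causal mask is applied, and no collective communication is shown (a real multi-device version would rotate the chunks with a collective such as jax.lax.ppermute).

```python
import jax
import jax.numpy as jnp

def ring_attention_sim(q, k, v, num_chunks=4):
    # Single-host simulation of ring attention: keys/values are consumed one
    # chunk at a time (one chunk per simulated ring step) and the output is
    # accumulated with a numerically stable online softmax. Non-causal for
    # simplicity.
    k_chunks = jnp.split(k, num_chunks, axis=0)
    v_chunks = jnp.split(v, num_chunks, axis=0)
    scale = q.shape[-1] ** -0.5

    m = jnp.full(q.shape[:-1], -jnp.inf)  # running max of logits
    l = jnp.zeros(q.shape[:-1])           # running softmax denominator
    o = jnp.zeros_like(q)                 # running weighted sum of values
    for k_c, v_c in zip(k_chunks, v_chunks):
        s = (q @ k_c.T) * scale                  # [S_q, chunk] logits
        m_new = jnp.maximum(m, s.max(axis=-1))
        corr = jnp.exp(m - m_new)                # rescale previous state
        p = jnp.exp(s - m_new[:, None])
        l = l * corr + p.sum(axis=-1)
        o = o * corr[:, None] + p @ v_c
        m = m_new
    return o / l[:, None]

q = jax.random.normal(jax.random.PRNGKey(0), (128, 64))
k = jax.random.normal(jax.random.PRNGKey(1), (128, 64))
v = jax.random.normal(jax.random.PRNGKey(2), (128, 64))
ref = jax.nn.softmax((q @ k.T) * 64 ** -0.5, axis=-1) @ v
assert jnp.allclose(ring_attention_sim(q, k, v), ref, atol=1e-4)
```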

Improve this page

Add a description, image, and links to the ring-attention topic page so that developers can more easily learn about it.

Curate this topic

Add this topic to your repo

To associate your repository with the ring-attention topic, visit your repo's landing page and select "manage topics."


