@thu-ml

TSAIL group

Tsinghua Statistical Artificial Intelligence & Learning Group

Pinned

  1. zhusuan (Public)

     A probabilistic programming library for Bayesian deep learning and generative models, built on TensorFlow.

     Python · 2.2k stars · 419 forks

  2. SageAttention (Public)

     Quantized Attention that achieves speedups of 2.1-3.1x and 2.7-5.1x compared to FlashAttention2 and xformers, respectively, without losing end-to-end metrics across various models (see the usage sketch after this list).

     Cuda · 1.2k stars · 70 forks

  3. unidiffuser (Public)

     Code and models for the paper "One Transformer Fits All Distributions in Multi-Modal Diffusion".

     Python · 1.4k stars · 89 forks

  4. prolificdreamer (Public)

     ProlificDreamer: High-Fidelity and Diverse Text-to-3D Generation with Variational Score Distillation (NeurIPS 2023 Spotlight)

     Python · 1.5k stars · 45 forks

  5. ares (Public)

     A Python library for adversarial machine learning, focusing on benchmarking adversarial robustness.

     Python · 501 stars · 88 forks

  6. tianshou (Public)

     An elegant PyTorch deep reinforcement learning library.

     Python · 8.3k stars · 1.1k forks
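The pinned SageAttention entry above describes a drop-in quantized attention kernel; below is a minimal usage sketch. The `sageattention` package name, the `sageattn` entry point, and the `tensor_layout`/`is_causal` keywords follow the repository's README at the time of writing, but treat them as assumptions and check the repo for the current API.

```python
# Minimal sketch: calling SageAttention in place of a standard attention kernel.
# Assumes `pip install sageattention` and a CUDA GPU; the function name and
# keyword arguments mirror the repo README and may change between releases.
import torch
from sageattention import sageattn

batch, heads, seq_len, head_dim = 2, 16, 1024, 64
q = torch.randn(batch, heads, seq_len, head_dim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# tensor_layout="HND" means inputs are (batch, heads, seq_len, head_dim);
# "NHD" would mean (batch, seq_len, heads, head_dim).
out = sageattn(q, k, v, tensor_layout="HND", is_causal=False)
print(out.shape)  # torch.Size([2, 16, 1024, 64])
```

Because the call takes the same q/k/v tensors as a standard attention kernel, it can typically be swapped in wherever a model computes attention, without retraining.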

Repositories

Showing 10 of 72 repositories
  • tianshou (Public)

    An elegant PyTorch deep reinforcement learning library (see the training sketch after this list).

    Python · 8,297 stars · MIT license · 1,137 forks · 145 issues (1 issue needs help) · 5 pull requests · Updated Mar 17, 2025
  • SpargeAttn (Public)

    SpargeAttention: a training-free sparse attention that can accelerate the inference of any model.

    Cuda · 309 stars · Apache-2.0 license · 15 forks · 9 issues · 0 pull requests · Updated Mar 14, 2025
  • RoboticsDiffusionTransformer (Public)

    RDT-1B: a Diffusion Foundation Model for Bimanual Manipulation

    Python · 1,001 stars · MIT license · 93 forks · 22 issues · 3 pull requests · Updated Mar 13, 2025
  • DiffusionBridge (Public)

    Official codebase for "Diffusion Bridge Implicit Models" (ICLR 2025) and "Consistency Diffusion Bridge Models" (NeurIPS 2024)

    Python · 33 stars · 3 forks · 2 issues · 0 pull requests · Updated Mar 10, 2025
  • GFT (Public)

    Python · 26 stars · MIT license · 0 forks · 3 issues · 0 pull requests · Updated Mar 8, 2025
  • i-DODE (Public)

    Official code for "Improved Techniques for Maximum Likelihood Estimation for Diffusion ODEs" (ICML 2023)

    Python · 17 stars · Apache-2.0 license · 1 fork · 1 issue · 0 pull requests · Updated Mar 4, 2025
  • MMTrustEval (Public)

    A toolbox for benchmarking trustworthiness of multimodal large language models (MultiTrust, NeurIPS 2024 Track Datasets and Benchmarks)

    Python · 134 stars · CC-BY-SA-4.0 license · 8 forks · 3 issues · 0 pull requests · Updated Mar 4, 2025
  • RIFLEx (Public)

    Official implementation for "RIFLEx: A Free Lunch for Length Extrapolation in Video Diffusion Transformers"

    Python · 460 stars · Apache-2.0 license · 51 forks · 12 issues · 0 pull requests · Updated Mar 3, 2025
  • TetraJet-MXFP4Training (Public)

    PyTorch implementation of "Oscillation-Reduced MXFP4 Training for Vision Transformers" on DeiT model pre-training

    Python · 7 stars · Apache-2.0 license · 1 fork · 0 issues · 0 pull requests · Updated Mar 3, 2025
  • SageAttention (Public)

    Quantized Attention that achieves speedups of 2.1-3.1x and 2.7-5.1x compared to FlashAttention2 and xformers, respectively, without losing end-to-end metrics across various models.

    Cuda · 1,150 stars · Apache-2.0 license · 70 forks · 38 issues · 1 pull request · Updated Feb 28, 2025
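As referenced in the tianshou entry above, here is a minimal DQN-on-CartPole training sketch in the style of Tianshou's quickstart. Class and argument names have moved around across Tianshou releases, so the exact signatures below (drawn from the ~0.4/0.5-era README) are assumptions; consult the repository for the current API.

```python
# Minimal DQN training loop on CartPole, following Tianshou's quickstart.
# Signatures follow Tianshou ~0.4/0.5 and may differ in current releases.
import gymnasium as gym
import torch
import tianshou as ts
from tianshou.utils.net.common import Net

env = gym.make("CartPole-v1")
train_envs = ts.env.DummyVectorEnv([lambda: gym.make("CartPole-v1") for _ in range(10)])
test_envs = ts.env.DummyVectorEnv([lambda: gym.make("CartPole-v1") for _ in range(100)])

# A small MLP mapping observations to Q-values, one per discrete action.
net = Net(state_shape=env.observation_space.shape,
          action_shape=env.action_space.n,
          hidden_sizes=[128, 128])
optim = torch.optim.Adam(net.parameters(), lr=1e-3)

policy = ts.policy.DQNPolicy(net, optim, discount_factor=0.9,
                             estimation_step=3, target_update_freq=320)

# Collectors gather transitions from the vectorized envs into a replay buffer.
train_collector = ts.data.Collector(
    policy, train_envs, ts.data.VectorReplayBuffer(20000, 10), exploration_noise=True)
test_collector = ts.data.Collector(policy, test_envs, exploration_noise=True)

result = ts.trainer.offpolicy_trainer(
    policy, train_collector, test_collector,
    max_epoch=10, step_per_epoch=10000, step_per_collect=10,
    update_per_step=0.1, episode_per_test=100, batch_size=64,
    train_fn=lambda epoch, env_step: policy.set_eps(0.1),   # epsilon-greedy exploration
    test_fn=lambda epoch, env_step: policy.set_eps(0.05),
    stop_fn=lambda mean_rewards: mean_rewards >= 195)       # CartPole "solved" threshold
print(result)
```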
