xlite-dev
🛠 Repositories: lite.ai.toolkit | 📚 Awesome-LLM-Inference | 📚 LeetCUDA 🎧
🤖 ffpa-attn | 📈 HGEMM | 🤗 flux-faster | 📚 Awesome-DiT-Inference 🖱
⚙️ RVM-Inference | lihang-notes (📚 PDF, 200 pages) | 💎 torchlm 🔥
🤖 Contact: qyjdef@163.com | GitHub: DefTruth | Zhihu: DefTruth 📞
Pinned
- lite.ai.toolkit
🛠 A lite C++ AI toolkit: 100+ models with MNN, ORT and TRT, including Det, Seg, Stable-Diffusion, Face-Fusion, etc. 🎉
- Awesome-LLM-Inference
📚 A curated list of Awesome LLM/VLM Inference Papers with Codes: Flash-Attention, Paged-Attention, WINT8/4, Parallelism, etc. 🎉
- Awesome-DiT-Inference
📚 A curated list of Awesome Diffusion Inference Papers with Codes: Sampling, Cache, Quantization, Parallelism, etc. 🎉
Repositories
- lite.ai.toolkit
🛠 A lite C++ AI toolkit: 100+ models with MNN, ORT and TRT, including Det, Seg, Stable-Diffusion, Face-Fusion, etc. 🎉
- diffusers (forked from huggingface/diffusers)
🤗 Diffusers: State-of-the-art diffusion models for image, video, and audio generation in PyTorch and FLAX. (A minimal usage sketch appears after this list.)
- sglang (forked from sgl-project/sglang)
SGLang is a fast serving framework for large language models and vision language models.
- vllm-omni (forked from vllm-project/vllm-omni)
A framework for efficient model inference with omni-modality models.
- SageAttention (forked from thu-ml/SageAttention)
Quantized Attention that achieves speedups of 2.1-3.1x and 2.7-5.1x compared to FlashAttention2 and xformers, respectively, without losing end-to-end metrics across various models. (A minimal call sketch appears after this list.)
- Awesome-LLM-Inference
📚 A curated list of Awesome LLM/VLM Inference Papers with Codes: Flash-Attention, Paged-Attention, WINT8/4, Parallelism, etc. 🎉
- Awesome-DiT-Inference
📚 A curated list of Awesome Diffusion Inference Papers with Codes: Sampling, Cache, Quantization, Parallelism, etc. 🎉
- .github
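As a rough orientation for the diffusers fork listed above, here is a minimal, hedged text-to-image sketch with 🤗 Diffusers; the checkpoint id, prompt, and output file name are illustrative assumptions, not anything taken from this profile.

```python
# Hypothetical minimal text-to-image example with 🤗 Diffusers.
# The model id, prompt, and output path are illustrative assumptions.
import torch
from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained(
    "stable-diffusion-v1-5/stable-diffusion-v1-5",  # assumed checkpoint
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # assumes a CUDA-capable GPU is available

image = pipe("a photo of an astronaut riding a horse").images[0]
image.save("astronaut.png")
```

And a hedged sketch of calling SageAttention as a drop-in quantized attention kernel on FP16 Q/K/V tensors; the exact sageattn signature (tensor layout and causal flag) follows my reading of the upstream project's documented usage and should be treated as an assumption.

```python
# Sketch of SageAttention as a drop-in attention kernel on FP16 Q/K/V tensors.
# The sageattn signature (tensor_layout, is_causal) is an assumption based on
# the upstream README, not something stated in this profile.
import torch
from sageattention import sageattn

batch, heads, seq_len, head_dim = 2, 16, 1024, 64
q = torch.randn(batch, heads, seq_len, head_dim, dtype=torch.float16, device="cuda")
k = torch.randn_like(q)
v = torch.randn_like(q)

# Quantized attention used in place of a FlashAttention2/xformers call.
out = sageattn(q, k, v, tensor_layout="HND", is_causal=False)
print(out.shape)  # torch.Size([2, 16, 1024, 64])
```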
