# mixtral-of-experts
Here are 2 public repositories matching this topic...
D^2-MoE: Delta Decompression for MoE-based LLMs Compression
- Updated Mar 3, 2025 - Python
ACL 2024 (SRW): official codebase of the paper "MoExtend: Tuning New Experts for Modality and Task Extension"
- Updated Dec 3, 2024 - Python