Insights: mlc-ai/mlc-llm
Overview
1 Pull request merged by 1 person
- [Fix] Fix gemma 3 conv template stop token
#3191 merged
Mar 27, 2025
2 Pull requests opened by 1 person
- [Serving] Add Structural-Tag api to RequestResponseFormat
#3187 opened
Mar 24, 2025
- [Serving] Support tool function calls under strict format constraints
#3190 opened
Mar 26, 2025
2 Issues closed by 2 people
- [Bug] mlc_llm package error
#3180 closed
Mar 24, 2025
1 Issue opened by 1 person
- [Question] How to evaluate the accuracy of models?
#3188 opened
Mar 24, 2025
6 Unresolved conversations
Sometimes conversations happen on old items that aren’t yet closed. Here is a list of all the Issues and Pull Requests with unresolved conversations.
- [Bug] clang linker error upon running any model in Windows
#3177 commented on
Mar 22, 2025 • 0 new comments
- [Feature Request] Support for returning log probabilities of both the prompt and the response tokens in the MLC-LLM API, similar to the functionality provided by OpenAI API.
#2908 commented on
Mar 24, 2025 • 0 new comments
- [Bug] Broken for Intel Macs since v0.15 (or earlier)
#3078 commented on
Mar 25, 2025 • 0 new comments
- [Bug] Mlc cli server gets stuck
#3145 commented on
Mar 27, 2025 • 0 new comments
- Refactored random.h to have PhiloxRandomGenerator
#3181 commented on
Mar 23, 2025 • 0 new comments