# distributed-llm
Here are 2 public repositories matching this topic...
Distributed LLM inference. Connect home devices into a powerful cluster to accelerate LLM inference. More devices means faster inference.
- Updated Nov 2, 2025 - C++
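The claim that "more devices means faster inference" typically rests on splitting the model across machines so each one holds and computes only a slice of it. The sketch below is purely illustrative and not this project's actual API: it shows a hypothetical pipeline-style split that assigns consecutive transformer layers evenly across home machines on a LAN, so adding devices shrinks each machine's share.

```cpp
// Minimal sketch (not the project's actual API): splits transformer layers
// evenly across devices in a home cluster. All names here are hypothetical.
#include <cstdio>
#include <string>
#include <vector>

struct Device {
    std::string address;   // e.g. a LAN IP of a home machine
    int first_layer = 0;   // first transformer layer assigned to it
    int num_layers = 0;    // how many consecutive layers it runs
};

// Assign `total_layers` consecutive layers to the given machines as evenly
// as possible. With more devices, each one holds fewer layers, so per-device
// memory use and compute both shrink.
std::vector<Device> partition_layers(const std::vector<std::string>& addresses,
                                     int total_layers) {
    std::vector<Device> devices;
    const int n = static_cast<int>(addresses.size());
    int assigned = 0;
    for (int i = 0; i < n; ++i) {
        Device d;
        d.address = addresses[i];
        d.first_layer = assigned;
        d.num_layers = total_layers / n + (i < total_layers % n ? 1 : 0);
        assigned += d.num_layers;
        devices.push_back(d);
    }
    return devices;
}

int main() {
    // A hypothetical 32-layer model split across three home machines.
    auto plan = partition_layers({"192.168.1.10", "192.168.1.11", "192.168.1.12"}, 32);
    for (const auto& d : plan)
        std::printf("%s -> layers [%d, %d)\n", d.address.c_str(),
                    d.first_layer, d.first_layer + d.num_layers);
    return 0;
}
```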
A distributed LLM inference program based on llama.cpp that lets multiple computers on a local network cooperate to run distributed inference of large language models, with a cross-platform desktop UI built using Electron.
- Updated Aug 24, 2025 - JavaScript