# ChatGLM-webui
A WebUI for [ChatGLM-6B](https://github.com/THUDM/ChatGLM-6B), the conversational language model made by THUDM.
- Original chat experience like chatglm-6b's demo, but using the Gradio Chatbot component for a better user experience.
- One-click install script (you still need to install Python yourself)
- More parameters that can be freely adjusted
- Convenient saving and loading of dialogue history and presets
- Custom maximum context length
- Save to Markdown
- Program arguments to specify the model and calculation precision
Requirements: Python 3.10

```
pip install torch==1.13.1+cu117 torchvision==0.14.1+cu117 --extra-index-url https://download.pytorch.org/whl/cu117
pip install --upgrade -r requirements.txt
```
or

```
bash install.sh
```
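
After installing, a quick sanity check (a hypothetical helper, not part of this repository) can confirm that the CUDA build of PyTorch is active before launching the WebUI:

```python
# check_env.py - hypothetical helper, not included in this repository
import torch

print("torch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    # fp16 / int8 / int4 precision modes require a CUDA GPU
    print("GPU:", torch.cuda.get_device_name(0))
else:
    print("No CUDA GPU detected; only --precision fp32 (with --cpu) will work.")
```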
```
python webui.py
```
- `--model-path`: specify the model path. If this parameter is not specified manually, the default value is `THUDM/chatglm-6b`, and Transformers will automatically download the model from Hugging Face.
- `--listen`: launch Gradio with `0.0.0.0` as the server name, allowing it to respond to network requests
- `--port`: WebUI port
- `--share`: use Gradio to create a public share link
- `--precision`: calculation precision: `fp32` (CPU only), `fp16`, `int4` (CUDA GPU only), `int8` (CUDA GPU only); see the sketch after this list
- `--cpu`: use the CPU
- `--path-prefix`: URL root path. If this parameter is not specified manually, the default value is `/`. Using a path prefix of `/foo/bar` enables ChatGLM-webui to serve from `http://$ip:$port/foo/bar/` rather than `http://$ip:$port/`.
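
For orientation, the sketch below shows how these precision modes typically map onto loading ChatGLM-6B with Transformers, following the patterns from the ChatGLM-6B model card. It is illustrative only: the `load_model` helper and its argument names are assumptions rather than the actual code in `webui.py`, and `quantize()` comes from ChatGLM-6B's custom modeling code, not the core Transformers API.

```python
# Illustrative sketch; the real webui.py implementation may differ.
from transformers import AutoModel, AutoTokenizer

def load_model(model_path: str = "THUDM/chatglm-6b", precision: str = "fp16"):
    # trust_remote_code is required because ChatGLM-6B ships custom modeling code
    tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
    model = AutoModel.from_pretrained(model_path, trust_remote_code=True)

    if precision == "fp32":      # CPU only
        model = model.float()
    elif precision == "fp16":    # CUDA GPU
        model = model.half().cuda()
    elif precision == "int8":    # CUDA GPU, quantized by ChatGLM's custom code
        model = model.half().quantize(8).cuda()
    elif precision == "int4":    # CUDA GPU, quantized by ChatGLM's custom code
        model = model.half().quantize(4).cuda()

    return tokenizer, model.eval()
```

As a usage example, `python webui.py --precision fp16 --listen --port 17860` would serve the fp16 model on all network interfaces (the port number here is arbitrary).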