Closed as not planned

Description
I'm trying to use the following as the model ID and base name:
```python
MODEL_ID = "TheBloke/Mistral-7B-Instruct-v0.1-GPTQ"
MODEL_BASENAME = "wizardLM-7B-GPTQ-4bit.compat.no-act-order.safetensors"
```
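For context, this is roughly how those constants end up being consumed (a minimal sketch of the AutoGPTQ loading call, not localGPT's exact code; the `device` and `use_safetensors` arguments here are illustrative):

```python
from auto_gptq import AutoGPTQForCausalLM

MODEL_ID = "TheBloke/Mistral-7B-Instruct-v0.1-GPTQ"
MODEL_BASENAME = "wizardLM-7B-GPTQ-4bit.compat.no-act-order.safetensors"

# from_quantized() reads the repo's config.json and runs
# check_and_get_model_type(), which raises TypeError if
# config.model_type ("mistral" here) isn't in AutoGPTQ's
# supported-model table.
model = AutoGPTQForCausalLM.from_quantized(
    MODEL_ID,
    model_basename=MODEL_BASENAME,
    use_safetensors=True,
    device="cuda:0",
)
```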
But when running run_localgpt.py I get the following error:
```
\miniconda3\Lib\site-packages\auto_gptq\modeling\_utils.py", line 147, in check_and_get_model_type
    raise TypeError(f"{config.model_type} isn't supported yet.")
TypeError: mistral isn't supported yet.
```
Any help is super appreciated!!
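For what it's worth, here's a quick way to check which auto-gptq build is installed and whether it knows about mistral at all. The `SUPPORTED_MODELS` import path is an assumption from reading the traceback (it lives in `auto_gptq.modeling._const` in the versions I've looked at), so it's wrapped in a try/except:

```python
# Print the installed auto-gptq version and test for mistral support.
from importlib.metadata import version

print("auto-gptq", version("auto-gptq"))

try:
    # Assumed location of the supported-model table; may differ by version.
    from auto_gptq.modeling._const import SUPPORTED_MODELS
    print("mistral supported:", "mistral" in SUPPORTED_MODELS)
except ImportError:
    print("SUPPORTED_MODELS not found at the assumed import path")
```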