Description
Hello LocalGPT experts, I followed the instructions for installing localGPT on Google Colab. Several days ago it worked very well; at that time the ingested document was the US Constitution PDF. Today I installed localGPT again, and when I run !python run_localGPT.py I get the error message below. I also noticed that the default PDF is now Orca_paper.pdf (that should NOT be the source of the problem).
2023-09-19 23:07:00.039792: W tensorflow/compiler/tf2tensorrt/utils/py_utils.cc:38] TF-TRT Warning: Could not find TensorRT
2023-09-19 23:07:03,169 - INFO - run_localGPT.py:221 - Running on: cuda
2023-09-19 23:07:03,169 - INFO - run_localGPT.py:222 - Display Source Documents set to: False
2023-09-19 23:07:03,169 - INFO - run_localGPT.py:223 - Use history set to: False
2023-09-19 23:07:03,346 - INFO - SentenceTransformer.py:66 - Load pretrained SentenceTransformer: hkunlp/instructor-large
load INSTRUCTOR_Transformer
max_seq_length 512
2023-09-19 23:07:06,973 - INFO - posthog.py:16 - Anonymized telemetry enabled. See https://docs.trychroma.com/telemetry for more information.
2023-09-19 23:07:07,051 - INFO - run_localGPT.py:56 - Loading Model: TheBloke/Llama-2-7b-Chat-GGUF, on: cuda
2023-09-19 23:07:07,051 - INFO - run_localGPT.py:57 - This action can take a few minutes!
2023-09-19 23:07:07,051 - INFO - load_models.py:38 - Using Llamacpp for GGUF/GGML quantized models
Traceback (most recent call last):
File "/content/localGPT/run_localGPT.py", line 258, in
main()
File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1157, incall
return self.main(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1078, in main
rv = self.invoke(ctx)
File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/usr/local/lib/python3.10/dist-packages/click/core.py", line 783, in invoke
return __callback(*args, **kwargs)
File "/content/localGPT/run_localGPT.py", line 229, in main
qa = retrieval_qa_pipline(device_type, use_history, promptTemplate_type="llama")
File "/content/localGPT/run_localGPT.py", line 144, in retrieval_qa_pipline
qa = RetrievalQA.from_chain_type(
File "/usr/local/lib/python3.10/dist-packages/langchain/chains/retrieval_qa/base.py", line 100, in from_chain_type
combine_documents_chain = load_qa_chain(
File "/usr/local/lib/python3.10/dist-packages/langchain/chains/question_answering/init.py", line 249, in load_qa_chain
return loader_mapping[chain_type](
File "/usr/local/lib/python3.10/dist-packages/langchain/chains/question_answering/init.py", line 73, in _load_stuff_chain
llm_chain = LLMChain(
File "/usr/local/lib/python3.10/dist-packages/langchain/load/serializable.py", line 74, ininit
super().init(**kwargs)
File "pydantic/main.py", line 341, in pydantic.main.BaseModel.init
pydantic.error_wrappers.ValidationError: 1 validation error for LLMChain
llm
  none is not an allowed value (type=type_error.none.not_allowed)
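For what it's worth, the ValidationError at the bottom means LLMChain was constructed with llm=None, i.e. load_model() inside retrieval_qa_pipline returned nothing. Here is a minimal sketch that reproduces the same error (assuming the langchain version from the traceback, which still validates with pydantic v1):

```python
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate

prompt = PromptTemplate(input_variables=["question"], template="{question}")

# Passing llm=None raises the identical error seen in the traceback:
#   pydantic.error_wrappers.ValidationError: 1 validation error for LLMChain
#   llm
#     none is not an allowed value (type=type_error.none.not_allowed)
chain = LLMChain(llm=None, prompt=prompt)
```

If I read load_models.py correctly, it catches exceptions while building the LlamaCpp model and returns None, so the real failure is hidden. A common cause on Colab (my assumption, not confirmed from the log) is an llama-cpp-python build that predates GGUF support and therefore cannot open TheBloke/Llama-2-7b-Chat-GGUF; checking the installed version is a quick first step:

```python
# Sanity check: confirm llama-cpp-python is importable and recent enough
# for GGUF files (the required minimum version is an assumption here).
import llama_cpp
print(llama_cpp.__version__)
```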