
After successfully setting up and debugging the local Ollama model, when trying to make requests using scrapegraphai with the Ollama API, I consistently receive a 502 Bad Gateway error and am unable to get the expected responses. Despite the Ollama service running locally and being tested successfully with curl, the issue persists when calling it via scrapegraphai. #790

44510709 started this conversation in General

Issue Description:
After successfully setting up and debugging the local Ollama model, when trying to make requests using scrapegraphai with the Ollama API, I consistently receive a 502 Bad Gateway error and am unable to get the expected responses. Despite the Ollama service running locally and being tested successfully with curl, the issue persists when calling it via scrapegraphai.

Environment & Versions:
OS: macOS 13.x
Python Version: 3.12
Libraries & Frameworks:
scrapegraphai: for web scraping and generating answers
langchain: for constructing and invoking language models
Ollama Local Models: mistral:latest, nomic-embed-text:latest
Port: 11434 (default API port for Ollama service)
Steps to Reproduce:
Local Model Setup and Start:

Successfully started the Ollama service locally using ollama serve.
Verified Ollama service is running by using curl http://localhost:11434/.
Configured scrapegraphai for Web Scraping: Set up the scrapegraphai configuration file to call the local Ollama API and scrape web data.

```python
from scrapegraphai.graphs import SmartScraperGraph

graph_config = {
    "llm": {
        "model": "ollama/mistral",
        "temperature": 0,
        "format": "json",  # Ollama requires explicit format
        "base_url": "http://localhost:11434",  # Ollama API URL
    },
    "embeddings": {
        "model": "ollama/nomic-embed-text",
        "base_url": "http://localhost:11434",  # Ollama API URL
    },
    "verbose": True,
}

smart_scraper_graph = SmartScraperGraph(
    prompt="List me all the projects with their descriptions",
    source="https://perinim.github.io/projects",
    config=graph_config,
)

result = smart_scraper_graph.run()
print(result)
```
Encountered Error:

When calling smart_scraper_graph.run(), the request consistently returns a 502 Bad Gateway error:
```
HTTP Request: POST http://localhost:11434/api/chat "HTTP/1.1 502 Bad Gateway"
```
The full debug log shows:
```
DEBUG:httpcore.connection:connect_tcp.started host='127.0.0.1' port=7890 local_address=None timeout=None socket_options=None
DEBUG:httpcore.connection:connect_tcp.complete return_value=<httpcore._backends.sync.SyncStream object at 0x1796da6c0>
DEBUG:httpcore.http11:send_request_headers.started request=<Request [b'POST']>
DEBUG:httpcore.http11:send_request_headers.complete
DEBUG:httpcore.http11:send_request_body.started request=<Request [b'POST']>
DEBUG:httpcore.http11:send_request_body.complete
DEBUG:httpcore.http11:receive_response_headers.started request=<Request [b'POST']>
DEBUG:httpcore.http11:receive_response_headers.complete return_value=(b'HTTP/1.1', 502, b'Bad Gateway', [(b'Connection', b'close'), (b'Content-Length', b'0')])
INFO:httpx:HTTP Request: POST http://localhost:11434/api/chat "HTTP/1.1 502 Bad Gateway"
```
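One detail worth noting in the log: the TCP connection is opened to 127.0.0.1 port 7890, not to Ollama's port 11434. Port 7890 is the default listen port of several local proxy tools (e.g. Clash), and httpx (used under the hood by scrapegraphai) honors the `HTTP_PROXY`/`HTTPS_PROXY` environment variables, so a system proxy could be intercepting the localhost request and returning the 502. A minimal stdlib sketch, with the proxy address and `NO_PROXY` values as assumed illustration only, shows how such variables reroute a localhost request unless the host is explicitly bypassed:

```python
import os
import urllib.request

# Hypothetical environment, mirroring a machine with a local proxy on 7890
# (the port seen in the httpcore debug log above).
os.environ["HTTP_PROXY"] = os.environ["http_proxy"] = "http://127.0.0.1:7890"
os.environ.pop("NO_PROXY", None)
os.environ.pop("no_proxy", None)

# Without NO_PROXY, the stdlib (and httpx) route every request, including
# requests to localhost, through the configured proxy.
print(urllib.request.getproxies().get("http"))         # http://127.0.0.1:7890
print(bool(urllib.request.proxy_bypass("localhost")))  # False

# Exempting local addresses lets requests reach Ollama directly again.
os.environ["NO_PROXY"] = "localhost,127.0.0.1"
print(bool(urllib.request.proxy_bypass("localhost")))  # True
```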
Debugging Information:
Ollama Service Status:

Confirmed that Ollama service is running by checking the process with ps aux | grep ollama and by using curl http://localhost:11434/ (which returns "Ollama is running").
Attempted to make a curl request to http://localhost:11434/api/chat but received 404 Not Found.
Local Network Configuration:

Verified that Ollama is listening on port 11434 by running netstat -an | grep 11434 and it shows that the port is in the LISTEN state.
Also, confirmed the connection to the local service is available via localhost.
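The 404 on that curl check is expected rather than a second bug: curl without a body issues a GET, and /api/chat only answers POST requests carrying a JSON payload. A stdlib sketch of a well-formed request, reusing the mistral model from the setup above (the request is built but deliberately not sent, since sending it needs the daemon running):

```python
import json
import urllib.request

# /api/chat rejects GET; it expects a POST with a JSON body like this one.
payload = {
    "model": "mistral",  # pulled earlier via `ollama pull mistral`
    "messages": [{"role": "user", "content": "Say hello"}],
    "stream": False,
}
request = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# urllib.request.urlopen(request) would return the model's reply when the
# daemon is reachable; it is not called here.
print(request.method, request.full_url)  # POST http://localhost:11434/api/chat
```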
Expected Behavior:
The local Ollama service should respond successfully to the scrapegraphai requests, but instead, I receive a 502 Bad Gateway response.
Potential Areas for Help:
Why am I receiving a 502 Bad Gateway despite Ollama running correctly locally?
Any configuration changes needed for scrapegraphai to work with local Ollama models?
Any known issues with the interaction between Ollama and scrapegraphai that could be causing this issue?
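If the port-7890 hop in the debug log does turn out to be a system-wide proxy, one common workaround is to exempt local addresses before launching the script. This is an assumption to verify against your setup, not a confirmed fix, and the script name is a placeholder:

```shell
# Exempt local addresses from any HTTP(S) proxy for this shell session
export NO_PROXY="localhost,127.0.0.1"
export no_proxy="localhost,127.0.0.1"

python scraper.py  # placeholder for the scrapegraphai script above
```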


Replies: 2 comments


Uploading image.png…


Strangely, when I run it on Colab there's no problem at all.

Category
General
Labels
None yet
2 participants
@44510709 @jose-cisco
