Description
Crash report
What happened?
I was testing some AI code with ollama and stumbled across a really weird crash.
The fact that it happens while an IndexError is being raised, and that a specific function definition has to be present for the crash to occur, leads me to believe that this is a CPython bug and not an ollama bug.
from ollama import chat  # Has to be here to segfault???

def colorSwitch(color):
    print(color, end="", flush=True)

stream = chat(
    model="llama3.2",  # I think it works with any but this is what I used.
    messages=[{"role": "user", "content": ""}],
    options={"seed": 0},  # Does not need this but I figured it would be helpful
    stream=True,
)

# Any iteration works. I just simplified it down to this.
part = next(iter(stream))['message']['content']
temp = part.split("</think>", 1)  # Crash Here
temp[1]
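For context, the IndexError itself is expected: str.split with a separator that never appears in the string returns a one-element list, so temp[1] is out of range. A minimal sketch of that behavior (the string content is made up for illustration, and this alone does not reproduce the segfault):

# str.split returns a single-element list when the separator is absent,
# so indexing [1] raises IndexError: list index out of range.
part = "model output without a closing tag"
temp = part.split("</think>", 1)
print(temp)       # ['model output without a closing tag']
print(len(temp))  # 1
temp[1]           # IndexError raised here

The bug report is that, in the full example above, raising this IndexError leads to a segfault instead of a normal traceback.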
CPython versions tested on:
3.13
Operating systems tested on:
Linux
Output from running 'python -VV' on the command line:
Python 3.13.2 (main, Feb 5 2025, 08:05:21) [GCC 14.2.1 20250128]