Ollama error 404 #1710
Replies: 1 comment
- Hi! 👋 We are using other channels as our official means of communication with users. We apologize for the delayed response. Thank you for your understanding. Best regards.
- I don't know how to resolve this problem. Ollama is running, but Langflow returns a 404 error:
  ValueError: Error: Ollama call failed with status code 404. Maybe your model is not found and you should pull the model with ollama pull llama2:latest.
  Error in chat websocket: control frame too long
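A 404 from the Ollama endpoint usually means the model name Langflow is configured with has not been pulled on that server, or the base URL points at the wrong place. A quick way to check is to ask the server which models it actually has. This is a minimal sketch, not Langflow's own code; it assumes Ollama is listening on its default address, http://localhost:11434, and that `llama2:latest` is the name set in the Langflow component:

```python
import requests

OLLAMA_URL = "http://localhost:11434"  # default Ollama address; change if yours differs
MODEL_NAME = "llama2:latest"           # model name configured in the Langflow component

# GET /api/tags lists the models this Ollama server has pulled locally.
response = requests.get(f"{OLLAMA_URL}/api/tags", timeout=10)
response.raise_for_status()

local_models = [entry["name"] for entry in response.json().get("models", [])]
print("Models available to Ollama:", local_models)

if MODEL_NAME not in local_models:
    print(f"'{MODEL_NAME}' is not pulled yet; run: ollama pull {MODEL_NAME}")
```

If the model is missing from that list, pulling it with `ollama pull llama2:latest` (as the error message suggests) and re-running the flow should clear the 404. If the request itself fails, double-check the base URL configured in Langflow.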