Generator cancelled - httpx.ReadTimeout #56
Hi! After I ran

python3 /Users/friedahuang/Documents/eon/llama-agentic-system/examples/scripts/hello.py localhost 5001 --disable_safety

I encountered the following error. Any ideas? For context, I'm using an Apple M2 Pro.

User> Hello
The above exception was the direct cause of the following exception:
Traceback (most recent call last):

Execute ollama run llama3.1:8b-instruct-fp16 once beforehand to warm up the model, in case the first load takes too long and trips the timeout.

Thank you!! It works now!
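If anyone prefers to script the warm-up instead of running the CLI command by hand, here is a minimal sketch. It is not part of llama-agentic-system; it assumes Ollama is serving on its default local port 11434 and relies on the documented Ollama behavior that a /api/generate request with an empty prompt only loads the model into memory. The script name and constants are illustrative.

```python
import httpx

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint (assumption)
MODEL = "llama3.1:8b-instruct-fp16"                 # model tag suggested in this thread


def warm_up_model() -> None:
    """Load the model into memory before running hello.py."""
    # Per the Ollama API docs, a generate request with an empty prompt
    # loads the model without producing any text. The first load of an
    # fp16 8B model can take a while, so allow a generous timeout here.
    response = httpx.post(
        OLLAMA_URL,
        json={"model": MODEL, "prompt": "", "stream": False},
        timeout=httpx.Timeout(300.0),
    )
    response.raise_for_status()
    print(f"{MODEL} loaded (done={response.json().get('done')})")


if __name__ == "__main__":
    warm_up_model()
```

Running this once before the hello.py example should avoid the cold-start delay that appears to cause the httpx.ReadTimeout.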