404 NOT Found! #59

Open
chuznhiwu opened this issue Sep 4, 2024 · 4 comments

Comments

@chuznhiwu

after "llama distribution start --name ollama --port 5000 --disabled ipv6"
then I get
Serving POST /inference/batch_chat_completion
Serving POST /inference/batch_completion
Serving POST /inference/chat_completion
Serving POST /inference/completion
Serving POST /safety/run_shields
Serving POST /agentic_system/memory_bank/attach
Serving POST /agentic_system/create
Serving POST /agentic_system/session/create
Serving POST /agentic_system/turn/create
Serving POST /agentic_system/delete
Serving POST /agentic_system/session/delete
Serving POST /agentic_system/memory_bank/detach
Serving POST /agentic_system/session/get
Serving POST /agentic_system/step/get
Serving POST /agentic_system/turn/get
Listening on 0.0.0.0:5000
INFO: Started server process [293531]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:5000 (Press CTRL+C to quit)
INFO: 127.0.0.1:47696 - "GET /?vscodeBrowserReqId=1725413273912 HTTP/1.1" 404 Not Found

without "--disabled ipv6" is the same output
Has anyone encountered this situation?

@dltn
Contributor

dltn commented Sep 4, 2024

The server running on 5000 is an API server and not a webserver you'd be able to visit in your browser. In a separate window, run mesop app/main.py to start the web app.
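To make this concrete: the startup log above only registers POST routes, so a browser request (a GET to /) has nothing to match and returns 404. Below is a minimal sketch to verify that the API server itself is fine, assuming it is reachable at http://localhost:5000 as in the log; the JSON payload shape is an illustrative guess, not the exact llama-stack request schema.

```python
# Probe the API server started by "llama distribution start ... --port 5000".
# Assumption: it is reachable at http://localhost:5000 (as in the log above).
import requests

BASE = "http://localhost:5000"

# What the browser does: GET / -- no such route is registered, so expect 404.
print(requests.get(BASE + "/").status_code)

# The registered routes are POST endpoints, e.g. /inference/chat_completion.
# The JSON body below is a guess for illustration; consult the API spec for
# the real schema.
resp = requests.post(
    BASE + "/inference/chat_completion",
    json={"messages": [{"role": "user", "content": "Hello"}]},
)
print(resp.status_code, resp.text[:200])
```

If the POST returns anything other than 404 (even a validation error), the API server is working and the browser 404 is expected behavior.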

@chuznhiwu
Author

> The server running on 5000 is an API server and not a webserver you'd be able to visit in your browser. In a separate window, run mesop app/main.py to start the web app.

Thank you for your reply. I reloaded llama-stack-apps and went through the process again. Below is the result (I did not enable ollama and used local: llama stack run local --name 8b-instruct --port 5000).
[screenshot]
In a separate window, I ran mesop app/main.py:
[screenshot]
[screenshot]
When I run PYTHONPATH=. python examples/scripts/vacation.py localhost 5000, I get this:
[screenshot]
[screenshot]
I have been feeling confused about this for several days now.

PS: llama stack run local-ollama --name 8b-instruct --port 5000 gives the same result.

@JoseGuilherme1904

#62

@chuznhiwu
Author

> #62

Thank you~
