Replies: 2 comments
- I have the same problem.
- In the project directory, under the ./logs/llm sub-directory, there are logs of the prompts and responses sent to and from the LLM. Can you take a look at how it responds to your task, at the step where you expect the IPython or other tabs to update? Attach or paste the logs here if you can.
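If it helps, here is a small shell sketch for pulling up the newest file in that log directory so it can be pasted into the discussion. The ./logs/llm path is taken from the reply above; the file naming and layout inside it are assumptions and may differ between OpenDevin versions.

```shell
# Print the most recent LLM prompt/response log from ./logs/llm.
# The directory name comes from the reply above; filenames vary by version.
log_dir=./logs/llm
newest=$(ls -t "$log_dir" 2>/dev/null | head -n 1)
if [ -n "$newest" ]; then
    cat "$log_dir/$newest"
else
    echo "no logs found in $log_dir"
fi
```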
- Following this documentation: https://github.com/OpenDevin/OpenDevin/blob/main/Development.md
  I configured the project to run on a local Llama model by generating config.toml with `make setup-config`.
  The port is up and running, but only the text chat area responds; I see no changes in my workspace tabs such as the code editor, browser, or Jupyter (IPython).
  Model used: ollama/codellama. I also tried moondream and llama3, but the same problem persists.
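For comparison, a config.toml along these lines is what `make setup-config` typically produces for an Ollama backend. The key names here are an assumption based on OpenDevin setups of that era and may differ between versions, so cross-check against the Development.md linked above:

```toml
# Assumed key names; verify against Development.md for your OpenDevin version
LLM_MODEL="ollama/codellama"
LLM_API_KEY="ollama"                   # Ollama ignores the key, but one must be set
LLM_BASE_URL="http://localhost:11434"  # default Ollama port
WORKSPACE_DIR="./workspace"
```

If the base URL or port is wrong, the chat can still appear to work while agent tool calls (editor, browser, Jupyter) silently fail, which would match the symptom described.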