always use ollama binary

Jeffrey Morgan
2023-07-06 14:32:48 -04:00
parent 7cf5905063
commit 39f4d8edaa
3 changed files with 27 additions and 4 deletions

@@ -6,10 +6,10 @@ This app builds upon Ollama to provide a desktop experience for running models.
 ## Developing
-In the background run the ollama server `ollama.py`:
+First, build the `ollama` binary:
 ```
-poetry -C .. run ollama serve
+make -C ..
 ```
 Then run the desktop app with `npm start`:
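
The updated instructions amount to a two-step development workflow: build the `ollama` binary at the repository root, then launch the desktop app from the `app` directory. A minimal sketch of that flow, assuming the commands are run from the `app` directory (the `-C ..` flag points `make` at the parent directory's Makefile) and that the app's npm dependencies are already installed:

```
# Build the ollama binary using the Makefile in the repository root
make -C ..

# Start the desktop app in development mode
npm start
```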