Mirror of https://github.com/dogkeeper886/ollama37.git (synced 2025-12-10 07:46:59 +00:00)
update readme with docker setup and link to import.md
docs/faq.md (20 changed lines)
@@ -1,5 +1,21 @@
 # FAQ
 
+## How can I view the logs?
+
+On macOS:
+
+```
+cat ~/.ollama/logs/server.log
+```
+
+On Linux:
+
+```
+journalctl -u ollama
+```
+
+If you're running `ollama serve` directly, the logs will be printed to the console.
+
 ## How can I expose the Ollama server?
 
 ```bash
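The ```bash block that opens at the end of this hunk is cut off by the hunk boundary; the next hunk's header shows an `OLLAMA_ORIGINS` line as context for the same section. As a rough sketch of the pattern that section documents (the bind address and port below are illustrative assumptions, not taken from this diff):

```bash
# Assumed example: bind the server to all interfaces instead of the default loopback.
OLLAMA_HOST=0.0.0.0:11434 ollama serve

# Context line from the next hunk: allow browser requests from extra origins.
OLLAMA_ORIGINS=http://192.168.1.1:*,https://example.com ollama serve
```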
@@ -14,5 +30,5 @@ OLLAMA_ORIGINS=http://192.168.1.1:*,https://example.com ollama serve
 
 ## Where are models stored?
 
-* macOS: Raw model data is stored under `~/.ollama/models`.
-* Linux: Raw model data is stored under `/usr/share/ollama/.ollama/models`
+- macOS: Raw model data is stored under `~/.ollama/models`.
+- Linux: Raw model data is stored under `/usr/share/ollama/.ollama/models`
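To check what actually lives at the paths listed above, a quick look at the store works the same way on both platforms; the commands below are a small illustrative sketch, not part of the diff:

```bash
# Linux store location from the FAQ above; on macOS use ~/.ollama/models instead.
du -sh /usr/share/ollama/.ollama/models
ls -lh /usr/share/ollama/.ollama/models
```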
@@ -23,7 +23,7 @@ git clone https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1
 cd Mistral-7B-Instruct-v0.1
 ```
 
-### Step 2: Convert and quantize (PyTorch and Safetensors)
+### Step 2: Convert and quantize (for PyTorch and Safetensors)
 
 A [Docker image](https://hub.docker.com/r/ollama/quantize) with the tooling required to convert and quantize models is available.
 
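The changed line only renames the heading; the surrounding context says the `ollama/quantize` Docker image carries the conversion and quantization tooling. A hedged sketch of how such an image is typically invoked follows; the `/model` mount point and the `-q q4_0` flag are assumptions about the image's interface, not confirmed by this diff:

```bash
# From the cloned Mistral-7B-Instruct-v0.1 directory (see Step 1 above).
# Mount point and flags are assumed, not taken from this commit.
docker run --rm -v "$(pwd)":/model ollama/quantize -q q4_0 /model
```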
@@ -80,4 +80,3 @@ To view logs of Ollama running as a startup service, run:
 ```bash
 journalctl -u ollama
 ```
-
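The `journalctl -u ollama` context above covers the basic case; for live or time-bounded views of the same startup-service logs, standard `journalctl` options apply (not part of this diff):

```bash
# Follow the Ollama service log as new lines arrive.
journalctl -u ollama -f

# Show only recent entries.
journalctl -u ollama --since "1 hour ago"
```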