# Bash Shell examples

When you review the examples on this site, it is easy to get the impression that using AI with Ollama is hard: you need an orchestrator, a vector database, complicated infrastructure, and more. That is not always the case. Ollama is designed to be easy to use and to work in any environment.

When calling `ollama`, you can pass it a file to run all the prompts in the file, one after the other. This concept is used in the two examples below.
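In its simplest form, that looks like the snippet below. This is a minimal sketch rather than one of the scripts in this directory: it assumes the llama2 model has already been pulled and that `sourcequestions` holds the prompts, and it redirects the file to `ollama run`'s standard input.

```bash
#!/usr/bin/env bash
# Minimal sketch: send the contents of sourcequestions to the model
# non-interactively. Assumes `ollama pull llama2` has already been run.
ollama run llama2 < sourcequestions
```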
## Bulk Questions

`bulkquestions.sh` is a script that runs all the questions in `sourcequestions` using the llama2 model and outputs the answers.
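The actual `bulkquestions.sh` ships with this example; the sketch below only illustrates the pattern, not the script itself. It loops over `sourcequestions`, asks the llama2 model each question separately, and prints each answer under its question.

```bash
#!/usr/bin/env bash
# Sketch of a bulk-questions run: ask the llama2 model every question
# in sourcequestions, one at a time, and print each answer.
while IFS= read -r question; do
  [ -z "$question" ] && continue   # skip blank lines
  echo "== $question"
  ollama run llama2 "$question"
done < sourcequestions
```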
## Compare Models

`comparemodels.sh` is a script that runs all the questions in `sourcequestions` using any four models of your choice that you have already pulled from the Ollama library or have created locally.

The two examples here show how to list the models and query them from a simple bash script.
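Again as a rough sketch rather than the actual `comparemodels.sh`: the snippet below uses `ollama list` to show which models are available locally, then runs every question against each model named on the command line (the model names in the usage comment are only placeholders).

```bash
#!/usr/bin/env bash
# Sketch of a model comparison: list the local models, then run the same
# questions against each model passed as an argument.
# Usage (placeholder names): ./compare.sh llama2 mistral orca-mini codellama
ollama list   # models already pulled from the library or created locally

for model in "$@"; do
  echo "=== Model: $model ==="
  while IFS= read -r question; do
    [ -z "$question" ] && continue   # skip blank lines
    echo "-- $question"
    ollama run "$model" "$question"
  done < sourcequestions
done
```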