mirror of
https://github.com/dogkeeper886/ollama37.git
synced 2025-12-11 16:26:59 +00:00
update the readme as per bruce
Signed-off-by: Matt Williams <m@technovangelist.com>
The **chat** endpoint is one of two ways to generate text from an LLM with Ollama. At a high level, you provide the endpoint with an array of message objects, each specifying a role and content. Then, with each prompt and each output, you append more messages, building up the conversation history.
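As a rough sketch of what that history looks like (the `Message` type and the sample turns here are illustrative, not taken from this repo's code), the array grows like this:

```typescript
// Sketch of the message history the chat endpoint consumes.
// Each entry has a role ("system", "user", or "assistant") and content.
type Message = { role: "system" | "user" | "assistant"; content: string };

const history: Message[] = [
  { role: "system", content: "You are a helpful assistant." },
];

// With each prompt and each model output, append more messages:
function addTurn(prompt: string, reply: string): void {
  history.push({ role: "user", content: prompt });
  history.push({ role: "assistant", content: reply });
}

addTurn("Why is the sky blue?", "Because of Rayleigh scattering.");
// history now holds three messages and would all be sent on the next request.
```

Sending the whole array each time is what gives the model its memory of the conversation.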
## Run the Example
There are a few ways to run this, just like any TypeScript code:
1. Compile with `tsc` and then run it with `node client.js`.
2. Install `tsx` and run it with `tsx client.ts`.
3. Install `bun` and run it with `bun client.ts`.
## Review the Code
You can see in the **chat** function that calling the endpoint is simply done with:
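For reference, a hedged sketch of what such a call might look like follows; the model name and the assumption of a local Ollama server at the default `http://localhost:11434` are illustrative, and the actual code lives in `client.ts`:

```typescript
type Message = { role: string; content: string };

// Build the JSON request body for the chat endpoint.
// stream: false asks for a single complete response instead of chunks.
function buildChatRequest(model: string, messages: Message[]): string {
  return JSON.stringify({ model, messages, stream: false });
}

// Hypothetical chat helper: POST the history, return the assistant's reply.
async function chat(messages: Message[]): Promise<Message> {
  const response = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildChatRequest("mistral", messages),
  });
  if (!response.ok) throw new Error(`chat failed: ${response.status}`);
  const data = await response.json();
  return data.message; // the assistant's reply, ready to append to the history
}
```

Appending the returned message to the history before the next call is what builds up the conversation described above.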