treat stop as stop sequences, not exact tokens (#442)

The `stop` option to the generate API is a list of sequences that should cause generation to stop. Although these are commonly called "stop tokens", they do not necessarily correspond to LLM tokens (per the LLM's tokenizer). For example, if the caller sends a generate request with `"stop":["\n"]`, then generation should stop on any token containing `\n` (and trim `\n` from the output), not just if the token exactly matches `\n`. If `stop` were interpreted strictly as LLM tokens, then it would require callers of the generate API to know the LLM's tokenizer and enumerate many tokens in the `stop` list.
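The matching described above can be sketched as a small Go helper. This is a minimal illustration of the sequence-based behavior, not the actual ollama code; `checkStop` is a hypothetical name.

```go
package main

import (
	"fmt"
	"strings"
)

// checkStop is a hypothetical helper (not the actual ollama implementation)
// that scans generated text for any of the caller's stop sequences. If one
// is found, it returns the text truncated before the sequence and true, so
// the caller can halt generation with the stop sequence trimmed from output.
func checkStop(text string, stops []string) (string, bool) {
	for _, s := range stops {
		if i := strings.Index(text, s); i >= 0 {
			return text[:i], true
		}
	}
	return text, false
}

func main() {
	// A decoded token may contain "\n" without being exactly the "\n"
	// token, so we match on containment rather than token equality.
	out, stopped := checkStop("Hello world\nmore text", []string{"\n"})
	fmt.Printf("%q %v\n", out, stopped)
}
```

Because matching happens on the decoded text rather than on token IDs, callers do not need to know the model's tokenizer or enumerate every token that happens to contain the stop sequence.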

Fixes https://github.com/jmorganca/ollama/issues/295.
Author: Quinn Slack
Date: 2023-08-30 10:53:42 -05:00 (committed by GitHub)
Parent: 982c535428
Commit: f4432e1dba
4 changed files with 109 additions and 17 deletions


@@ -430,7 +430,7 @@ func CreateModel(ctx context.Context, name string, path string, fn func(resp api
 			layer.MediaType = mediaType
 			layers = append(layers, layer)
 		default:
-			// runtime parameters, build a list of args for each parameter to allow multiple values to be specified (ex: multiple stop tokens)
+			// runtime parameters, build a list of args for each parameter to allow multiple values to be specified (ex: multiple stop sequences)
 			params[c.Name] = append(params[c.Name], c.Args)
 		}
 	}