llama-server: chat_format param #13579
Unanswered · taha-yassine asked this question in Q&A · 0 replies
Looking at the code in https://github.com/ggml-org/llama.cpp/blob/master/tools/server/server.cpp, it seems there's a `chat_format` param that defaults to `COMMON_CHAT_FORMAT_CONTENT_ONLY` when not set. I couldn't find any reference to it in the docs, so I was wondering how it's supposed to be used. @ochafik I think you're the one who introduced it, so maybe you know the answer?
Thanks!
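For context, llama-server exposes an OpenAI-compatible `/v1/chat/completions` endpoint, and a typical request payload looks like the sketch below. Whether `chat_format` can be overridden per request (or is purely an internal/server-side setting) is exactly the open question here; the commented-out `chat_format` key is hypothetical and not documented behavior.

```python
import json

# Minimal OpenAI-compatible chat request payload, as accepted by
# llama-server's /v1/chat/completions endpoint.
payload = {
    "model": "default",
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
    # "chat_format": "content_only",  # hypothetical per-request override;
    #                                 # not documented, which is the question
}

print(json.dumps(payload, indent=2))
```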