How do we handle prompt formatting? #557
Unanswered

**NeedsLoomis** asked this question in Q&A

There seem to be a couple of ways to prompt AIs these days, depending on tuning and whatnot, for instance:

or

Do we simply drop them into our prompts? Does that interfere with the `AuthorRole` setting?

Replies: 1 comment

-

Hey, I'm currently working on something like this here: #787. There's a set of models that llama.cpp supports internally, and that PR looks to leverage those templates explicitly. However, it isn't exhaustive of all models, so you may still need to implement your own `HistoryTransform` depending on how the model expects the chat prompt.
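To make the "implement your own transform" suggestion concrete: a history transform ultimately just renders (role, content) pairs into the string the model was tuned on. A minimal sketch (in Python rather than C#, and using ChatML-style tokens purely as an assumed example format, not something this thread confirms your model expects):

```python
# Illustrative only: the ChatML markers below are one common convention;
# substitute whatever template your particular model was tuned on.

def format_chatml(history):
    """Render (role, content) pairs into a ChatML-style prompt string."""
    parts = []
    for role, content in history:
        parts.append(f"<|im_start|>{role}\n{content}<|im_end|>")
    # Leave an open assistant turn so the model continues from here.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

history = [
    ("system", "You are a helpful assistant."),
    ("user", "Hello!"),
]
print(format_chatml(history))
```

The same shape maps onto LLamaSharp's history-transform hook: your role values (the `AuthorRole` equivalents) become the template's role markers rather than being injected as literal text into the prompt.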