Implement conversation API endpoint #188
base: master
Conversation
server/api_conversation.go
Outdated
	return
}

bot := p.GetBotMentioned(post.Message)
I am keeping the message requirements intact, such as needing to mention a specific bot.
Maybe we should take the bot name as a parameter instead of requiring a mention in the post?
Yeah, given #188 (comment), I think it makes sense to make this a bit more structured. It may also be cleaner to decouple the current request/prompt from any past context.
@@ -26,6 +26,8 @@ func (p *Plugin) ServeHTTP(c *plugin.Context, w http.ResponseWriter, r *http.Request) {
	router.GET("/ai_threads", p.handleGetAIThreads)
	router.GET("/ai_bots", p.handleGetAIBots)

	router.POST("/conversation", p.handlePostConversation)
I'm not overly worried about naming or scoping. Just started with this to get something working.
// Flushing lets us stream partial results without requiring the client to wait for the full response.
c.Writer.Flush()
For streaming to work correctly we'll need mattermost/mattermost@51f4271.
server/api_conversation.go
Outdated
	return
}

var conv []*model.Post
I thought a simple slice would be clean enough, but I'm happy to consider a more advanced approach, such as passing some sort of custom object that wraps the data we need.
Maybe we want to just wrap that in a struct so we can introduce other parameters if need be without a breaking change.
Yep, that's probably a healthier choice in the long run. Will update 👍
@crspeller Not urgent. I'd love to get some feedback. This is the minimum addition I could think of that would meet the basic requirements. Happy to rework any of it.
I suspect you will probably want a way to manipulate the system message using this API. Instructions to the bot are more powerful from the system message.
func (p *Plugin) handlePostConversation(c *gin.Context) {
	userID := c.GetHeader("Mattermost-User-Id")

	// We only allow bots to use this API handler for the time being.
If you are using the inter-plugin API to use this API we could restrict it to other plugins for now with the header: https://github.com/mattermost/mattermost-plugin-github/blob/5aa2450cc65254054872fc3ca6917b1c9354d543/server/plugin/api.go#L265
Thanks, I'll look into it, although that would probably mean an additional roundtrip in my case (i.e., going through the plugin), since the bot is just a client running outside of Mattermost.
@crspeller Slowly getting back to this. Are you referring to the prompt used?
@streamer45 Yes, having a way to specify the system prompt.
@crspeller Updated. Please give it another pass whenever you get a chance.
prompt.AppendConversation(p.ThreadToBotConversation(bot, threadData.Posts))

// Overriding post role if requested.
if reqData.UseSystemRole {
Sorry, I think I was unclear about what this means. The system prompt carries special instructions to the LLM that are weighted more highly than user instructions, and usually there is only one system prompt. You can look at OpenAI's explanation of what it means here: https://platform.openai.com/docs/guides/text-generation/chat-completions-api
What I really want is to be able to call this API from other plugins and customize the prompt provided on line 122, so that we can give the LLM custom instructions about how the rest of the conversation should be handled, like telling it that it's in a call and talking live, who the participants are, etc.
@crspeller Are you thinking of going as far as letting plugins pass/override a whole prompt template, or more like letting them choose which prompt to use and possibly augment existing ones?
I am imagining them passing the whole prompt template. That way they can do whatever they want. Ideally they would be able to utilize the default personalities and whatnot as well.
Description
This PR implements a bots-only
/conversation
API endpoint that allows other plugins to programmatically start and continue conversations with Copilot without resorting to hacks such as posting in DMs between bots. The general idea is to leverage the AI plugin as much as possible while dropping the requirement that a real thread exist in a channel.
The proposed handler expects a list of posts (i.e., a virtual thread) that will be fed as context. The last post in the list will be the current request.
Example
curl -v -H 'Authorization: Bearer BOT_AUTH_TOKEN' http://localhost:8065/plugins/mattermost-ai/conversation --data @payload.json
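A hypothetical payload.json for the call above. The field names are assumptions for illustration, not the final schema; per the handler's contract, the last post in the list is treated as the current request and must mention the bot:

```json
{
  "posts": [
    {"message": "What did we decide about the release date?"},
    {"message": "We agreed to ship on Friday."},
    {"message": "@ai can you summarize this thread?"}
  ]
}
```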