-
Although my setup is a bit different, I have the same question about consistent IDs. I too can pick up the user message ID by configuring:

```js
useChat({
  sendExtraMessageFields: true, // necessary to send the message id to the server
});
```

However, I'm not able to pick up the assistant ID on the server, as it looks like it gets generated on the client. My setup is Node + Express:

```js
const result = await streamText({
  model,
  messages,
  onFinish: async (event) => {
    // I can't pick up the assistant ID here because it looks like it gets
    // generated on the client and never sent to the server.
    await onFinish(event.text);
  },
});

result.pipeDataStreamToResponse(res, {
  sendUsage: false,
});
```
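In case it's useful, here is a minimal sketch of how this might work on the server with a recent AI SDK 4.x release, assuming your version has the `createIdGenerator` helper and the `experimental_generateMessageId` option on `streamText` (both appear in the message-persistence docs). The idea is that the server generates the assistant message ID up front, the data stream forwards it to `useChat`, and `onFinish` can read the same ID from `response.messages`. The `/api/chat` route, the model choice, and `saveAssistantMessage` are placeholders for your own setup, not part of your code:

```js
import express from 'express';
import { streamText, createIdGenerator } from 'ai';
import { openai } from '@ai-sdk/openai';

const app = express();
app.use(express.json());

app.post('/api/chat', async (req, res) => {
  const { messages } = req.body;

  const result = streamText({
    model: openai('gpt-4o'),
    messages,
    // Generate the assistant message ID on the server so the same ID is
    // available both in the data stream sent to useChat and in onFinish.
    experimental_generateMessageId: createIdGenerator({ prefix: 'msg', size: 16 }),
    onFinish: async ({ response, text }) => {
      // response.messages carries the server-generated IDs.
      const assistant = response.messages.find((m) => m.role === 'assistant');
      await saveAssistantMessage(assistant?.id, text); // saveAssistantMessage is a hypothetical placeholder
    },
  });

  result.pipeDataStreamToResponse(res, { sendUsage: false });
});

app.listen(3000);
```

If your SDK version predates `experimental_generateMessageId`, upgrading may be the simplest path; I haven't verified how far back the option goes.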
-
I’m using the useChat hook from the AI SDK in a Next.js front end with a FastAPI back end, and I generate message IDs with uuidv4. I’m using streamResponse to handle responses.
Currently, for messages with role: "user", I can successfully use the IDs generated on the client side and pass them to FastAPI, where they are processed correctly. However, I’m unsure how to handle the message IDs for role: "assistant" responses generated by the server.
Specifically, I have the following questions:
1. Is it possible to generate the ID on the client side and pass it to FastAPI for consistent use?
2. How would I handle ID generation on the server side and have it work seamlessly with useChat?
I’m looking for guidance on which approach is generally recommended and how to implement it effectively with streamResponse.
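Not an authoritative answer, but here is a minimal client-side sketch of the first option (generating IDs on the client and forwarding them to FastAPI), assuming a recent AI SDK version. The `@ai-sdk/react` import path, the `/api/chat` route, and the `msgc` prefix are assumptions rather than anything from your setup:

```jsx
'use client';

import { useChat } from '@ai-sdk/react';
import { createIdGenerator } from 'ai';

export default function Chat() {
  const { messages, input, handleInputChange, handleSubmit } = useChat({
    api: '/api/chat',             // rewritten/proxied to the FastAPI backend (assumed route)
    sendExtraMessageFields: true, // send the client-generated id (and createdAt) to the server
    generateId: createIdGenerator({ prefix: 'msgc', size: 16 }), // or plug in your existing uuidv4
  });

  return (
    <form onSubmit={handleSubmit}>
      {messages.map((m) => (
        <div key={m.id}>
          {m.role}: {m.content}
        </div>
      ))}
      <input value={input} onChange={handleInputChange} />
    </form>
  );
}
```

For the second option (server-generated assistant IDs), my understanding is that a data-stream response can announce the assistant message ID at the start of the stream (a start part along the lines of `f:{"messageId": "..."}` in the data stream protocol), and `useChat` then adopts that ID instead of generating its own. Since you're emitting the protocol by hand from FastAPI, that is probably the piece to add, but please double-check the Stream Protocol docs for the exact format in your SDK version.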