
Understand why LLM knowledge seems to be used in chats for questions that should be unanswerable #20

Open
helenCNode opened this issue Nov 20, 2024 · 0 comments


@helenCNode
Collaborator

If you ask "How do you boil an egg?" as the first question, the response is "I was not able to answer that question", as expected.

If you first ask a question the RAG system can answer and then ask "How do you boil an egg?", you will sometimes (but not every time) get a full response that is clearly drawn from the LLM's own knowledge rather than from any retrieved contexts. Nesta sources are cited that are obviously not relevant to the question.
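A possible explanation (a hypothesis, not confirmed from this repository's code): when chat history is present, retrieval may run on a history-condensed query, so loosely related contexts carried over from the previous, answerable turn can clear the "can we answer this?" guard, after which the LLM falls back on its own knowledge while still citing the retrieved (irrelevant) sources. A minimal sketch of that failure mode, with all names and thresholds illustrative:

```python
def can_answer(context_scores, threshold=0.75):
    """Hypothetical relevance guard: answer only if at least one
    retrieved context scores above the threshold. The threshold and
    scores here are made up for illustration."""
    return any(score >= threshold for score in context_scores)


# Turn 1: "How do you boil an egg?" alone retrieves nothing relevant,
# so the guard fails and the system refuses to answer.
first_turn_scores = [0.31, 0.28]
print(can_answer(first_turn_scores))   # False -> "I was not able to answer"

# Later turn: if the query is condensed with chat history, contexts from
# the previous answerable question may be retrieved alongside the egg
# question, letting the guard pass even though none of the contexts
# actually address boiling an egg.
later_turn_scores = [0.82, 0.30]
print(can_answer(later_turn_scores))   # True -> LLM answers from its own
                                       # knowledge, citing irrelevant sources
```

If this hypothesis holds, a fix could be to score retrieved contexts against the standalone user question (not the condensed query) before deciding whether the question is answerable.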
