Make the llm prompt stricter (#505)
This is to avoid complex SQL queries.
aldrinjenson authored Sep 15, 2023
1 parent dcc83e1 commit 36a247c
Showing 1 changed file with 3 additions and 5 deletions.
server/src/handlers/http/llm.rs (3 additions, 5 deletions)
@@ -72,10 +72,8 @@ fn build_prompt(stream: &str, prompt: &str, schema_json: &str) -> String {
     format!(
         r#"I have a table called {}.
 It has the columns:\n{}
-Based on this, generate valid SQL for the query: "{}"
-Generate only SQL as output. Also add comments in SQL syntax to explain your actions.
-Don't output anything else.
-If it is not possible to generate valid SQL, output an SQL comment saying so."#,
+Based on this schema, generate valid SQL for the query: "{}"
+Generate only simple SQL as output. Also add comments in SQL syntax to explain your actions. Don't output anything else. If it is not possible to generate valid SQL, output an SQL comment saying so."#,
         stream, schema_json, prompt
     )
 }
@@ -84,7 +82,7 @@ fn build_request_body(ai_prompt: String) -> impl serde::Serialize {
     json!({
         "model": "gpt-3.5-turbo",
         "messages": [{ "role": "user", "content": ai_prompt}],
-        "temperature": 0.6,
+        "temperature": 0.7,
     })
 }

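For reference, the post-change `build_prompt` can be reproduced as a standalone sketch. The function body is taken from the diff above; the example stream name and schema JSON passed in `main` are made up for illustration only.

```rust
// Sketch of build_prompt after this commit (from server/src/handlers/http/llm.rs).
// Note: the raw string keeps the literal `\n` that appears in the source prompt.
fn build_prompt(stream: &str, prompt: &str, schema_json: &str) -> String {
    format!(
        r#"I have a table called {}.
It has the columns:\n{}
Based on this schema, generate valid SQL for the query: "{}"
Generate only simple SQL as output. Also add comments in SQL syntax to explain your actions. Don't output anything else. If it is not possible to generate valid SQL, output an SQL comment saying so."#,
        stream, schema_json, prompt
    )
}

fn main() {
    // Hypothetical inputs, just to show the assembled prompt.
    let p = build_prompt(
        "app_logs",
        "count errors per level",
        r#"[{"name": "level", "type": "utf8"}]"#,
    );
    // The stricter wording now asks for "simple SQL" only.
    assert!(p.contains("Generate only simple SQL"));
    println!("{}", p);
}
```

The change is purely textual: "Based on this" becomes "Based on this schema", and "Generate only SQL" becomes "Generate only simple SQL", nudging the model away from overly complex queries.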
