OPEA Triage Tool for ChatQnA #1185
base: main
Conversation
Force-pushed from 4f82faa to c90eb1b
a sample report zip file
[ToDo]: determine whether we need a debug RESTful API request here or should use telemetry
Remove any commented-out code.
Signed-off-by: Tsai, Louie <[email protected]>
Co-authored-by: alexsin368 <[email protected]>
Force-pushed from 7152e1d to da12881
@louie-tsai
Description
To help customers debug OPEA issues, we created a Triage Tool that runs tests and gathers the information needed for debugging.
It mainly targets Docker Compose scenarios.
The first draft covers ChatQnA.
How to Use:
Run from the root folder of OPEA Examples.
Xeon:
python ./Triage.py ChatQnA/tests/ChatQnA_Xeon.json
Gaudi:
python ./Triage.py ChatQnA/tests/ChatQnA_Gaudi.json
Only the JSON file needs to change for a different architecture such as Gaudi.
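To illustrate how a JSON-driven triage run might work, here is a minimal, hedged sketch. The field names (`services`, `name`, `endpoint`) are assumptions for illustration only; the real schema is defined by the `ChatQnA_*.json` files in this PR.

```python
import json
import urllib.request


def load_plan(config_path):
    """Load a triage config and return a list of (name, endpoint) checks.

    NOTE: the 'services'/'name'/'endpoint' keys are hypothetical;
    the actual schema lives in the tool's JSON test files.
    """
    with open(config_path) as f:
        cfg = json.load(f)
    return [(svc["name"], svc["endpoint"]) for svc in cfg.get("services", [])]


def probe(endpoint, timeout=5):
    """Return True if the endpoint answers an HTTP request without a 5xx error."""
    try:
        with urllib.request.urlopen(endpoint, timeout=timeout) as resp:
            return resp.status < 500
    except OSError:
        # Connection refused, DNS failure, timeout, etc.
        return False
```

Swapping the JSON file swaps the set of endpoints probed, which matches the Xeon-versus-Gaudi usage above.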
The Triage Tool runs some simple tests, including:
The information below is gathered after the tests above complete.
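One of the pieces of information gathered is the Docker container logs. A hedged sketch of how that collection step might look (the container names and the injectable `runner` hook are assumptions added for testability, not the tool's actual code):

```python
import subprocess


def collect_logs(container_names, tail=200, runner=None):
    """Gather the last `tail` log lines from each Docker container.

    `runner` is injectable for testing; by default it shells out to
    `docker logs`. Container names are placeholders -- the real tool
    would discover them from the Docker Compose deployment.
    """
    if runner is None:
        def runner(cmd):
            result = subprocess.run(cmd, capture_output=True, text=True)
            return result.stdout + result.stderr
    logs = {}
    for name in container_names:
        logs[name] = runner(["docker", "logs", "--tail", str(tail), name])
    return logs
```

Collecting a bounded tail per container keeps the report size manageable while still capturing recent errors.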
Report:
We plan to generate an HTML report.
Right now we have:
console output
Below is a screenshot of the console output.
Xeon
Gaudi
HTML report with all Docker logs embedded and the profiling log for vLLM
Xeon
Gaudi
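Embedding raw container logs into an HTML page requires escaping them so log text cannot break the markup. A minimal sketch of that rendering step, assuming collapsible per-container sections (the layout is an assumption, not the tool's actual template):

```python
import html


def render_report(title, logs):
    """Render a minimal HTML report with each container's logs in a
    collapsible <details> section. `logs` maps container name -> log text.
    html.escape() prevents log content from injecting markup.
    """
    sections = []
    for name, text in logs.items():
        sections.append(
            f"<details><summary>{html.escape(name)}</summary>"
            f"<pre>{html.escape(text)}</pre></details>"
        )
    return (
        f"<html><head><title>{html.escape(title)}</title></head>"
        f"<body><h1>{html.escape(title)}</h1>{''.join(sections)}</body></html>"
    )
```

Using `<details>` keeps a report with many containers navigable, since each log block starts collapsed.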
Other info:
ToDo:
Issues
n/a
Type of change
List the type of change as below; delete options that are not relevant.
Dependencies
NA
Tests
Manual Testing
vLLM profiling result from the Triage Tool