
feat(llm): Determine the best LLM deployment config automatically #2396

Open
gaocegege opened this issue Jul 25, 2024 · 9 comments
Labels: area/llm (LLMs related content), help wanted (Extra attention is needed), kind/feature, lifecycle/stale

Comments
@gaocegege (Member)

What would you like to be added?

Inspired by the research paper "Vidur: A Large-Scale Simulation Framework For LLM Inference":

Optimizing the deployment of large language models (LLMs) is expensive today, since it requires experimentally running an application workload against an LLM implementation while exploring the large configuration space formed by system knobs such as parallelization strategies, batching techniques, and scheduling policies.

We present Vidur-Search, a configuration search tool that helps optimize LLM deployment. Vidur-Search uses Vidur to automatically identify the most cost-effective deployment configuration that meets application performance constraints. For example, Vidur-Search finds the best deployment configuration for LLaMA2-70B in one hour on a CPU machine, in contrast to a deployment-based exploration which would require 42K GPU hours, costing 218K dollars.
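To make the search problem concrete, here is a minimal illustrative sketch (not Vidur's actual simulator or algorithm): a brute-force sweep over a tiny deployment-config space that picks the cheapest configuration whose simulated latency meets an SLO. The cost and latency formulas, knob names, and all numbers below are made up for illustration.

```python
# Illustrative sketch only: the analytical cost/latency model here is invented,
# not Vidur's. Real systems would replace simulate() with a workload simulator.
from itertools import product

def simulate(tp, batch, replicas):
    """Toy model: latency falls with tensor parallelism and replicas,
    rises with batch size; dollar cost grows with total GPU count."""
    latency_ms = 1000.0 * batch / (tp * replicas)
    cost_per_hour = 2.5 * tp * replicas  # hypothetical $/GPU-hour * GPU count
    return latency_ms, cost_per_hour

def search(slo_ms=500.0):
    """Return (cost, config) for the cheapest config meeting the latency SLO."""
    best = None
    for tp, batch, replicas in product([1, 2, 4, 8], [1, 8, 32], [1, 2, 4]):
        latency, cost = simulate(tp, batch, replicas)
        if latency <= slo_ms and (best is None or cost < best[0]):
            best = (cost, {"tensor_parallel": tp,
                           "batch": batch,
                           "replicas": replicas})
    return best

print(search())
```

Vidur-Search's contribution is replacing the expensive "deploy and measure" step with a fast, accurate simulation, so a sweep like this becomes feasible on a CPU machine.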

Why is this needed?

Not sure if it is in the scope of Katib, but I'm glad to raise an issue here.

Love this feature?

Give it a 👍. We prioritize the features with the most 👍.

@Electronic-Waste (Member) commented Aug 3, 2024

I guess it may belong to the scope of KServe, since Katib focuses on hyperparameter tuning of models :)

@gaocegege (Member, Author)

It's more like a tuning job: you can consider tuning the deployment configs (e.g., the distributed strategy).
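To sketch what "tuning the deployment configs" could look like in Katib's terms, a hypothetical Experiment might expose deployment knobs as search parameters. The field structure below follows Katib's v1beta1 Experiment API, but the parameter names, the metric name, and the omitted trial workload are all assumptions for illustration.

```yaml
# Hypothetical sketch: deployment knobs encoded as a Katib search space.
apiVersion: kubeflow.org/v1beta1
kind: Experiment
metadata:
  name: llm-deployment-tuning
spec:
  objective:
    type: minimize
    objectiveMetricName: cost-per-request   # hypothetical metric the trial reports
  algorithm:
    algorithmName: bayesianoptimization
  parameters:
    - name: tensor-parallel-size
      parameterType: discrete
      feasibleSpace:
        list: ["1", "2", "4", "8"]
    - name: max-batch-size
      parameterType: int
      feasibleSpace:
        min: "1"
        max: "64"
  # trialTemplate (a benchmark job that serves the model with these knobs
  # and reports cost/latency metrics) is omitted in this sketch.
```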

@andreyvelich (Member)

Thank you for creating this @gaocegege!

Yes, I think optimization of LLM deployment makes sense, since Katib is able to perform any optimization task (not necessarily ML) and orchestrate any resources as Trials.

It would be nice to get someone from the Kubeflow community who can explore the Vidur aspects and see how Katib can be useful.

/help
/area llm
/remove-label lifecycle/needs-triage

Copy link

@andreyvelich:
This request has been marked as needing help from a contributor.

Please ensure the request meets the requirements listed here.

If this request no longer meets these requirements, the label can be removed
by commenting with the /remove-help command.


Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository.

google-oss-prow bot added the area/llm (LLMs related content) label Aug 5, 2024

@andreyvelich: The label(s) /remove-label lifecycle/needs-triage cannot be applied. These labels are supported: tide/merge-method-merge, tide/merge-method-rebase, tide/merge-method-squash. Is this label configured under labels -> additional_labels or labels -> restricted_labels in plugin.yaml?



google-oss-prow bot added the help wanted (Extra attention is needed) label Aug 5, 2024
@gjyotin305 commented Sep 12, 2024

@gaocegege @andreyvelich I would love to look into this. Can I work on it?

/assign

@andreyvelich (Member)

Yes, that would be amazing @gjyotin305!
If you want, feel free to propose this topic in the AutoML and Training WG call when you explore it.

/assign @gjyotin305

@gjyotin305

Sure


This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
