Support turbomind bf16 #803

Merged: 23 commits into InternLM:main on Dec 15, 2023

Conversation

grimoire (Collaborator) commented Dec 5, 2023:

Thanks for your contribution; we appreciate it a lot. The following instructions will make your pull request healthier and easier to review. If you do not understand some items, don't worry: just submit the pull request and ask the maintainers for help.

Motivation

Please describe the motivation for this PR and the goal you want to achieve with it.

Modification

Please briefly describe the modifications made in this PR.

BC-breaking (Optional)

Does the modification introduce changes that break backward compatibility with downstream repositories?
If so, please describe how it breaks compatibility and how downstream projects should modify their code to stay compatible with this PR.

Use cases (Optional)

If this PR introduces a new feature, please list some use cases here and update the documentation.

Checklist

  1. Pre-commit or other linting tools have been used to fix potential lint issues.
  2. The modification is covered by complete unit tests; if not, please add more unit tests to ensure correctness.
  3. If the modification depends on a newer version of downstream projects, this PR should be tested with all supported versions of those projects.
  4. The documentation has been updated accordingly, e.g. docstrings or example tutorials.

lvhan028 requested a review from RunningLeon December 10, 2023 13:28
lvhan028 (Collaborator) commented:

@RunningLeon please test this PR's inference accuracy on the following models (a minimal smoke-test sketch follows the list):

  • llama2-chat(huggingface)
  • internlm-chat-20b
  • internlm-chat-20b-4bit
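
A minimal smoke test along these lines, assuming lmdeploy's Python `pipeline` API (which may postdate this PR); the model id is a placeholder, and this only checks that generation is coherent, not numerical accuracy:

```python
# Illustrative smoke test only; `pipeline` is lmdeploy's public Python API,
# but whether it was available at the time of this PR is an assumption.
from lmdeploy import pipeline

# Placeholder model id; substitute llama2-chat / internlm-chat-20b(-4bit).
pipe = pipeline("internlm/internlm-chat-20b")

# A coherent reply suggests the bf16 path is at least not garbling outputs.
responses = pipe(["Hello! Briefly introduce yourself."])
print(responses[0].text)
```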

lvhan028 requested a review from lzhangzz December 10, 2023 13:32
lvhan028 (Collaborator) commented Dec 10, 2023:

@AllentDan please build a whl package and verify the inference process on a V100.

lvhan028 requested a review from AllentDan December 10, 2023 13:33
lvhan028 (Collaborator) commented:

@irexyc please check the converter part.

AllentDan (Collaborator) replied:

> @AllentDan please build a whl package and verify the inference process on a V100.

Tried it on a V100; it works.

irexyc (Collaborator) commented Dec 12, 2023:

> @irexyc please check the converter part.

Tested with qwen.

Will there be a w4bf16 case?

lvhan028 (Collaborator) replied:

> @AllentDan please build a whl package and verify the inference process on a V100.
>
> Tried it on a V100; it works.

Was the whl package built with the BF16 compile switch enabled?

AllentDan (Collaborator) replied:

> Was the whl package built with the BF16 compile switch enabled?

It should be enabled by default with CUDA 11+.
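
For context, native bf16 tensor instructions require NVIDIA Ampere (compute capability 8.0) or newer, so a V100 (7.0) exercises the fp16 path regardless of the compile switch. A quick illustrative check with PyTorch (this is not part of lmdeploy's build system):

```python
# Illustrative check: native bf16 needs compute capability >= 8.0 (Ampere).
# V100 reports 7.0, so it runs the fp16 path.
import torch

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability()
    print(f"compute capability: {major}.{minor}")
    print("native bf16:", (major, minor) >= (8, 0))
```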

lvhan028 previously approved these changes Dec 13, 2023

RunningLeon (Collaborator) left a comment:


LGTM

lvhan028 self-requested a review December 13, 2023 10:45
lvhan028 dismissed their stale review December 13, 2023 11:49:

Comments are not resolved yet.

lvhan028 (Collaborator) commented:

I tried baichuan2: the weight_type in config.ini is fp16, but the "torch_dtype" in config.json is "bfloat16".
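
One plausible source of such a mismatch, sketched below as a hypothetical helper (this is not lmdeploy's actual converter code): the converter would need to read "torch_dtype" from the HF config.json and map it onto turbomind's weight_type, rather than defaulting to fp16.

```python
# Hypothetical dtype selection for the converter; the file names follow the
# thread (HF config.json -> turbomind config.ini), the logic is illustrative.
import json

def infer_weight_type(hf_config_path: str) -> str:
    with open(hf_config_path) as f:
        torch_dtype = json.load(f).get("torch_dtype", "float16")
    # Map the HF dtype string onto an assumed weight_type vocabulary.
    return {"bfloat16": "bf16", "float16": "fp16"}.get(torch_dtype, "fp16")

# For baichuan2 (torch_dtype == "bfloat16") this yields "bf16",
# whereas the behavior described above recorded "fp16".
print(infer_weight_type("config.json"))
```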

lvhan028 merged commit 3295eac into InternLM:main on Dec 15, 2023 (7 of 9 checks passed).