
Configured the internlm/internlm2_5-7b-chat model in env, running it reports an error #274

Open
cleven1 opened this issue Dec 16, 2024 · 1 comment
cleven1 commented Dec 16, 2024

Configured the model SILICON_MODEL=internlm/internlm2_5-7b-chat in the environment; running it produces the following error:

raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like internlm/internlm2_5-7b-chat is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'

I don't quite understand why it is going to Hugging Face to download anything. SILICON_API_KEY and SILICON_MODEL are already configured, so where is the problem?
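For context, the traceback indicates the process is taking the local-inference path and resolving `internlm/internlm2_5-7b-chat` through `transformers`, which is what reaches out to huggingface.co; the SILICON_* variables only matter on the API path. A minimal sketch of the two paths (the SiliconFlow endpoint URL and request layout are assumptions about its OpenAI-compatible API, not this project's actual code):

```python
import os

# Local-inference path: transformers treats the name as a Hugging Face repo id,
# so it needs huggingface.co, a local cache, or a directory with config.json.
# SILICON_API_KEY is never consulted here -- this is the failing path in the log.
from transformers import AutoModelForCausalLM
model = AutoModelForCausalLM.from_pretrained("internlm/internlm2_5-7b-chat")

# API path: the model name is just a string sent to SiliconFlow's endpoint,
# so nothing is downloaded from huggingface.co. (Endpoint URL is an assumption.)
import requests
resp = requests.post(
    "https://api.siliconflow.cn/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['SILICON_API_KEY']}"},
    json={
        "model": os.environ["SILICON_MODEL"],  # e.g. internlm/internlm2_5-7b-chat
        "messages": [{"role": "user", "content": "hello"}],
    },
    timeout=60,
)
print(resp.json()["choices"][0]["message"]["content"])
```

If the error shows the first path being exercised, the launch command most likely selected a local/HF backend instead of the SiliconFlow API backend.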

@Harold-lkk
Collaborator

The startup command is probably wrong.
