Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible.
Please make sure you have given us as much context as possible.
🥰 Description of requirements
Could you support configuring, on the server side, whether a model uses client request mode?
🧐 Solution
Add an environment variable to control whether client request mode is enabled.
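For illustration only, here is a minimal sketch of how such a variable could be resolved. The variable name `ENABLE_CLIENT_FETCH` and the `resolveClientFetch` helper are assumptions for this sketch; neither exists in the project today:

```ts
// Accept common truthy spellings so deployments can write 1 / true / on.
const parseBool = (value?: string): boolean | undefined => {
  if (value === undefined) return undefined;
  return ['1', 'on', 'true'].includes(value.toLowerCase());
};

// Read the server-side default once at startup.
// ENABLE_CLIENT_FETCH is a hypothetical name, not an existing option.
const serverSetting = parseBool(process.env.ENABLE_CLIENT_FETCH);

// When the server pins the value, the user's preference is ignored;
// when the variable is unset, per-user behavior stays as it is today.
export const resolveClientFetch = (userPreference?: boolean): boolean =>
  serverSetting ?? userPreference ?? false;
```

Treating the variable as ternary (unset / on / off) would let the server either lock the mode or leave the choice to the client.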
📝 Supplementary information
If `-language_model_settings` is configured, the user will not be able to change whether client request mode is used.
Testing shows that a locally deployed Ollama model cannot be used when client request mode is turned off; it must be enabled.
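To illustrate why: with client request mode on, the request is issued from the user's browser, which can reach a local Ollama server; a server-side deployment (e.g. on Vercel) cannot resolve the user's `localhost`. A sketch against Ollama's documented `/api/chat` endpoint (the model name is just an example):

```ts
// The Ollama server lives on the user's machine (http://localhost:11434
// by default), so only the browser can reach it directly.
const res = await fetch('http://localhost:11434/api/chat', {
  body: JSON.stringify({
    messages: [{ content: 'Hello', role: 'user' }],
    model: 'llama3', // any locally pulled model
    stream: false,
  }),
  method: 'POST',
});

const data = await res.json();
console.log(data.message.content);
```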
Currently, the Google AI Studio model does not support client request mode, which causes request timeouts on Vercel deployments when the context is large.