There is no model configuration in .env in the bee-python SDK; it runs against the API. The credentials are the API credentials, and it runs on whatever provider was set in the API. If you run it locally, you need to run the stack, where you configure the backend; if you run it against dev/prod, the backends are always internal vLLM services.
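To illustrate the point above, a minimal sketch of what the SDK's .env could look like — the variable names here are hypothetical, not the SDK's actual configuration keys; check the bee-python README for the real ones:

```shell
# Hypothetical .env sketch (variable names are illustrative, not the SDK's real keys).
# The SDK only needs the Bee API endpoint and its credentials:
BEE_API=http://localhost:4000   # local bee-stack, or the dev/prod API URL
BEE_API_KEY=...                 # Bee API credential, NOT an LLM provider key
# Note: no model/provider settings appear here — the backend (e.g. vLLM)
# is configured in the API deployment, not in the SDK.
```

The design choice being described is that model selection lives server-side: swapping providers is an API deployment concern, so the SDK configuration never mentions a model.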
If you plug in OpenAI credentials, you're using the OpenAI API, not Bee 🙈
I think it's clear from the first few lines of the README, but let me know if there is a way to make this better.
I think adding a note for external users that the bee-stack needs to be running locally would be useful (I didn't know that the first time I ran the SDK 😄).
Description
A number of users believe the Python SDK can only be used with OpenAI credentials.
Proposed solution