New Codespace Creation error 400 (400 Bad Request) with error: InsufficientQuota: #880

Open
ediaz-caio opened this issue Oct 7, 2024 · 1 comment
Labels: question (Further information is requested)

@ediaz-caio
New codespace error:

Error: creating Deployment (Subscription: ""
│ Resource Group Name: "infoasst-test"
│ Account Name: "infoasst-aoai"
│ Deployment Name: "text-embedding-ada-002"): performing CreateOrUpdate: unexpected status 400 (400 Bad Request) with error: InsufficientQuota: This operation require 240 new capacity in quota Tokens Per Minute (thousands) - Text-Embedding-Ada-002, which is bigger than the current available capacity 0. The current quota usage is 240 and the quota limit is 240 for quota Tokens Per Minute (thousands) - Text-Embedding-Ada-002.

│ with module.openaiServices.azurerm_cognitive_deployment.deployment[1],
│ on core/ai/openaiservices/openaiservices.tf line 27, in resource "azurerm_cognitive_deployment" "deployment":
│ 27: resource "azurerm_cognitive_deployment" "deployment" {

│ creating Deployment (Subscription: ""
│ Resource Group Name: "infoasst-test"
│ Account Name: "infoasst-aoai"
│ Deployment Name: "text-embedding-ada-002"): performing CreateOrUpdate: unexpected status 400 (400 Bad Request) with error: InsufficientQuota: This operation require 240 new capacity in quota Tokens Per Minute
│ (thousands) - Text-Embedding-Ada-002, which is bigger than the current available capacity 0. The current quota usage is 240 and the quota limit is 240 for quota Tokens Per Minute (thousands) - Text-Embedding-Ada-002.

make: *** [Makefile:22: infrastructure] Error 1

@dayland (Contributor) commented Oct 8, 2024

This is a quota issue with your Azure OpenAI service. Azure OpenAI quota is subscription-wide: if you have more than one instance of Azure OpenAI and/or more than one deployment of the same base model, the TPM (Tokens Per Minute) quota is shared across all instances in the subscription. That shared pool can be fully consumed, which is what the error you provided is reporting. Please check how many instances you have deployed and the total consumed quota for the "text-embedding-ada-002" model.
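As a rough sketch (exact flags and output shapes can vary by Azure CLI version), something like the following can help enumerate what is consuming the quota. The resource group and account names are the ones from your error output, so substitute your own where they differ:

```bash
# List every Azure OpenAI account in the subscription; the TPM quota is pooled
# per subscription and region, so all of these draw from the same limit.
az cognitiveservices account list \
  --query "[?kind=='OpenAI']" \
  --output table

# List the model deployments (and their capacity, in thousands of TPM) for one account.
az cognitiveservices account deployment list \
  --resource-group infoasst-test \
  --name infoasst-aoai \
  --output table

# Show regional quota usage, including the Tokens Per Minute quota consumed by
# text-embedding-ada-002 deployments (quota names differ slightly across CLI versions).
az cognitiveservices usage list --location <your-region> --output table
```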

You can reduce the capacity assigned to existing deployments, delete unused deployments, or request a quota increase in the Azure Portal. Also be aware that quota from deleted model deployments and instances may not be returned immediately (see #282 (comment)).
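For example, if another Azure OpenAI instance is holding the text-embedding-ada-002 capacity, deleting its unused deployment frees that quota. The resource group and account names below are placeholders for that other instance, not values from your deployment:

```bash
# Delete an unused text-embedding-ada-002 deployment on another account to
# release its reserved TPM capacity (placeholder names -- point these at the
# instance that is actually holding the quota).
az cognitiveservices account deployment delete \
  --resource-group <other-resource-group> \
  --name <other-openai-account> \
  --deployment-name text-embedding-ada-002
```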

@dayland added the question label and self-assigned this issue on Oct 8, 2024