
Draft: Enable new model id's #886

Draft · wants to merge 10 commits into main

Conversation

balthazur
Contributor

@balthazur balthazur commented Dec 17, 2024

Description

The goal of this PR is to make new model ids from Roboflow work in Inference.

  • The getWeights endpoint has been updated on staging and prod and no longer requires a workspace id
  • get_model_data no longer sends a workspace id (and no longer makes an additional api call to resolve the workspace url)
  • The new endpoint is only called when the model id contains no slashes, for a safer migration (see the sketch after this list)
  • Added tests
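
For illustration, a minimal sketch of the routing rule from the third bullet; the helper name and the legacy-style id below are placeholders, not code from this PR:

def uses_new_models_collection(model_id: str) -> bool:
    # Legacy ids look like "dataset_id/version_id"; new model ids from the
    # models collection contain no slash and go to the new getWeights endpoint.
    return "/" not in model_id

assert uses_new_models_collection("model-test-yolov8")    # new collection id
assert not uses_new_models_collection("some-dataset/3")   # hypothetical legacy id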

Note:

New model urls will be unique only within a workspace (same as workflow urls), so we need a way to build a unique cache key in Inference. We thought about hashing the api key + model id, since workspaceId will not be available without an additional api call. That doesn't work for public models, however, because the api key used for a public model does not represent the owner. For the first iteration this is fine, because we'll only be serving models uploaded by the user themselves and Roboflow Instant (owlv2) models with new model ids.
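
A minimal sketch of the api key + model id cache-key idea described above, assuming a sha256 digest and a "model:" prefix purely for illustration; this is not what the PR implements:

import hashlib

def model_cache_key(api_key: str, model_id: str) -> str:
    # Combine the api key with the model id so ids that are only unique
    # within a workspace still map to distinct cache entries. As noted
    # above, this breaks down for public models, where the calling api
    # key does not identify the owning workspace.
    digest = hashlib.sha256(f"{api_key}:{model_id}".encode("utf-8")).hexdigest()
    return f"model:{digest}"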

Type of change

Please delete options that are not relevant.

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • This change requires a documentation update

How has this change been tested? Please provide a testcase or example of how you tested the change.

This code queries a new model from the models collection; the empty api_key is a placeholder (a staging api key is required):

from inference import get_model

model_id = "model-test-yolov8"  # id from the new models collection (no slash)
url = "https://media.roboflow.com/inference/people-walking.jpg"
model = get_model(model_id=model_id, api_key="")  # placeholder: supply a staging api key
results = model.infer(url, confidence=0.30, iou_threshold=0.3)[0]

#889 enables using the model id "manual-owlvit" to run a fine prompted owlv2 model that's saved in the models collection on staging.

Any specific deployment considerations

For example, documentation changes, usability, usage/costs, secrets, etc.

Docs

  • Docs updated? What were the changes:

@balthazur balthazur self-assigned this Dec 17, 2024
@CLAassistant

CLAassistant commented Dec 17, 2024

CLA assistant check
All committers have signed the CLA.

@balthazur balthazur changed the title Draf: Enable new model id's Draft: Enable new model id's Dec 17, 2024
@probicheaux probicheaux mentioned this pull request Dec 18, 2024
@balthazur
Contributor Author

@grzegorz-roboflow The new getWeights endpoint is up on staging and prod, and I posted an example with a staging project in the PR description above. You'll need my staging api key though.

It no longer requires passing a workspace id, so I also removed the get_workspace call I had previously added in this PR.

@hansent
Contributor

hansent commented Dec 19, 2024

The inference endpoints also assume the dataset_id/version_id format: https://github.com/roboflow/inference/blob/main/inference/core/interfaces/http/http_api.py#L2170C14-L2170C57

That probably needs to change, or we need a new endpoint that can handle the new model_ids.
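
For context, a toy FastAPI sketch of the two-segment route shape being described; the path and handler below are illustrative, not the actual code in http_api.py:

from fastapi import FastAPI

app = FastAPI()

@app.post("/{dataset_id}/{version_id}")
async def infer_legacy(dataset_id: str, version_id: str):
    # A route declared with two path segments can only match ids of the form
    # "dataset_id/version_id"; a slash-free model id such as
    # "model-test-yolov8" never reaches this handler.
    return {"model_id": f"{dataset_id}/{version_id}"}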

@grzegorz-roboflow grzegorz-roboflow self-assigned this Dec 20, 2024