Integrate replicate with Hugging Face transformers #498
Comments
Nice, great idea! 🙌 We're going to contribute the PyTorch Lightning integration back to PyTorch Lightning. We included it in our repository as a shortcut, because the integration needed changes to their callback API. It looks like Hugging Face also includes callbacks for services in their library, so it probably makes sense for it to live there. I wonder whether we should include it now or wait until our API is a bit more stable. I can imagine us being able to build in backwards compatibility if we ever change something. As a place to develop it, I wonder if it makes sense to create a separate package? Also open to including it in Replicate proper, but a separate package might be a neater solution if we intend to contribute it upstream to Hugging Face at some point...
@srush Do you have any opinions about this, or know someone at Hugging Face who might? :)
I will check!
Bumping this. Is there a streamlined process to migrate a Hugging Face model to run on Replicate?
Similar to PyTorch Lightning, Hugging Face abstracts away the training loop and provides a Trainer API with callbacks.
I will probably be able to take this on in the next few weeks.
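For reference, here is a minimal sketch of what such a callback could look like. It assumes the `TrainerCallback` hooks from `transformers` and the `replicate.init()` / `experiment.checkpoint()` experiment-tracking API from this repository; the class name `ReplicateCallback`, the choice of hooks, and the metric handling are hypothetical, not a finished integration:

```python
from transformers import TrainerCallback


class ReplicateCallback(TrainerCallback):
    """Hypothetical callback that records Hugging Face Trainer runs with Replicate."""

    def on_train_begin(self, args, state, control, **kwargs):
        import replicate

        # Start an experiment, saving the TrainingArguments as hyperparameters.
        # (vars(args) is a rough serialization; a real integration would filter it.)
        self.experiment = replicate.init(path=".", params=vars(args))

    def on_log(self, args, state, control, logs=None, **kwargs):
        # Called every `logging_steps`; record the logged metrics as a checkpoint.
        if logs:
            self.experiment.checkpoint(
                path=None,  # metrics only, no files saved at this step
                step=state.global_step,
                metrics=logs,
                primary_metric=("loss", "minimize"),
            )
```

It would then plug into the `Trainer` the same way the built-in TensorBoard and W&B callbacks do:

```python
trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    callbacks=[ReplicateCallback()],
)
```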
How are you thinking about keeping callbacks in this repo (both this one and the previous ones, Keras and PL) vs. contributing them to those libraries' repos, as TensorBoard and W&B do? Keep them here until a stable v1 release?