How does one return an adapted model without using the context manager? #119
I want to adapt my model and pass the adapted model around in my code. How do I do this?

My guess is that the best way is to skip the context manager for the inner loop but somehow still use the adapted model.

Comments
In particular, I'd also like to do this during testing, so the memory footprint of keeping the graph for the k inner steps is really bad for me. I just want to get the weights after the adaptation and throw everything else away, with no extra memory footprint at all.
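Not from the original thread, but a minimal sketch of one way to get exactly that at test time, assuming hypothetical names `model`, `opt`, `x_spt`, `y_spt` and `k`: run the adaptation with `track_higher_grads=False` and copy the fast weights into a detached clone before the context manager exits.

```python
import copy

import torch
import torch.nn.functional as F
import higher

# Hypothetical placeholders: `model` is an nn.Module, `opt` its torch.optim
# optimizer, (x_spt, y_spt) a support batch, and `k` the number of inner steps.
with higher.innerloop_ctx(model, opt,
                          track_higher_grads=False) as (fmodel, diffopt):
    for _ in range(k):
        diffopt.step(F.cross_entropy(fmodel(x_spt), y_spt))

    # Copy the adapted (fast) weights into a stand-alone module that can be
    # passed around after the context manager exits. Parameter order matches
    # by construction; buffers (e.g. BatchNorm running stats) are not handled
    # in this sketch.
    adapted_model = copy.deepcopy(model)
    with torch.no_grad():
        for p, fp in zip(adapted_model.parameters(), fmodel.parameters()):
            p.copy_(fp)

# Only `adapted_model` (plain weights, no autograd graph) survives this point.
```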
Relevant: https://github.com/facebookresearch/higher/blob/main/higher/__init__.py
Something like this should work: just return the functional model and don't use the context manager.
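A minimal sketch of that idea (the snippet is mine, not the original comment's; `model`, `opt`, `x_spt`, `y_spt` and `k` are hypothetical placeholders): build the functional model and differentiable optimizer directly, roughly the way the linked `__init__.py` wires them up inside `innerloop_ctx`.

```python
import torch
import torch.nn.functional as F
import higher

# Hypothetical placeholders: `model` is the nn.Module being adapted,
# `opt` its torch.optim optimizer, (x_spt, y_spt) a support batch,
# and `k` the number of inner-loop steps.
fmodel = higher.patch.monkeypatch(model, copy_initial_weights=True,
                                  track_higher_grads=False)
diffopt = higher.optim.get_diff_optim(opt, model.parameters(), fmodel=fmodel,
                                      track_higher_grads=False)

for _ in range(k):
    loss = F.cross_entropy(fmodel(x_spt), y_spt)
    diffopt.step(loss)  # updates fmodel's fast weights in place

# fmodel now carries the adapted weights and can be returned or passed around.
```

Setting `track_higher_grads=False` is the test-time option: the differentiable optimizer then detaches the updated parameters, so no graph through the inner loop should be retained and the memory overhead stays minimal.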
If the context manager is doing something subtle that I'd be missing, please let me know!
This doesn't seem to work for some reason, even though the code runs: the model diverges.
Memory issues, perhaps? #75
My divergence was caused by data leakage (#107); removing the .train() call is what introduces the errors.