This is more related to the underlying model that you are applying LoRA to. The activations alone (even without their gradients) from the many layers of the base model already consume a huge amount of memory, so LoRA cannot reduce that part of the memory footprint even if you are only fine-tuning a few thousand parameters.
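As a minimal sketch (not from this thread), activation memory can usually be cut by enabling gradient checkpointing alongside LoRA; the model name and target modules below are placeholder assumptions:

```python
import torch
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",       # placeholder model name (assumption)
    torch_dtype=torch.bfloat16,
)

# Gradient checkpointing recomputes activations during the backward pass,
# trading extra compute for much lower activation memory.
model.gradient_checkpointing_enable()
model.config.use_cache = False        # KV cache is not needed while training

lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["q_proj", "v_proj"],  # typical attention projections (assumption)
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()    # LoRA params are tiny; activations dominate memory
```

Shorter sequence lengths, smaller micro-batches, and quantized base weights (e.g. QLoRA-style 4-bit loading) attack the same activation/weight memory that LoRA itself does not touch.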
How can I improve the memory efficiency of LoRA fine-tuning?