This repository has been archived by the owner on Mar 17, 2021. It is now read-only.
At each iteration, `batch_size` windows are extracted from the queue, fed to the network, and the weights are updated using the gradient of the loss function with respect to the weights.
The concept of an epoch doesn't apply here, because NiftyNet uses patch-based training (patch and window refer to the same thing).
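To make the iteration-vs-epoch distinction concrete, here is a minimal NumPy sketch of patch-based training, not NiftyNet's actual sampler: each iteration draws `batch_size` windows from randomly chosen volumes, so there is no "one full pass over the dataset" that would define an epoch. The volume sizes, window shape, and `sample_window` helper are all hypothetical.

```python
import numpy as np

def sample_window(volume, window_shape, rng):
    """Extract one randomly positioned window (patch) from a volume."""
    corners = [rng.integers(0, v - w + 1) for v, w in zip(volume.shape, window_shape)]
    slices = tuple(slice(c, c + w) for c, w in zip(corners, window_shape))
    return volume[slices]

# Hypothetical setup: two toy 3-D volumes and a 16^3 window.
rng = np.random.default_rng(0)
volumes = [rng.normal(size=(64, 64, 64)) for _ in range(2)]
window_shape = (16, 16, 16)
batch_size = 4

for iteration in range(3):  # each iteration corresponds to one gradient update
    # Sample batch_size windows from randomly chosen volumes; windows may
    # overlap and some voxels may never be visited -- hence no epoch.
    batch = np.stack([
        sample_window(volumes[rng.integers(len(volumes))], window_shape, rng)
        for _ in range(batch_size)
    ])
    # In real training, `batch` would be fed to the network here and the
    # weights updated from the gradient of the loss.
    assert batch.shape == (batch_size, *window_shape)
```

Because sampling is with replacement, the only meaningful training budget is the number of iterations (gradient updates), which is why NiftyNet's configuration counts iterations rather than epochs.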
I am not clear on what an iteration means. Could you explain it in more detail? Also, how is it related to an epoch?
Thanks.