hi there
I am trying to use a pretrained model to initialize the weights of my own model. Both are nngraph models and their network structures are almost identical, so I expect them to produce the same outputs after initialization. However, when I forward both networks with the same random input after copying the weights, the outputs are different.

I am sure the weights were copied correctly from one network to the other: I verified this by calling "torch.sum(torch.ne(param1, param2))", where param1 and param2 are the parameters of the two networks. The outputs start to diverge after they pass through nn.SpatialBatchNormalization, even though the parameters of the nn.SpatialBatchNormalization modules are identical. Is there any way to fix this? Or is there a better way to directly modify the structure of an nngraph model?
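For reference, one detail worth checking in a comparison like the one described above: in Torch, parameters()/getParameters() return only the weight and bias tensors, while nn.SpatialBatchNormalization also keeps running_mean/running_var buffers (running_std in older nn versions) and behaves differently in training and evaluation mode. The sketch below is only an illustration of copying those buffers and comparing outputs in evaluate() mode; the names pretrained and myNet, the input shape, and the assumption of a single tensor input/output and matching BatchNorm layers in the same order are all hypothetical.

```lua
require 'nn'
require 'nngraph'

-- Hypothetical networks: 'pretrained' is the source gModule, 'myNet' the target.
-- findModules() walks the containers and returns the BatchNorm modules in order.
local srcBN = pretrained:findModules('nn.SpatialBatchNormalization')
local dstBN = myNet:findModules('nn.SpatialBatchNormalization')
assert(#srcBN == #dstBN, 'the two nets have different numbers of BatchNorm layers')

for i = 1, #srcBN do
  -- running statistics are buffers, not parameters, so parameters() misses them
  dstBN[i].running_mean:copy(srcBN[i].running_mean)
  if srcBN[i].running_var then
    dstBN[i].running_var:copy(srcBN[i].running_var)
  else
    -- older torch/nn versions keep running_std instead of running_var
    dstBN[i].running_std:copy(srcBN[i].running_std)
  end
end

-- In training mode BatchNorm normalizes with per-batch statistics, so two nets
-- can produce different outputs even with identical parameters; compare in
-- evaluation mode instead.
pretrained:evaluate()
myNet:evaluate()

local x = torch.randn(1, 3, 224, 224)   -- hypothetical input shape
local out1 = pretrained:forward(x):clone()
local out2 = myNet:forward(x)
print('max abs difference:', (out1 - out2):abs():max())
```

If the outputs still differ in evaluate() mode with identical parameters and running statistics, the mismatch more likely lies in the graph structure itself than in the copied tensors.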