The test `test_train_main` fails on Windows:
```
def test_train():
    np.seterr(divide='raise')
    for optimizer in [SGD(0.001), Adam(0.001)]:
        for data_set in [MNIST(64), FashionMNIST(64)]:
            for i in range(10):
                model = MLP()
                for iteration, batch in enumerate(data_set.train):
                    model.forward(batch)
>                   model.backward(optimizer)

length\tests\test_train_main.py:18:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
length\models\mlp.py:32: in backward
    self.loss.backward(optimizer)
length\graph.py:45: in backward
    gradients = candidate_layer.creator.backward(candidate_layer.grad)
length\layer.py:28: in backward
    parameter_deltas = self.optimizer.run_update_rule(parameter_gradients, self)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <length.optimizers.adam.Adam object at 0x07C85370>
gradients = (array([[0., 0., 0., ..., 0., 0., 0.],
       [0., 0., 0., ..., 0., 0., 0.],
       [0., 0., 0., ..., 0., 0., 0.],
       ...-02, -4.15754272e-03, -3.68550362e-04, -4.95784450e-03,
       -7.36804074e-03, -5.72426897e-03], dtype=float32))
layer = <length.layers.fully_connected.FullyConnected object at 0x07C855B0>

    def run_update_rule(self, gradients, layer):
        self.current_id = id(layer)
        if not self.initialized:
            self.initialize(gradients)
        self.t_values[self.current_id] += 1
        t = self.t_values[self.current_id]
        param_deltas = []
        for i, gradient in enumerate(gradients):
            m = self.m_values[self.current_id][i]
            v = self.v_values[self.current_id][i]
>           m += (1 - self.beta1) * (gradient - m)
E           ValueError: operands could not be broadcast together with shapes (512,784) (10,512)

length\optimizers\adam.py:47: ValueError
```
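For reference, the failing line reduces to a NumPy broadcast between two mismatched parameter shapes. A minimal sketch (the shapes are taken from the traceback above; `beta1 = 0.9` is the usual Adam default and an assumption here, since the actual value lives on the `Adam` instance):

```python
import numpy as np

# The stored first moment `m` appears to belong to a 512 -> 10 layer,
# while the incoming gradient belongs to a 784 -> 512 layer.
m = np.zeros((10, 512), dtype=np.float32)
gradient = np.zeros((512, 784), dtype=np.float32)
beta1 = 0.9  # assumed default

# Raises: ValueError: operands could not be broadcast together
#         with shapes (512,784) (10,512)
m += (1 - beta1) * (gradient - m)
```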
It does not fail on Linux 😕
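My best guess at the cause (an assumption, not verified): `run_update_rule` keys its per-layer state by `id(layer)`, but CPython may reuse an `id()` once the original object has been garbage-collected. The test builds a fresh `MLP()` on every outer iteration while the optimizer keeps its state dicts alive, so a new `FullyConnected` layer could inherit the moment vectors of a dead layer with a different shape. Whether and when the ids collide depends on allocator behavior, which could plausibly differ between Windows and Linux. A minimal sketch of the id reuse:

```python
class Layer:
    """Stand-in for length.layers.fully_connected.FullyConnected."""

# Two objects with non-overlapping lifetimes may share an id() in CPython.
a = Layer()
stale_id = id(a)
del a            # the old layer is collected ...
b = Layer()      # ... and the new one may land at the same address
print(id(b) == stale_id)  # frequently True; exact behavior is platform-dependent
```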