test failure on windows #78

Open
Bartzi opened this issue Apr 13, 2018 · 0 comments
Labels: bug (Something isn't working), testing (Tests, continuous integration, coverage)

Comments

Bartzi (Member) commented Apr 13, 2018

The test test_train in test_train_main.py fails on Windows:

    def test_train():
        np.seterr(divide='raise')

        for optimizer in [SGD(0.001), Adam(0.001)]:
            for data_set in [MNIST(64), FashionMNIST(64)]:
                for i in range(10):
                    model = MLP()

                    for iteration, batch in enumerate(data_set.train):
                        model.forward(batch)
>                       model.backward(optimizer)

length\tests\test_train_main.py:18:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
length\models\mlp.py:32: in backward
    self.loss.backward(optimizer)
length\graph.py:45: in backward
    gradients = candidate_layer.creator.backward(candidate_layer.grad)
length\layer.py:28: in backward
    parameter_deltas = self.optimizer.run_update_rule(parameter_gradients, self)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <length.optimizers.adam.Adam object at 0x07C85370>
gradients = (array([[0., 0., 0., ..., 0., 0., 0.],
       [0., 0., 0., ..., 0., 0., 0.],
       [0., 0., 0., ..., 0., 0., 0.],
   ...-02, -4.15754272e-03,
       -3.68550362e-04, -4.95784450e-03, -7.36804074e-03, -5.72426897e-03],
      dtype=float32))
layer = <length.layers.fully_connected.FullyConnected object at 0x07C855B0>

    def run_update_rule(self, gradients, layer):
        self.current_id = id(layer)

        if not self.initialized:
            self.initialize(gradients)

        self.t_values[self.current_id] += 1
        t = self.t_values[self.current_id]

        param_deltas = []

        for i, gradient in enumerate(gradients):
            m = self.m_values[self.current_id][i]
            v = self.v_values[self.current_id][i]

>           m += (1 - self.beta1) * (gradient - m)
E           ValueError: operands could not be broadcast together with shapes (512,784) (10,512)

length\optimizers\adam.py:47: ValueError

It does not fail on Linux 😕
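
For reference, the failing line boils down to a NumPy broadcast error between a stored moment buffer and an incoming gradient of a different shape. A minimal sketch with the shapes taken from the error message above (the beta1 value and the zero-filled arrays are placeholders, not the real data) reproduces the same ValueError:

    import numpy as np

    beta1 = 0.9  # assumed Adam default; the real value comes from the Adam instance

    # Shapes copied from the traceback; contents are placeholders.
    m = np.zeros((10, 512), dtype=np.float32)          # stored first-moment buffer
    gradient = np.zeros((512, 784), dtype=np.float32)  # incoming weight gradient

    # Same in-place update as length/optimizers/adam.py:47
    m += (1 - beta1) * (gradient - m)
    # ValueError: operands could not be broadcast together with shapes (512,784) (10,512)

If that reading is right, the optimizer is applying a gradient from one FullyConnected layer to moment buffers that were initialized for a differently shaped one; one possible culprit is the per-layer state being keyed on id(layer), since ids of garbage-collected layers from earlier loop iterations can be reused, and that reuse pattern can differ between platforms.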

Bartzi added the bug and testing labels on Apr 13, 2018