I've tried training the network on a single graph (training on a single graph and testing on the same graph).
I expected the network to almost memorize the graph, so the loss would be very close to zero. Surprisingly, the loss was about 0.06-0.07, which is 20-30 times higher than when I train on 1000 samples. The test loss is sometimes much larger.
I repeated this experiment, varying the number of nodes in the graph from 4 to 200, and the results all look similar.
I don't have much experience with GNNs and machine learning, but is this normal?
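For reference, this is a minimal sketch of the single-graph overfitting check I'm describing. It assumes a PyTorch Geometric-style setup with a generic two-layer GCN, random node features, and a node-level regression target with MSE loss; the actual model, features, and targets in this repo may differ, so treat the names here as placeholders.

```python
# Minimal single-graph overfitting check (assumed PyTorch Geometric-style setup).
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv
from torch_geometric.utils import erdos_renyi_graph

torch.manual_seed(0)

num_nodes = 50                         # vary from 4 to 200 to reproduce the sweep
edge_index = erdos_renyi_graph(num_nodes, edge_prob=0.2)
x = torch.randn(num_nodes, 16)         # hypothetical node features
y = torch.randn(num_nodes, 1)          # hypothetical regression target
data = Data(x=x, edge_index=edge_index, y=y)

class GCN(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, out_dim)

    def forward(self, data):
        h = F.relu(self.conv1(data.x, data.edge_index))
        return self.conv2(h, data.edge_index)

model = GCN(16, 64, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

# Train and "test" on the same single graph: if the model can memorize it,
# the training loss should approach zero.
for epoch in range(2000):
    model.train()
    optimizer.zero_grad()
    loss = F.mse_loss(model(data), data.y)
    loss.backward()
    optimizer.step()
    if epoch % 500 == 0:
        print(f"epoch {epoch:4d}  train loss {loss.item():.6f}")
```

In my runs the loss plateaus around 0.06-0.07 instead of going to zero, regardless of graph size.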