
Learning from small number of examples #2

Open
sujeongkim opened this issue Jan 29, 2023 · 0 comments

I've tried training the network on a single graph (i.e., training on one graph and testing on that same graph).
I expected the network to essentially memorize the graph, so the loss should end up very close to zero. Surprisingly, the training loss plateaued at about 0.06-0.07, which is 20-30 times higher than what I get when training on 1000 samples. The test loss is sometimes much larger.
I repeated this experiment while varying the number of nodes in the graph from 4 to 200, and the results all look similar.
I don't have much experience with GNNs or machine learning, but is this normal?
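For context, the sanity check I'm describing looks roughly like the sketch below. This is not the code from this repo, just a generic illustration using a PyTorch Geometric style setup with a toy two-layer GCN, random node features, and random regression targets; all names, dimensions, and the MSE loss are placeholders for whatever the actual model and training script use.

```python
# Minimal single-graph overfitting check (a sketch only, not this repo's code).
# Assumes PyTorch Geometric is installed; the model, loss, and graph
# construction here are placeholders.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

num_nodes, in_dim, hidden_dim = 10, 8, 32

# One toy graph: random node features, random directed edges, and one
# regression target per node.
x = torch.randn(num_nodes, in_dim)
edge_index = torch.randint(0, num_nodes, (2, 40))
y = torch.randn(num_nodes, 1)
graph = Data(x=x, edge_index=edge_index, y=y)

class TinyGNN(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, 1)

    def forward(self, data):
        h = F.relu(self.conv1(data.x, data.edge_index))
        return self.conv2(h, data.edge_index)

model = TinyGNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

# Train and evaluate on the same single graph; if the model can memorize it,
# the loss should keep dropping toward zero instead of plateauing.
for step in range(2000):
    optimizer.zero_grad()
    loss = F.mse_loss(model(graph), graph.y)
    loss.backward()
    optimizer.step()
    if step % 500 == 0:
        print(f"step {step}: loss {loss.item():.6f}")
```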
