
Memory usage is close to the limit. #4

Open
hiqmatNisa opened this issue May 27, 2019 · 1 comment

Comments

@hiqmatNisa

Hi,
thanks for this great work.
I am running this code on Google Colab with 12 GB of RAM. While loading the training set I get the error below.
Your runtime is close to its memory limit. Further use may cause an out-of-memory crash and the loss of in-memory state for all sessions. Would you like to terminate some sessions in order to free up memory (state will be lost for those sessions)? Currently, 10.62 GB / 12.72 GB is being used.
Could you let us know the requirements (RAM, GPU) for running this code, and how long 250 epochs take?

@josarajar
Owner

Hi,
Sorry for the delay. The RAM requirement at the current batch size is 15 GB. If you don't have that much RAM, you can decrease the batch size.
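To illustrate the idea: with a smaller batch size, only one batch needs to be materialized at a time instead of the whole training set. This is a minimal, generic sketch; the names (`batch_iterator`, `BATCH_SIZE`) are illustrative and not taken from this repository, which may expose the batch size through its own config or script arguments.

```python
# Hypothetical sketch: reducing peak memory by iterating in small batches.
# BATCH_SIZE and batch_iterator are illustrative names, not from this repo.

BATCH_SIZE = 16  # try halving this if RAM is tight

def batch_iterator(samples, batch_size=BATCH_SIZE):
    """Yield fixed-size batches so only one batch is held in memory at a time."""
    batch = []
    for sample in samples:
        batch.append(sample)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # final partial batch, if the count isn't a multiple of batch_size
        yield batch

# Example: 100 samples in batches of 16 -> 6 full batches plus a partial one.
batches = list(batch_iterator(range(100)))
```

Halving the batch size roughly halves the per-step memory footprint, at the cost of more optimizer steps per epoch; learning-rate adjustments may be needed to keep training behavior similar.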

Regarding running time: on an Nvidia P100, 250 epochs take about 4 days of training.
