Multi-round inference and GPU memory usage #1349

Answered by janfb
ali-akhavan89 asked this question in Q&A

Hello @ali-akhavan89,

The reason is that in each round, the new simulations are appended to the inference object and stored internally here:

self._theta_roundwise.append(theta)
self._x_roundwise.append(x)
self._prior_masks.append(prior_masks)

In each round, all data from all previous rounds is then used for training; see here:

start_idx = int(discard_prior_samples and self._round > 0)
# For non-atomic loss, we can not reuse samples from previous rounds as of now.
# SNPE-A can, by construction of the algori…
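To make the memory behavior concrete, here is a minimal standalone sketch (not the actual `sbi` implementation) of the accumulation pattern described above: each round's simulations are appended to roundwise lists, and the training set is the concatenation of all stored rounds, optionally skipping the round-0 (prior) samples via `start_idx`. The class and method names below are illustrative only.

```python
class RoundwiseStore:
    """Illustrative stand-in for the roundwise storage quoted above."""

    def __init__(self):
        self._theta_roundwise = []  # one list entry per round
        self._x_roundwise = []

    def append_simulations(self, theta, x):
        # Mirrors the internal appends: previous rounds are never discarded.
        self._theta_roundwise.append(theta)
        self._x_roundwise.append(x)

    def training_data(self, discard_prior_samples=False, round_=0):
        # Analogous to the start_idx logic above: optionally skip the
        # round-0 samples drawn from the prior.
        start_idx = int(discard_prior_samples and round_ > 0)
        theta = [t for batch in self._theta_roundwise[start_idx:] for t in batch]
        x = [v for batch in self._x_roundwise[start_idx:] for v in batch]
        return theta, x


store = RoundwiseStore()
for round_ in range(3):
    # Pretend each round simulates 1000 parameter/observation pairs.
    store.append_simulations([0.0] * 1000, [0.0] * 1000)
    theta, _ = store.training_data(round_=round_)
    print(f"round {round_}: training on {len(theta)} simulations")
# round 0: training on 1000 simulations
# round 1: training on 2000 simulations
# round 2: training on 3000 simulations
```

Since the training set grows linearly with the number of rounds, memory usage grows with it; if the stored tensors live on the GPU, GPU memory climbs accordingly, which matches the behavior asked about in the question.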

Answer selected by ali-akhavan89