Unused parameters error post upgrade. #5509
So I am using two modules like this:

```python
from allennlp.modules.text_field_embedders import BasicTextFieldEmbedder
from allennlp.modules.token_embedders import PretrainedTransformerEmbedder

self._query_embedder = BasicTextFieldEmbedder({
    'tokens': PretrainedTransformerEmbedder(transformer_model, last_layer_only=False),
})
self._document_embedder = BasicTextFieldEmbedder({
    'tokens': PretrainedTransformerEmbedder(transformer_model, last_layer_only=False),
})
```

This worked fine in allennlp 2.3, but after a recent upgrade I started getting an error about unused parameters during training.
The only relevant issue I found is #5228, so, as suggested there, I set `find_unused_parameters` to `True` in my trainer, and the error no longer shows up. I am a bit confused as to what is happening here and why my fix worked. Any explanation would be appreciated.
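For context on what that flag changes, here is a minimal, self-contained sketch of the failure mode (single process, `gloo` backend; the `Toy` module and all names in it are hypothetical): a parameter that never feeds the loss is exactly what `find_unused_parameters=True` tells DDP to tolerate.

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel

# Single-process process group so the example runs standalone on CPU.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

class Toy(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.used = torch.nn.Linear(4, 1)
        self.pooler = torch.nn.Linear(4, 4)  # dead weight, like an untrained pooler

    def forward(self, x):
        return self.used(x)  # self.pooler contributes nothing to the loss

# find_unused_parameters=True makes DDP's reducer detect, each iteration,
# which parameters received no gradient and skip waiting on them. With the
# default (False), a later iteration fails with the "Expected to have
# finished reduction ..." RuntimeError because pooler's gradient never arrives.
model = DistributedDataParallel(Toy(), find_unused_parameters=True)
loss = model(torch.randn(2, 4)).sum()
loss.backward()

dist.destroy_process_group()
```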
Replies: 1 comment
Based on the error message, it sounds like the `pooler` layers within your `_query_embedder` and `_document_embedder` aren't part of the graph that leads to the loss, which is not surprising. Depending on which transformer model you're using, you may be able to disable the `pooler` layer, which should avoid this error without having to set `find_unused_parameters` to `True`. With BERT, for example, see here: https://github.com/huggingface/transformers/blob/ca0b82bbd7eadd4f91e48ebedbf647d12b654f38/src/transformers/models/bert/modeling_bert.py#L864
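If it helps, here is a minimal sketch of what that could look like. It assumes an allennlp 2.x release where `PretrainedTransformerEmbedder` accepts a `transformer_kwargs` argument that is forwarded to `AutoModel.from_pretrained`, and it uses `bert-base-uncased` as a stand-in for your `transformer_model`; note that `add_pooling_layer=False` is specific to BERT-family models.

```python
from allennlp.modules.text_field_embedders import BasicTextFieldEmbedder
from allennlp.modules.token_embedders import PretrainedTransformerEmbedder

# Sketch: transformer_kwargs is passed through to the HuggingFace
# AutoModel.from_pretrained call, and add_pooling_layer=False drops the
# pooler entirely, so no parameter is left out of the loss graph.
query_embedder = BasicTextFieldEmbedder({
    "tokens": PretrainedTransformerEmbedder(
        "bert-base-uncased",  # stand-in for your transformer_model
        last_layer_only=False,
        transformer_kwargs={"add_pooling_layer": False},
    ),
})
```

With the pooler gone, every remaining parameter participates in the backward pass, so DDP's default strict check passes and `find_unused_parameters` can stay `False` (which is also slightly faster, since DDP skips the per-iteration unused-parameter search).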