Transformers: TypeError: forward() got an unexpected keyword argument 'head_mask'

Created on 16 Jul 2020 · 1 comment · Source: huggingface/transformers

I'm getting the above error while training an EncoderDecoder model for Longformer.

Following is the code snippet from the training loop, followed by the model definition.

```python
# Call inside the training loop (the b_* tensors come from the dataloader batch)
output = model(input_ids=b_input_ids, attention_mask=b_input_masks,
               decoder_input_ids=b_decoder_input_ids,
               decoder_attention_mask=b_decoder_input_masks)

# Model definition
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    'allenai/longformer-base-4096', 'allenai/longformer-base-4096')
```
The strange thing is that I haven't even set 'head_mask'; I left it to take its default value of None.

Following is the complete error:
```
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-...> in <module>()
     32 optimizer.zero_grad()
     33 
---> 34 output = model(input_ids = b_input_ids, attention_mask = b_input_masks, head_mask = None, decoder_input_ids = b_decoder_input_ids, decoder_attention_mask = b_decoder_input_masks )
     35 loss = output[0]
     36 

2 frames
/usr/local/lib/python3.6/dist-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
    548             result = self._slow_forward(*input, **kwargs)
    549         else:
--> 550             result = self.forward(*input, **kwargs)
    551         for hook in self._forward_hooks.values():
    552             hook_result = hook(self, input, result)

TypeError: forward() got an unexpected keyword argument 'head_mask'
```
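For context on why the keyword appears even when it is never passed explicitly: EncoderDecoderModel forwards head_mask (defaulting to None) to the underlying encoder, and LongformerModel's forward() did not accept that argument at the time. The simplified sketch below is hypothetical, not the library's actual code; the class and method names are illustrative.

```python
# Hypothetical, simplified sketch of the call path (not the real transformers code):
# the wrapper forwards head_mask, even when it is None, to the encoder, so an
# encoder whose forward() has no head_mask parameter raises this TypeError.
class EncoderDecoderSketch:
    def __init__(self, encoder, decoder):
        self.encoder = encoder
        self.decoder = decoder

    def forward(self, input_ids, attention_mask=None, head_mask=None, **kwargs):
        # head_mask is passed along unconditionally; Longformer's forward()
        # had no such keyword, hence the unexpected-keyword TypeError.
        encoder_outputs = self.encoder(
            input_ids=input_ids,
            attention_mask=attention_mask,
            head_mask=head_mask,
        )
        return encoder_outputs
```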

All comments

The EncoderDecoderModel does not work with Longformer yet. Closing this in favor of https://github.com/huggingface/transformers/issues/4225.
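
Until Longformer support lands, a minimal sketch of the same training step with checkpoints that EncoderDecoderModel already supports (a BERT2BERT setup) might look like the following; the checkpoint name, example sentences, and the use of labels to obtain a loss are assumptions for illustration, not taken from this issue.

```python
from transformers import BertTokenizer, EncoderDecoderModel

# BERT checkpoints work with EncoderDecoderModel (illustrative choice of model).
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased")

# Illustrative batch; in the issue these tensors come from a dataloader.
enc = tokenizer("an example source text", return_tensors="pt")
dec = tokenizer("an example target text", return_tensors="pt")

output = model(input_ids=enc["input_ids"],
               attention_mask=enc["attention_mask"],
               decoder_input_ids=dec["input_ids"],
               decoder_attention_mask=dec["attention_mask"],
               labels=dec["input_ids"])  # assumes a version where labels yields the LM loss
loss = output[0]  # the loss comes first when labels are provided
loss.backward()
```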
