Transformers: _load_from_state_dict() takes 7 positional arguments but 8 were given

Created on 17 Dec 2018 · 3 Comments · Source: huggingface/transformers

Most helpful comment

This is caused by the PyTorch version.
In PyTorch 0.4.0, _load_from_state_dict() takes only 7 arguments, but in 0.4.1 (which this code assumes) it takes 8:

module._load_from_state_dict(
                state_dict, prefix, local_metadata, True, missing_keys, unexpected_keys, error_msgs)

The local_metadata argument should be removed for PyTorch 0.4.0.

All 3 comments

Full log of the error?

This is caused by the PyTorch version.
In PyTorch 0.4.0, _load_from_state_dict() takes only 7 arguments, but in 0.4.1 (which this code assumes) it takes 8:

module._load_from_state_dict(
                state_dict, prefix, local_metadata, True, missing_keys, unexpected_keys, error_msgs)

The local_metadata argument should be removed for PyTorch 0.4.0.
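One way to support both PyTorch versions without pinning is to inspect the installed signature at call time and pass local_metadata only when the method accepts it. This is a minimal sketch, not code from the repository; the helper name call_load_from_state_dict is hypothetical.

```python
import inspect

def call_load_from_state_dict(module, state_dict, prefix, local_metadata,
                              missing_keys, unexpected_keys, error_msgs):
    """Call module._load_from_state_dict across PyTorch versions.

    PyTorch 0.4.1+ added a local_metadata parameter after prefix;
    0.4.0 does not have it, so we drop that argument there.
    """
    params = inspect.signature(module._load_from_state_dict).parameters
    if "local_metadata" in params:
        # PyTorch 0.4.1+ signature (8 positional args including self)
        module._load_from_state_dict(state_dict, prefix, local_metadata, True,
                                     missing_keys, unexpected_keys, error_msgs)
    else:
        # PyTorch 0.4.0 signature (7 positional args including self)
        module._load_from_state_dict(state_dict, prefix, True,
                                     missing_keys, unexpected_keys, error_msgs)
```

Checking the signature rather than comparing torch.__version__ strings avoids fragile version parsing and keeps working on later releases that retain the 0.4.1-style signature.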

Ok thanks @SummmerSnow !

