Transformers: _load_from_state_dict() takes 7 positional arguments but 8 were given

Created on 17 Dec 2018 · 3 comments · Source: huggingface/transformers


All 3 comments

Full log of the error?

This is caused by the PyTorch version.
In 0.4.0, _load_from_state_dict() takes only 7 arguments, but in 0.4.1 (which this code assumes) it must be fed 8:

module._load_from_state_dict(
    state_dict, prefix, local_metadata, True, missing_keys, unexpected_keys, error_msgs)

The local_metadata argument should be removed for PyTorch 0.4.0.
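If you need the loading code to run on both PyTorch versions, one option is to dispatch on the installed signature at runtime. The wrapper below is a minimal sketch (the helper name and the name-based check are my own, not from the repository); it assumes only that the 0.4.1+ method has a local_metadata parameter and the 0.4.0 one does not:

```python
import inspect

def call_load_from_state_dict(module, state_dict, prefix, local_metadata,
                              strict, missing_keys, unexpected_keys, error_msgs):
    # Hypothetical compatibility wrapper: inspect the bound method's signature
    # to decide which argument list the installed PyTorch expects.
    params = inspect.signature(module._load_from_state_dict).parameters
    if "local_metadata" in params:
        # PyTorch >= 0.4.1: pass local_metadata
        return module._load_from_state_dict(
            state_dict, prefix, local_metadata, strict,
            missing_keys, unexpected_keys, error_msgs)
    # PyTorch 0.4.0: drop local_metadata, as suggested above
    return module._load_from_state_dict(
        state_dict, prefix, strict,
        missing_keys, unexpected_keys, error_msgs)
```

Checking by parameter name rather than by PyTorch version string keeps the sketch independent of how the version is reported, though a `torch.__version__` comparison would work just as well.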

Ok, thanks @SummmerSnow!
