.
├── convert_tf_checkpoint_to_pytorch.py
├── uncased_L-12_H-768_A-12
│   ├── bert_config.json
│   ├── bert_model.ckpt.data-00000-of-00001
│   ├── bert_model.ckpt.index
│   ├── bert_model.ckpt.meta
│   └── vocab.txt
├── uncased_L-12_H-768_A-12.zip
└── Untitled.ipynb
(base) ➜ ckpt_to_bin git:(master) ✗ python convert.py --tf_checkpoint_path=./uncased_L-12_H-768_A-12 --bert_config_file=./uncased_L-12_H-768_A-12/bert_config.json --pytorch_dump_path=./uncased_L-12_H-768_A-12
Building PyTorch model from configuration: {
  "attention_probs_dropout_prob": 0.1,
  "finetuning_task": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "num_labels": 2,
  "output_attentions": false,
  "output_hidden_states": false,
  "torchscript": false,
  "type_vocab_size": 2,
  "vocab_size": 30522
}
INFO:pytorch_transformers.modeling_bert:Converting TensorFlow checkpoint from /home/zxr/summary/bertsum/src/ckpt_to_bin/uncased_L-12_H-768_A-12
Traceback (most recent call last):
File "convert.py", line 65, in <module>
args.pytorch_dump_path)
File "convert.py", line 36, in convert_tf_checkpoint_to_pytorch
load_tf_weights_in_bert(model, config, tf_checkpoint_path)
File "/home/zxr/anaconda3/lib/python3.7/site-packages/pytorch_transformers/modeling_bert.py", line 83, in load_tf_weights_in_bert
init_vars = tf.train.list_variables(tf_path)
File "/home/zxr/anaconda3/lib/python3.7/site-packages/tensorflow/python/training/checkpoint_utils.py", line 95, in list_variables
reader = load_checkpoint(ckpt_dir_or_file)
File "/home/zxr/anaconda3/lib/python3.7/site-packages/tensorflow/python/training/checkpoint_utils.py", line 63, in load_checkpoint
"given directory %s" % ckpt_dir_or_file)
ValueError: Couldn't find 'checkpoint' file or checkpoints in given directory /home/zxr/summary/bertsum/src/ckpt_to_bin/uncased_L-12_H-768_A-12
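
The traceback points at the cause: tf.train.list_variables was handed the directory uncased_L-12_H-768_A-12, but that directory contains no "checkpoint" state file, only the raw bert_model.ckpt.* shards, so TensorFlow cannot resolve a checkpoint from it. A likely fix, assuming the layout shown in the tree above, is to pass the checkpoint prefix bert_model.ckpt instead of the directory (and, depending on how your convert.py uses --pytorch_dump_path, that argument may also need to be a file path such as pytorch_model.bin rather than a directory):

python convert.py \
  --tf_checkpoint_path=./uncased_L-12_H-768_A-12/bert_model.ckpt \
  --bert_config_file=./uncased_L-12_H-768_A-12/bert_config.json \
  --pytorch_dump_path=./uncased_L-12_H-768_A-12/pytorch_model.bin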