Transformers: cannot convert_tf_checkpoint_to_pytorch

Created on 24 Jul 2019  ·  1 Comment  ·  Source: huggingface/transformers

.
├── convert_tf_checkpoint_to_pytorch.py
├── uncased_L-12_H-768_A-12
│   ├── bert_config.json
│   ├── bert_model.ckpt.data-00000-of-00001
│   ├── bert_model.ckpt.index
│   ├── bert_model.ckpt.meta
│   └── vocab.txt
├── uncased_L-12_H-768_A-12.zip
└── Untitled.ipynb
(base) ➜  ckpt_to_bin git:(master) ✗ python convert.py --tf_checkpoint_path=./uncased_L-12_H-768_A-12 --bert_config_file=./uncased_L-12_H-768_A-12/bert_config.json --pytorch_dump_path=./uncased_L-12_H-768_A-12
Building PyTorch model from configuration: {
  "attention_probs_dropout_prob": 0.1,
  "finetuning_task": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "num_labels": 2,
  "output_attentions": false,
  "output_hidden_states": false,
  "torchscript": false,
  "type_vocab_size": 2,
  "vocab_size": 30522
}

INFO:pytorch_transformers.modeling_bert:Converting TensorFlow checkpoint from /home/zxr/summary/bertsum/src/ckpt_to_bin/uncased_L-12_H-768_A-12
Traceback (most recent call last):
  File "convert.py", line 65, in <module>
    args.pytorch_dump_path)
  File "convert.py", line 36, in convert_tf_checkpoint_to_pytorch
    load_tf_weights_in_bert(model, config, tf_checkpoint_path)
  File "/home/zxr/anaconda3/lib/python3.7/site-packages/pytorch_transformers/modeling_bert.py", line 83, in load_tf_weights_in_bert
    init_vars = tf.train.list_variables(tf_path)
  File "/home/zxr/anaconda3/lib/python3.7/site-packages/tensorflow/python/training/checkpoint_utils.py", line 95, in list_variables
    reader = load_checkpoint(ckpt_dir_or_file)
  File "/home/zxr/anaconda3/lib/python3.7/site-packages/tensorflow/python/training/checkpoint_utils.py", line 63, in load_checkpoint
    "given directory %s" % ckpt_dir_or_file)
ValueError: Couldn't find 'checkpoint' file or checkpoints in given directory /home/zxr/summary/bertsum/src/ckpt_to_bin/uncased_L-12_H-768_A-12

Most helpful comment

python convert.py --tf_checkpoint_path=./uncased_L-12_H-768_A-12/bert_model.ckpt --bert_config_file=./uncased_L-12_H-768_A-12/bert_config.json --pytorch_dump_path=./uncased_L-12_H-768_A-12/bert_model.bin
