Caffe: Failed to open leveldb

Created on 21 Dec 2014 · 5 comments · Source: BVLC/caffe

I'm able to run the finetune_flickr_style example without error. However, when I try to run my own network, whose prototxt files are copied from the finetune_flickr_style example and trivially modified to point to my own image sources, caffe fails with the following error:

I1221 10:16:31.097148 2052915968 data_layer.cpp:45] Opening leveldb 
F1221 10:16:31.097362 2052915968 data_layer.cpp:48] Check failed: status.ok() Failed to open leveldb 
IO error: /LOCK: Permission denied

After some digging, I found this happens because this->layer_param_.data_param().source() is empty, so Caffe attempts to create a leveldb at the path "", and leveldb in turn attempts to open a LOCK file at the filesystem root (which fails, as expected).

I haven't resolved this problem yet, so any help would be appreciated. In any case, it appears that the code needs more parameter validation checks, specifically:

  • In data_layer.cpp (and anywhere else a parameter string is echoed in an info string), the param value should be surrounded with quotes so that it's clear when the parameter is empty:
    LOG(INFO) << "Opening leveldb '" << this->layer_param_.data_param().source() << "'";
  • Whenever the layer definition would cause Caffe to create a file at an empty path, it should fail earlier, during layer validation, with a message that explains the problem (see the sketch below).
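
For example, a check like the following near the top of DataLayer::DataLayerSetUp would turn the cryptic leveldb LOCK failure into a clear error. This is only a sketch: CHECK_GT and LOG(INFO) are the glog macros Caffe already uses, and the message wording is illustrative.

// Sketch: early validation of data_param.source in data_layer.cpp,
// before Caffe hands the (possibly empty) path to leveldb.
const std::string& source = this->layer_param_.data_param().source();
CHECK_GT(source.size(), 0)
    << "data_param.source is empty for layer '"
    << this->layer_param_.name() << "'. Did you mean a different layer "
    << "type (e.g. IMAGE_DATA with image_data_param)?";
LOG(INFO) << "Opening leveldb '" << source << "'";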

Here's the full log dump:

build/tools/caffe train -solver models/ndsb/solver.prototxt -weights models/bvlc_reference_caffenet/bvlc_reference_caffenet.caffemodel

I1221 11:59:31.962201 2052915968 caffe.cpp:103] Use CPU.
I1221 11:59:31.963217 2052915968 caffe.cpp:107] Starting Optimization
I1221 11:59:31.963232 2052915968 solver.cpp:32] Initializing solver from parameters: 
test_iter: 1000
test_interval: 1000
base_lr: 0.01
display: 20
max_iter: 450000
lr_policy: "step"
gamma: 0.1
momentum: 0.9
weight_decay: 0.0005
stepsize: 100000
snapshot: 10000
snapshot_prefix: "models/ndsb/train_snap"
solver_mode: CPU
net: "models/ndsb/train.prototxt"
I1221 11:59:31.963373 2052915968 solver.cpp:67] Creating training net from net file: models/ndsb/train.prototxt
I1221 11:59:31.965244 2052915968 net.cpp:275] The NetState phase (0) differed from the phase (1) specified by a rule in layer data
I1221 11:59:31.965299 2052915968 net.cpp:275] The NetState phase (0) differed from the phase (1) specified by a rule in layer accuracy
I1221 11:59:31.965312 2052915968 net.cpp:39] Initializing net from parameters: 
name: "NDSB"
layers {
  top: "data"
  top: "label"
  name: "data"
  type: IMAGE_DATA
  image_data_param {
    source: "data/ndsb/train_small_file_list.txt"
    batch_size: 50
    new_height: 256
    new_width: 256
  }
  include {
    phase: TRAIN
  }
  transform_param {
    mirror: true
    crop_size: 227
    mean_file: "data/ilsvrc12/imagenet_mean.binaryproto"
  }
}
layers {
  bottom: "data"
  top: "conv1"
  name: "conv1"
  type: CONVOLUTION
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layers {
  bottom: "conv1"
  top: "conv1"
  name: "relu1"
  type: RELU
}
layers {
  bottom: "conv1"
  top: "pool1"
  name: "pool1"
  type: POOLING
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layers {
  bottom: "pool1"
  top: "norm1"
  name: "norm1"
  type: LRN
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layers {
  bottom: "norm1"
  top: "conv2"
  name: "conv2"
  type: CONVOLUTION
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layers {
  bottom: "conv2"
  top: "conv2"
  name: "relu2"
  type: RELU
}
layers {
  bottom: "conv2"
  top: "pool2"
  name: "pool2"
  type: POOLING
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layers {
  bottom: "pool2"
  top: "norm2"
  name: "norm2"
  type: LRN
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layers {
  bottom: "norm2"
  top: "conv3"
  name: "conv3"
  type: CONVOLUTION
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layers {
  bottom: "conv3"
  top: "conv3"
  name: "relu3"
  type: RELU
}
layers {
  bottom: "conv3"
  top: "conv4"
  name: "conv4"
  type: CONVOLUTION
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layers {
  bottom: "conv4"
  top: "conv4"
  name: "relu4"
  type: RELU
}
layers {
  bottom: "conv4"
  top: "conv5"
  name: "conv5"
  type: CONVOLUTION
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layers {
  bottom: "conv5"
  top: "conv5"
  name: "relu5"
  type: RELU
}
layers {
  bottom: "conv5"
  top: "pool5"
  name: "pool5"
  type: POOLING
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layers {
  bottom: "pool5"
  top: "fc6"
  name: "fc6"
  type: INNER_PRODUCT
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  inner_product_param {
    num_output: 4096
    weight_filler {
      type: "gaussian"
      std: 0.005
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layers {
  bottom: "fc6"
  top: "fc6"
  name: "relu6"
  type: RELU
}
layers {
  bottom: "fc6"
  top: "fc6"
  name: "drop6"
  type: DROPOUT
  dropout_param {
    dropout_ratio: 0.5
  }
}
layers {
  bottom: "fc6"
  top: "fc7"
  name: "fc7"
  type: INNER_PRODUCT
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  inner_product_param {
    num_output: 4096
    weight_filler {
      type: "gaussian"
      std: 0.005
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layers {
  bottom: "fc7"
  top: "fc7"
  name: "relu7"
  type: RELU
}
layers {
  bottom: "fc7"
  top: "fc7"
  name: "drop7"
  type: DROPOUT
  dropout_param {
    dropout_ratio: 0.5
  }
}
layers {
  bottom: "fc7"
  top: "fc8_ndsb"
  name: "fc8_ndsb"
  type: INNER_PRODUCT
  blobs_lr: 10
  blobs_lr: 20
  weight_decay: 1
  weight_decay: 0
  inner_product_param {
    num_output: 10
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layers {
  bottom: "fc8_ndsb"
  bottom: "label"
  name: "loss"
  type: SOFTMAX_LOSS
}
state {
  phase: TRAIN
}
I1221 11:59:31.966475 2052915968 net.cpp:67] Creating Layer data
I1221 11:59:31.966497 2052915968 net.cpp:356] data -> data
I1221 11:59:31.966526 2052915968 net.cpp:356] data -> label
I1221 11:59:31.966666 2052915968 net.cpp:96] Setting up data
I1221 11:59:31.966681 2052915968 image_data_layer.cpp:30] Opening file data/ndsb/train_small_file_list.txt
I1221 11:59:31.969635 2052915968 image_data_layer.cpp:45] A total of 2303 images.
I1221 11:59:31.972177 2052915968 image_data_layer.cpp:73] output data size: 50,3,227,227
I1221 11:59:31.972224 2052915968 base_data_layer.cpp:36] Loading mean file fromdata/ilsvrc12/imagenet_mean.binaryproto
I1221 11:59:32.002946 2052915968 net.cpp:103] Top shape: 50 3 227 227 (7729350)
I1221 11:59:32.002979 2052915968 net.cpp:103] Top shape: 50 1 1 1 (50)
I1221 11:59:32.002997 2052915968 net.cpp:67] Creating Layer conv1
I1221 11:59:32.003005 2052915968 net.cpp:394] conv1 <- data
I1221 11:59:32.003021 2052915968 net.cpp:356] conv1 -> conv1
I1221 11:59:32.003036 2052915968 net.cpp:96] Setting up conv1
I1221 11:59:32.003788 2052915968 net.cpp:103] Top shape: 50 96 55 55 (14520000)
I1221 11:59:32.003830 2052915968 net.cpp:67] Creating Layer relu1
I1221 11:59:32.003839 2052915968 net.cpp:394] relu1 <- conv1
I1221 11:59:32.003847 2052915968 net.cpp:345] relu1 -> conv1 (in-place)
I1221 11:59:32.003856 2052915968 net.cpp:96] Setting up relu1
I1221 11:59:32.003864 2052915968 net.cpp:103] Top shape: 50 96 55 55 (14520000)
I1221 11:59:32.003873 2052915968 net.cpp:67] Creating Layer pool1
I1221 11:59:32.003880 2052915968 net.cpp:394] pool1 <- conv1
I1221 11:59:32.003888 2052915968 net.cpp:356] pool1 -> pool1
I1221 11:59:32.003898 2052915968 net.cpp:96] Setting up pool1
I1221 11:59:32.003914 2052915968 net.cpp:103] Top shape: 50 96 27 27 (3499200)
I1221 11:59:32.003926 2052915968 net.cpp:67] Creating Layer norm1
I1221 11:59:32.003931 2052915968 net.cpp:394] norm1 <- pool1
I1221 11:59:32.003940 2052915968 net.cpp:356] norm1 -> norm1
I1221 11:59:32.003950 2052915968 net.cpp:96] Setting up norm1
I1221 11:59:32.003962 2052915968 net.cpp:103] Top shape: 50 96 27 27 (3499200)
I1221 11:59:32.003973 2052915968 net.cpp:67] Creating Layer conv2
I1221 11:59:32.003980 2052915968 net.cpp:394] conv2 <- norm1
I1221 11:59:32.003989 2052915968 net.cpp:356] conv2 -> conv2
I1221 11:59:32.004000 2052915968 net.cpp:96] Setting up conv2
I1221 11:59:32.010432 2052915968 net.cpp:103] Top shape: 50 256 27 27 (9331200)
I1221 11:59:32.010463 2052915968 net.cpp:67] Creating Layer relu2
I1221 11:59:32.010473 2052915968 net.cpp:394] relu2 <- conv2
I1221 11:59:32.010483 2052915968 net.cpp:345] relu2 -> conv2 (in-place)
I1221 11:59:32.010526 2052915968 net.cpp:96] Setting up relu2
I1221 11:59:32.010535 2052915968 net.cpp:103] Top shape: 50 256 27 27 (9331200)
I1221 11:59:32.010545 2052915968 net.cpp:67] Creating Layer pool2
I1221 11:59:32.010551 2052915968 net.cpp:394] pool2 <- conv2
I1221 11:59:32.010566 2052915968 net.cpp:356] pool2 -> pool2
I1221 11:59:32.010577 2052915968 net.cpp:96] Setting up pool2
I1221 11:59:32.010587 2052915968 net.cpp:103] Top shape: 50 256 13 13 (2163200)
I1221 11:59:32.010597 2052915968 net.cpp:67] Creating Layer norm2
I1221 11:59:32.010604 2052915968 net.cpp:394] norm2 <- pool2
I1221 11:59:32.010614 2052915968 net.cpp:356] norm2 -> norm2
I1221 11:59:32.010624 2052915968 net.cpp:96] Setting up norm2
I1221 11:59:32.010632 2052915968 net.cpp:103] Top shape: 50 256 13 13 (2163200)
I1221 11:59:32.010643 2052915968 net.cpp:67] Creating Layer conv3
I1221 11:59:32.010650 2052915968 net.cpp:394] conv3 <- norm2
I1221 11:59:32.010660 2052915968 net.cpp:356] conv3 -> conv3
I1221 11:59:32.010673 2052915968 net.cpp:96] Setting up conv3
I1221 11:59:32.029258 2052915968 net.cpp:103] Top shape: 50 384 13 13 (3244800)
I1221 11:59:32.029296 2052915968 net.cpp:67] Creating Layer relu3
I1221 11:59:32.029305 2052915968 net.cpp:394] relu3 <- conv3
I1221 11:59:32.029316 2052915968 net.cpp:345] relu3 -> conv3 (in-place)
I1221 11:59:32.029326 2052915968 net.cpp:96] Setting up relu3
I1221 11:59:32.029335 2052915968 net.cpp:103] Top shape: 50 384 13 13 (3244800)
I1221 11:59:32.029345 2052915968 net.cpp:67] Creating Layer conv4
I1221 11:59:32.029353 2052915968 net.cpp:394] conv4 <- conv3
I1221 11:59:32.029363 2052915968 net.cpp:356] conv4 -> conv4
I1221 11:59:32.029376 2052915968 net.cpp:96] Setting up conv4
I1221 11:59:32.042726 2052915968 net.cpp:103] Top shape: 50 384 13 13 (3244800)
I1221 11:59:32.042757 2052915968 net.cpp:67] Creating Layer relu4
I1221 11:59:32.042765 2052915968 net.cpp:394] relu4 <- conv4
I1221 11:59:32.042776 2052915968 net.cpp:345] relu4 -> conv4 (in-place)
I1221 11:59:32.042786 2052915968 net.cpp:96] Setting up relu4
I1221 11:59:32.042793 2052915968 net.cpp:103] Top shape: 50 384 13 13 (3244800)
I1221 11:59:32.042804 2052915968 net.cpp:67] Creating Layer conv5
I1221 11:59:32.042810 2052915968 net.cpp:394] conv5 <- conv4
I1221 11:59:32.042820 2052915968 net.cpp:356] conv5 -> conv5
I1221 11:59:32.042832 2052915968 net.cpp:96] Setting up conv5
I1221 11:59:32.051954 2052915968 net.cpp:103] Top shape: 50 256 13 13 (2163200)
I1221 11:59:32.051982 2052915968 net.cpp:67] Creating Layer relu5
I1221 11:59:32.051990 2052915968 net.cpp:394] relu5 <- conv5
I1221 11:59:32.052000 2052915968 net.cpp:345] relu5 -> conv5 (in-place)
I1221 11:59:32.052011 2052915968 net.cpp:96] Setting up relu5
I1221 11:59:32.052017 2052915968 net.cpp:103] Top shape: 50 256 13 13 (2163200)
I1221 11:59:32.052027 2052915968 net.cpp:67] Creating Layer pool5
I1221 11:59:32.052033 2052915968 net.cpp:394] pool5 <- conv5
I1221 11:59:32.052043 2052915968 net.cpp:356] pool5 -> pool5
I1221 11:59:32.052053 2052915968 net.cpp:96] Setting up pool5
I1221 11:59:32.052063 2052915968 net.cpp:103] Top shape: 50 256 6 6 (460800)
I1221 11:59:32.052075 2052915968 net.cpp:67] Creating Layer fc6
I1221 11:59:32.052083 2052915968 net.cpp:394] fc6 <- pool5
I1221 11:59:32.052093 2052915968 net.cpp:356] fc6 -> fc6
I1221 11:59:32.052104 2052915968 net.cpp:96] Setting up fc6
I1221 11:59:32.753568 2052915968 net.cpp:103] Top shape: 50 4096 1 1 (204800)
I1221 11:59:32.753633 2052915968 net.cpp:67] Creating Layer relu6
I1221 11:59:32.753720 2052915968 net.cpp:394] relu6 <- fc6
I1221 11:59:32.753739 2052915968 net.cpp:345] relu6 -> fc6 (in-place)
I1221 11:59:32.753752 2052915968 net.cpp:96] Setting up relu6
I1221 11:59:32.753761 2052915968 net.cpp:103] Top shape: 50 4096 1 1 (204800)
I1221 11:59:32.753773 2052915968 net.cpp:67] Creating Layer drop6
I1221 11:59:32.753780 2052915968 net.cpp:394] drop6 <- fc6
I1221 11:59:32.753792 2052915968 net.cpp:345] drop6 -> fc6 (in-place)
I1221 11:59:32.753803 2052915968 net.cpp:96] Setting up drop6
I1221 11:59:32.753820 2052915968 net.cpp:103] Top shape: 50 4096 1 1 (204800)
I1221 11:59:32.753880 2052915968 net.cpp:67] Creating Layer fc7
I1221 11:59:32.753890 2052915968 net.cpp:394] fc7 <- fc6
I1221 11:59:32.753901 2052915968 net.cpp:356] fc7 -> fc7
I1221 11:59:32.753914 2052915968 net.cpp:96] Setting up fc7
I1221 11:59:33.041023 2052915968 net.cpp:103] Top shape: 50 4096 1 1 (204800)
I1221 11:59:33.041059 2052915968 net.cpp:67] Creating Layer relu7
I1221 11:59:33.041066 2052915968 net.cpp:394] relu7 <- fc7
I1221 11:59:33.041074 2052915968 net.cpp:345] relu7 -> fc7 (in-place)
I1221 11:59:33.041081 2052915968 net.cpp:96] Setting up relu7
I1221 11:59:33.041086 2052915968 net.cpp:103] Top shape: 50 4096 1 1 (204800)
I1221 11:59:33.041100 2052915968 net.cpp:67] Creating Layer drop7
I1221 11:59:33.041103 2052915968 net.cpp:394] drop7 <- fc7
I1221 11:59:33.041108 2052915968 net.cpp:345] drop7 -> fc7 (in-place)
I1221 11:59:33.041115 2052915968 net.cpp:96] Setting up drop7
I1221 11:59:33.041120 2052915968 net.cpp:103] Top shape: 50 4096 1 1 (204800)
I1221 11:59:33.041127 2052915968 net.cpp:67] Creating Layer fc8_ndsb
I1221 11:59:33.041131 2052915968 net.cpp:394] fc8_ndsb <- fc7
I1221 11:59:33.041137 2052915968 net.cpp:356] fc8_ndsb -> fc8_ndsb
I1221 11:59:33.041146 2052915968 net.cpp:96] Setting up fc8_ndsb
I1221 11:59:33.041973 2052915968 net.cpp:103] Top shape: 50 10 1 1 (500)
I1221 11:59:33.042001 2052915968 net.cpp:67] Creating Layer loss
I1221 11:59:33.042008 2052915968 net.cpp:394] loss <- fc8_ndsb
I1221 11:59:33.042017 2052915968 net.cpp:394] loss <- label
I1221 11:59:33.042028 2052915968 net.cpp:356] loss -> (automatic)
I1221 11:59:33.042037 2052915968 net.cpp:96] Setting up loss
I1221 11:59:33.042058 2052915968 net.cpp:103] Top shape: 1 1 1 1 (1)
I1221 11:59:33.042073 2052915968 net.cpp:109]     with loss weight 1
I1221 11:59:33.042091 2052915968 net.cpp:170] loss needs backward computation.
I1221 11:59:33.042099 2052915968 net.cpp:170] fc8_ndsb needs backward computation.
I1221 11:59:33.042106 2052915968 net.cpp:170] drop7 needs backward computation.
I1221 11:59:33.042114 2052915968 net.cpp:170] relu7 needs backward computation.
I1221 11:59:33.042120 2052915968 net.cpp:170] fc7 needs backward computation.
I1221 11:59:33.042127 2052915968 net.cpp:170] drop6 needs backward computation.
I1221 11:59:33.042135 2052915968 net.cpp:170] relu6 needs backward computation.
I1221 11:59:33.042142 2052915968 net.cpp:170] fc6 needs backward computation.
I1221 11:59:33.042150 2052915968 net.cpp:170] pool5 needs backward computation.
I1221 11:59:33.042157 2052915968 net.cpp:170] relu5 needs backward computation.
I1221 11:59:33.042165 2052915968 net.cpp:170] conv5 needs backward computation.
I1221 11:59:33.042171 2052915968 net.cpp:170] relu4 needs backward computation.
I1221 11:59:33.042179 2052915968 net.cpp:170] conv4 needs backward computation.
I1221 11:59:33.042186 2052915968 net.cpp:170] relu3 needs backward computation.
I1221 11:59:33.042193 2052915968 net.cpp:170] conv3 needs backward computation.
I1221 11:59:33.042201 2052915968 net.cpp:170] norm2 needs backward computation.
I1221 11:59:33.042209 2052915968 net.cpp:170] pool2 needs backward computation.
I1221 11:59:33.042217 2052915968 net.cpp:170] relu2 needs backward computation.
I1221 11:59:33.042223 2052915968 net.cpp:170] conv2 needs backward computation.
I1221 11:59:33.042230 2052915968 net.cpp:170] norm1 needs backward computation.
I1221 11:59:33.042238 2052915968 net.cpp:170] pool1 needs backward computation.
I1221 11:59:33.042245 2052915968 net.cpp:170] relu1 needs backward computation.
I1221 11:59:33.042253 2052915968 net.cpp:170] conv1 needs backward computation.
I1221 11:59:33.042260 2052915968 net.cpp:172] data does not need backward computation.
I1221 11:59:33.042281 2052915968 net.cpp:467] Collecting Learning Rate and Weight Decay.
I1221 11:59:33.042294 2052915968 net.cpp:219] Network initialization done.
I1221 11:59:33.042301 2052915968 net.cpp:220] Memory required for data: 343009204
I1221 11:59:33.043426 2052915968 solver.cpp:151] Creating test net (#0) specified by net file: models/ndsb/train.prototxt
I1221 11:59:33.043509 2052915968 net.cpp:275] The NetState phase (1) differed from the phase (0) specified by a rule in layer data
I1221 11:59:33.043544 2052915968 net.cpp:39] Initializing net from parameters: 
name: "NDSB"
layers {
  top: "data"
  top: "label"
  name: "data"
  type: DATA
  image_data_param {
    source: "data/ndsb/test_small_file_list.txt"
    batch_size: 50
    new_height: 256
    new_width: 256
  }
  include {
    phase: TEST
  }
  transform_param {
    mirror: false
    crop_size: 227
    mean_file: "data/ilsvrc12/imagenet_mean.binaryproto"
  }
}
layers {
  bottom: "data"
  top: "conv1"
  name: "conv1"
  type: CONVOLUTION
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  convolution_param {
    num_output: 96
    kernel_size: 11
    stride: 4
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layers {
  bottom: "conv1"
  top: "conv1"
  name: "relu1"
  type: RELU
}
layers {
  bottom: "conv1"
  top: "pool1"
  name: "pool1"
  type: POOLING
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layers {
  bottom: "pool1"
  top: "norm1"
  name: "norm1"
  type: LRN
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layers {
  bottom: "norm1"
  top: "conv2"
  name: "conv2"
  type: CONVOLUTION
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  convolution_param {
    num_output: 256
    pad: 2
    kernel_size: 5
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layers {
  bottom: "conv2"
  top: "conv2"
  name: "relu2"
  type: RELU
}
layers {
  bottom: "conv2"
  top: "pool2"
  name: "pool2"
  type: POOLING
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layers {
  bottom: "pool2"
  top: "norm2"
  name: "norm2"
  type: LRN
  lrn_param {
    local_size: 5
    alpha: 0.0001
    beta: 0.75
  }
}
layers {
  bottom: "norm2"
  top: "conv3"
  name: "conv3"
  type: CONVOLUTION
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layers {
  bottom: "conv3"
  top: "conv3"
  name: "relu3"
  type: RELU
}
layers {
  bottom: "conv3"
  top: "conv4"
  name: "conv4"
  type: CONVOLUTION
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  convolution_param {
    num_output: 384
    pad: 1
    kernel_size: 3
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layers {
  bottom: "conv4"
  top: "conv4"
  name: "relu4"
  type: RELU
}
layers {
  bottom: "conv4"
  top: "conv5"
  name: "conv5"
  type: CONVOLUTION
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  convolution_param {
    num_output: 256
    pad: 1
    kernel_size: 3
    group: 2
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layers {
  bottom: "conv5"
  top: "conv5"
  name: "relu5"
  type: RELU
}
layers {
  bottom: "conv5"
  top: "pool5"
  name: "pool5"
  type: POOLING
  pooling_param {
    pool: MAX
    kernel_size: 3
    stride: 2
  }
}
layers {
  bottom: "pool5"
  top: "fc6"
  name: "fc6"
  type: INNER_PRODUCT
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  inner_product_param {
    num_output: 4096
    weight_filler {
      type: "gaussian"
      std: 0.005
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layers {
  bottom: "fc6"
  top: "fc6"
  name: "relu6"
  type: RELU
}
layers {
  bottom: "fc6"
  top: "fc6"
  name: "drop6"
  type: DROPOUT
  dropout_param {
    dropout_ratio: 0.5
  }
}
layers {
  bottom: "fc6"
  top: "fc7"
  name: "fc7"
  type: INNER_PRODUCT
  blobs_lr: 1
  blobs_lr: 2
  weight_decay: 1
  weight_decay: 0
  inner_product_param {
    num_output: 4096
    weight_filler {
      type: "gaussian"
      std: 0.005
    }
    bias_filler {
      type: "constant"
      value: 1
    }
  }
}
layers {
  bottom: "fc7"
  top: "fc7"
  name: "relu7"
  type: RELU
}
layers {
  bottom: "fc7"
  top: "fc7"
  name: "drop7"
  type: DROPOUT
  dropout_param {
    dropout_ratio: 0.5
  }
}
layers {
  bottom: "fc7"
  top: "fc8_ndsb"
  name: "fc8_ndsb"
  type: INNER_PRODUCT
  blobs_lr: 10
  blobs_lr: 20
  weight_decay: 1
  weight_decay: 0
  inner_product_param {
    num_output: 10
    weight_filler {
      type: "gaussian"
      std: 0.01
    }
    bias_filler {
      type: "constant"
      value: 0
    }
  }
}
layers {
  bottom: "fc8_ndsb"
  bottom: "label"
  name: "loss"
  type: SOFTMAX_LOSS
}
layers {
  bottom: "fc8_ndsb"
  bottom: "label"
  top: "accuracy"
  name: "accuracy"
  type: ACCURACY
  include {
    phase: TEST
  }
}
state {
  phase: TEST
}
I1221 11:59:33.044486 2052915968 net.cpp:67] Creating Layer data
I1221 11:59:33.044700 2052915968 net.cpp:356] data -> data
I1221 11:59:33.044728 2052915968 net.cpp:356] data -> label
I1221 11:59:33.044744 2052915968 net.cpp:96] Setting up data
I1221 11:59:33.044795 2052915968 data_layer.cpp:45] Opening leveldb 
F1221 11:59:33.045028 2052915968 data_layer.cpp:48] Check failed: status.ok() Failed to open leveldb 
IO error: /LOCK: Permission denied
*** Check failure stack trace: ***
    @        0x101920a6a  google::LogMessage::Fail()
    @        0x10191fc88  google::LogMessage::SendToLog()
    @        0x1019206fa  google::LogMessage::Flush()
    @        0x1019241e8  google::LogMessageFatal::~LogMessageFatal()
    @        0x101920f05  google::LogMessageFatal::~LogMessageFatal()
    @        0x1017bf4b7  caffe::DataLayer<>::DataLayerSetUp()
    @        0x1017b579e  caffe::BaseDataLayer<>::LayerSetUp()
    @        0x1017b6ede  caffe::BasePrefetchingDataLayer<>::LayerSetUp()
    @        0x1017f24fe  caffe::Net<>::Init()
    @        0x1017f178b  caffe::Net<>::Net()
    @        0x101809185  caffe::Solver<>::InitTestNets()
    @        0x101807f57  caffe::Solver<>::Init()
    @        0x101807dfc  caffe::Solver<>::Solver()
    @        0x10175fe60  caffe::GetSolver<>()
    @        0x10175d83e  train()
    @        0x10175f871  main
    @     0x7fff9778b5c9  start
    @                0x6  (unknown)
zsh: abort      build/tools/caffe train -solver models/ndsb/solver.prototxt -weights 

All 5 comments

I found my error: for the data layer in the test phase, I accidentally had type: DATA instead of type: IMAGE_DATA. Because the layer then had an image_data_param -> source but no data_param block, it failed with the error above. In addition to the suggestions above, it would be helpful if the layer type were checked against the parameters given; if image_data_param is present but the layer type is not IMAGE_DATA, a warning or error could be emitted.
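
For reference, the corrected test-phase data layer is below; the parameters are exactly those shown in the test net dump above, with only the layer type changed.

layers {
  top: "data"
  top: "label"
  name: "data"
  type: IMAGE_DATA  # was DATA, which reads a leveldb via data_param
  image_data_param {
    source: "data/ndsb/test_small_file_list.txt"
    batch_size: 50
    new_height: 256
    new_width: 256
  }
  include {
    phase: TEST
  }
  transform_param {
    mirror: false
    crop_size: 227
    mean_file: "data/ilsvrc12/imagenet_mean.binaryproto"
  }
}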

Yes, more exhaustive checks of model definitions could be done, but it is difficult to catch every case without adding bloat to the code. Thanks for following up once you found your problem.

I ran into exactly the same problem as the OP, and the error message doesn't help pinpoint it (other than by googling it, which thankfully led me here).

If it improves the user experience, it's not "bloat". It's important code. :)

I think it's possible to do a reasonable set of checks, without catching every case (or using a type system) and without bloating the code, especially if the checks run before training rather than during it. That would have the added advantage of catching common errors before a very long training cycle instead of during or after it; see the sketch below.
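
For example, a single pre-run pass over the parsed NetParameter could flag exactly the mismatch that hit the OP. A rough sketch, assuming the old layers {} / LayerType schema shown in the log above (the proto field and enum names here are assumptions, not verified against the current definitions):

// Sketch: warn when a layer carries parameters that its declared type
// will silently ignore. Run once after parsing, before any layer setup.
void CheckLayerParamsMatchTypes(const caffe::NetParameter& net) {
  for (int i = 0; i < net.layers_size(); ++i) {
    const caffe::LayerParameter& layer = net.layers(i);
    if (layer.has_image_data_param() &&
        layer.type() != caffe::LayerParameter_LayerType_IMAGE_DATA) {
      LOG(WARNING) << "Layer '" << layer.name() << "' has an image_data_param"
                   << " but its type is not IMAGE_DATA; it will be ignored.";
    }
    if (layer.has_data_param() &&
        layer.type() != caffe::LayerParameter_LayerType_DATA) {
      LOG(WARNING) << "Layer '" << layer.name() << "' has a data_param"
                   << " but its type is not DATA; it will be ignored.";
    }
  }
}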

Another way to approach this is, "Don't try to anticipate 'every case' upfront (beyond the common sense cases), but do add code for situations where people were required to open an issue (because it will prevent anyone from needing to open it again)."
