I have an issue when I train on my custom data in Windows 10: it shows the error "Error in load_data_detection() - OpenCV" when loading the images. If I just test with pretrained weights instead of training, it detects the results correctly.
I have tried different versions of OpenCV, including 3.4.5, 3.4.10, and 4.1.0, and all of them fail. My CUDA version is 10.0 with VS2017.
How can I resolve this "Error in load_data_detection() - OpenCV" error when training on custom data?
Hi @asdsdlgl, did you manage to solve this problem? I am also stuck here... please help. All my images and their annotations are in the correct paths, but I still don't know why this is happening. @AlexeyAB
I am using this command: darknet.exe detector train data/drone.data cfg/yolov4-tiny_drone.cfg yolov4-tiny.conv.29, and the results look like this:

Hi, I am also stuck on this error. Does anyone have a solution?
OK, the error is resolved when I regenerated the train and valid .txt files, writing the absolute path of every image file. Here is the code:
from os import listdir
from os.path import isfile, join

customPath = 'Your/abs/path/'
imageDir = join(customPath, 'DET-val/data')
onlyfiles = [f for f in listdir(imageDir) if isfile(join(imageDir, f))]

counter = 0
with open(join(customPath, 'val.lists'), 'w') as listFile:
    for eachFile in onlyfiles:
        if eachFile.endswith('.jpg'):
            counter += 1
            # write the absolute path of the image
            listFile.write(join(imageDir, eachFile) + '\n')
print(counter)
Hi! I am a newbie and I am getting the same error. I use the code below to generate the train and test files; could you help me by making changes to it?
import os
import random

imgspath = 'C:/yolo_v4/yolo_v4_mask_detection/darknet/build/darknet/x64/data/obj'
path = 'data/obj/'

images = []
for i in os.listdir(imgspath):
    temp = path + i
    images.append(temp)

# train and test split... adjust it if necessary
trainlen = round(len(images) * .80)
testlen = round(len(images) * .20)
#print('total, train, test dataset size -', trainlen + testlen, trainlen, testlen)

random.shuffle(images)
test = images[:testlen]
train = images[testlen:]

with open('train.txt', 'w') as f:
    for item in train:
        f.write("%s\n" % item)

with open('test.txt', 'w') as f:
    for item in test:
        f.write("%s\n" % item)
@sharoseali
I have the same problem and couldn't solve it yet. Could anyone help me, please?
I have the same problem here. All I did was DOUBLE CHECK the paths of the images.
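To double-check the paths programmatically, here is a minimal sketch that reports which entries of a darknet train/valid list file are missing on disk. The list file name is whatever your .data file points to; "train.txt" below is just an example.

```python
import os

def check_list(list_path):
    """Return the image paths listed in list_path that do not exist on disk."""
    with open(list_path) as f:
        paths = [line.strip() for line in f if line.strip()]
    # any entry left in this list is a path darknet will fail to load
    return [p for p in paths if not os.path.isfile(p)]
```

Usage: print(check_list("train.txt")) — an empty list means every entry resolves; anything printed is a path to fix before training.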
Hi, I had the same problem and I solved it by cloning darknet into a different directory (and compiling and executing there). I'm using c:\darknet as my base directory and now everything works. In my case I suspect it was related to my directory names containing spaces.
I also rechecked that all path separators were written as \ and not as /.
I hope it helps.
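To catch the spaces issue programmatically, a quick sketch that flags list-file entries containing spaces (the list file name is an assumption; use the one referenced by your .data file):

```python
def paths_with_spaces(list_path):
    """Return entries of a darknet list file that contain spaces,
    which the loader can mishandle on Windows."""
    with open(list_path) as f:
        return [line.strip() for line in f if " " in line.strip()]
```

Any path this returns is a candidate for moving the data (or the whole darknet checkout) to a directory without spaces in its name.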
Because there is no data.