I recently tried using MemoryDataLayer for training in pycaffe, but I faced this error:
```
W0405 20:53:19.679730 4640 memory_data_layer.cpp:90] MemoryData does not transform array data on Reset()
I0405 20:53:19.713727 4640 solver.cpp:337] Iteration 0, Testing net (#0)
I0405 20:53:19.719229 4640 net.cpp:685] Ignoring source layer accuracy_training
F0405 20:53:19.719229 4640 memory_data_layer.cpp:110] Check failed: data_ MemoryDataLayer needs to be initalized by calling Reset
*** Check failure stack trace: ***
```
The Reset() method is accessible in the C++ API; however, in pycaffe I couldn't find any such method. I searched a lot, and many people have run into the exact same problem with no solution. They all seemed to abandon this approach altogether.
My question is: is this layer abandoned/deprecated, or was it never intended to be used in pycaffe?
Shouldn't there at least be a documentation page specific to pycaffe? This isn't even mentioned in Caffe's official documentation for MemoryDataLayer.
### Steps to reproduce
```
solver = caffe.get_solver('examples/testbed/solver.prototxt')
net = solver.net
...
net.set_input_arrays(batch, label_train)
```
Operating system: Windows 10 x64
Compiler: Visual Studio 2013/2015
CUDA version (if applicable): 8.0
CUDNN version (if applicable): 5.1
BLAS:
Python or MATLAB version (for pycaffe and matcaffe respectively): 2.7
Yeah, it's discouraged to use MemoryDataLayer from Python. Using it also transfers memory ownership from Python to C++ through the Boost bindings and therefore causes memory leaks: the memory is only released once the network object is destructed in Python, so if you're training a network for a long time, you'll run out of memory.
It's encouraged to use InputLayer instead, where you can just assign data from a numpy array into the memory blobs.
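For reference, here is a minimal sketch of that approach. It assumes a solver prototxt whose train net declares an `Input` layer with `data` and `label` tops sized to the batch; the prototxt path, blob names, and the `load_next_batch()` loader are placeholders for illustration, not part of Caffe's API.
```
import caffe

# Placeholder prototxt path; the net is assumed to declare an Input layer, e.g.
#   layer {
#     name: "data"  type: "Input"  top: "data"  top: "label"
#     input_param { shape { dim: 64 dim: 3 dim: 28 dim: 28 }
#                   shape { dim: 64 } }
#   }
solver = caffe.get_solver('examples/testbed/solver_input.prototxt')

for it in range(1000):
    batch, label_train = load_next_batch()  # hypothetical data loader

    # Copy the numpy arrays into Caffe-owned blob memory. The [...] assignment
    # writes in place, so Python keeps ownership of its own arrays and nothing
    # is handed across the Boost boundary.
    solver.net.blobs['data'].data[...] = batch
    solver.net.blobs['label'].data[...] = label_train

    solver.step(1)  # forward/backward/update on the data just copied in
```
This mirrors what `net.set_input_arrays()` does for MemoryData, but the copy stays under Python's control, which is why it avoids the leak described above.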
This should certainly at least be documented, if not fixed. This issue can serve as a reminder.
It is essentially deprecated. Closing according to #5528.