I'm trying to ray.put a Keras model as part of running DQN, where experience replay rollouts are gathered in parallel via Ray.
import ray
from keras.layers import Dense
from keras.models import Sequential
from keras.optimizers import SGD

# Minimal Q-network standing in for the real DQN model.
Q = Sequential()
Q.add(Dense(1, input_shape=(1,)))
Q.compile(optimizer=SGD(), loss="mse", metrics=["acc"])

ray.init()
Q_id = ray.put(Q)  # fails here
I get this traceback:
WARNING: Serializing objects of type <class 'keras.engine.sequential.Sequential'> by expanding them as dictionaries of their fields. This behavior may be incorrect in some cases.
WARNING: Serializing objects of type <class 'keras.optimizers.Adam'> by expanding them as dictionaries of their fields. This behavior may be incorrect in some cases.
WARNING: Serializing objects of type <class 'tensorflow.python.framework.ops.Tensor'> by expanding them as dictionaries of their fields. This behavior may be incorrect in some cases.
WARNING: Serializing objects of type <class 'keras.engine.input_layer.InputLayer'> by expanding them as dictionaries of their fields. This behavior may be incorrect in some cases.
WARNING: Serializing objects of type <class 'keras.layers.core.Dense'> by expanding them as dictionaries of their fields. This behavior may be incorrect in some cases.
WARNING: Serializing objects of type <class 'keras.engine.base_layer.Node'> by expanding them as dictionaries of their fields. This behavior may be incorrect in some cases.
WARNING: Serializing objects of type <class 'tensorflow.python.ops.variables.Variable'> by expanding them as dictionaries of their fields. This behavior may be incorrect in some cases.
WARNING: Serializing objects of type <class 'keras.engine.base_layer.InputSpec'> by expanding them as dictionaries of their fields. This behavior may be incorrect in some cases.
WARNING: Serializing objects of type <class 'keras.initializers.VarianceScaling'> by expanding them as dictionaries of their fields. This behavior may be incorrect in some cases.
WARNING: Serializing objects of type <class 'keras.initializers.Zeros'> by expanding them as dictionaries of their fields. This behavior may be incorrect in some cases.
WARNING: Serializing objects of type <class 'tensorflow.python.framework.ops.Operation'> by expanding them as dictionaries of their fields. This behavior may be incorrect in some cases.
WARNING: Serializing objects of type <class 'tensorflow.python.framework.dtypes.DType'> by expanding them as dictionaries of their fields. This behavior may be incorrect in some cases.
WARNING: Serializing objects of type <class 'tensorflow.python.framework.tensor_shape.TensorShape'> by expanding them as dictionaries of their fields. This behavior may be incorrect in some cases.
Traceback (most recent call last):
  File "fail.py", line 13, in <module>
    Q_id = ray.put(Q)
  File "/Users/alokbeniwal/.local/share/virtualenvs/kdd--yfWOkvn/lib/python3.6/site-packages/ray/worker.py", line 2798, in put
    worker.put_object(object_id, value)
  File "/Users/alokbeniwal/.local/share/virtualenvs/kdd--yfWOkvn/lib/python3.6/site-packages/ray/worker.py", line 368, in put_object
    self.store_and_register(object_id, value)
  File "/Users/alokbeniwal/.local/share/virtualenvs/kdd--yfWOkvn/lib/python3.6/site-packages/ray/worker.py", line 303, in store_and_register
    serialization_context=self.serialization_context)
  File "pyarrow/_plasma.pyx", line 395, in pyarrow._plasma.PlasmaClient.put
  File "pyarrow/serialization.pxi", line 338, in pyarrow.lib.serialize
  File "pyarrow/error.pxi", line 89, in pyarrow.lib.check_status
pyarrow.lib.ArrowNotImplementedError: This object exceeds the maximum recursion depth. It may contain itself recursively.
Tested on a coworker's computer too with the same result.
Is this normally possible at all (i.e., can the model be pickled)?
For TensorFlow models we usually instantiate model replicas on separate actors and then use model.get_weights / model.set_weights to move the weights between them; a sketch of that pattern is below.
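For concreteness, here is a minimal sketch of that replica pattern, assuming the same toy model as in the snippet above (the RolloutWorker actor name, build_q_network helper, and worker count are just illustrative, not an official API). Each actor rebuilds the model in its own process, and only the weight arrays, which are plain numpy arrays that Ray serializes without trouble, ever cross process boundaries.

import ray
from keras.layers import Dense
from keras.models import Sequential
from keras.optimizers import SGD

def build_q_network():
    # Rebuild the model inside each process instead of shipping the model object.
    model = Sequential()
    model.add(Dense(1, input_shape=(1,)))
    model.compile(optimizer=SGD(), loss="mse")
    return model

@ray.remote
class RolloutWorker:
    def __init__(self):
        self.model = build_q_network()

    def get_weights(self):
        return self.model.get_weights()

    def set_weights(self, weights):
        self.model.set_weights(weights)

ray.init()
workers = [RolloutWorker.remote() for _ in range(4)]

# Broadcast the driver's current weights to every worker; the object ID passed
# as an argument is resolved to the weight list when the actor method runs.
driver_model = build_q_network()
weights_id = ray.put(driver_model.get_weights())
ray.get([w.set_weights.remote(weights_id) for w in workers])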
Pickle can't handle it (TypeError: can't pickle _thread.RLock objects). My bad for not checking that; I assumed Ray would handle Keras since it's a pretty common library.
It would probably make sense to have a series of examples in the docs showing how to interact with different DL frameworks.
Closing this as this seems to be resolved.
What does this mean? I get the same error on the same version of ray. Is it resolved on master?
It just means that instead of serializing the TF/Keras model itself, you should instantiate model replicas on each actor and pass the weights around; a quick sketch of the weight transfer is below.
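As a quick illustration (a sketch, reusing the Q model from the original snippet): the weights are a list of plain numpy arrays, so they go through the object store fine even though the model object does not.

weights = Q.get_weights()            # list of numpy arrays
weights_id = ray.put(weights)        # succeeds, unlike ray.put(Q)
Q.set_weights(ray.get(weights_id))   # load them back into a local replica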