Currently there appears to be no official support for models with multiple input layers in the Scikit-Learn wrappers:
https://keras.io/scikit-learn-api/
You can use Sequential Keras models (single-input only) as part of your Scikit-Learn workflow via the wrappers found at keras.wrappers.scikit_learn.py.
This has been referenced in issue #6451, but it would be very helpful to have support for models with multiple inputs so that they can be used easily with sklearn's GridSearchCV.
- [x] Check that you are up-to-date with the master branch of Keras. You can update with:
  `pip install git+git://github.com/keras-team/keras.git --upgrade --no-deps`
- [x] If running on TensorFlow, check that you are up-to-date with the latest version. The installation instructions can be found here.
- [ ] If running on Theano, check that you are up-to-date with the master branch of Theano. You can update with:
  `pip install git+git://github.com/Theano/Theano.git --upgrade --no-deps`
- [x] Provide a link to a GitHub Gist of a Python script that can reproduce your issue (or just copy the script here if it is short).
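For concreteness, here is a minimal sketch of the kind of script that hits the limitation (the model, shapes, and data below are hypothetical, not the original reproduction):

```python
import numpy as np
from keras.models import Model
from keras.layers import Input, Dense, concatenate
from keras.wrappers.scikit_learn import KerasRegressor
from sklearn.model_selection import GridSearchCV

def build_model(optimizer='sgd'):
    # a minimal two-input functional model
    a = Input(shape=(4,), name='input_one')
    b = Input(shape=(3,), name='input_two')
    out = Dense(1)(concatenate([a, b]))
    model = Model(inputs=[a, b], outputs=out)
    model.compile(optimizer, 'mse')
    return model

X = [np.random.rand(100, 4), np.random.rand(100, 3)]  # one array per input
y = np.random.rand(100)

search = GridSearchCV(KerasRegressor(build_fn=build_model),
                      param_grid={'optimizer': ['sgd', 'adam']})
# fails: scikit-learn expects X to be a single row-indexable array,
# so it sees a 2-element list here rather than 100 samples
search.fit(X, y)
```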
Is there any possible workaround until the issue is resolved?
@adityagujral In my own project, I've been using these scrappily-put-together, not-well-tested classes:
```python
from sklearn.utils.metaestimators import _BaseComposition
from sklearn.base import TransformerMixin

class KerasInputFormatter(_BaseComposition, TransformerMixin):
    """Fits a list of (name, transformer) pairs and emits a dict of named inputs."""

    def __init__(self, transformers):
        self.transformers = transformers

    def fit(self, X, y=None):
        # fit each transformer in place on the same X
        for n, t in self.transformers:
            t.fit(X, y)
        return self

    def transform(self, X):
        # one named entry per transformer, matching the model's input names
        return {n: t.transform(X) for n, t in self.transformers}

    def get_params(self, deep=True):
        # delegate to _BaseComposition so nested transformer params stay tunable
        return self._get_params('transformers', deep=deep)

    def set_params(self, **kwargs):
        self._set_params('transformers', **kwargs)
        return self
```
If you give it a list of transformers, it acts very much like a feature union:
```python
keras_input = KerasInputFormatter([
    ('input_one', input_one_transformer),
    ('input_two', input_two_transformer),
])
```
When you call `.transform`, however, it yields a Python dictionary containing the individual inputs keyed by name, as Keras expects.
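For illustration, the two transformers might just carve column blocks out of a single 2-D feature matrix (the column split and pipeline below are hypothetical, not from the original comment):

```python
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import FunctionTransformer, StandardScaler

# hypothetical split: the first 4 columns feed input_one, the rest input_two
input_one_transformer = make_pipeline(
    FunctionTransformer(lambda X: X[:, :4]), StandardScaler())
input_two_transformer = make_pipeline(
    FunctionTransformer(lambda X: X[:, 4:]), StandardScaler())

batch = keras_input.fit_transform(X)
# batch == {'input_one': <(n, 4) array>, 'input_two': <(n, rest) array>}
```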
After that, I've been manually wrapping multi-input keras models:
```python
from sklearn.base import BaseEstimator
from keras.models import Model
from keras.layers import Input  # ...plus whatever other layers the model needs

class KerasModel(BaseEstimator):
    def __init__(self, optimizer='sgd'):
        self.optimizer = optimizer  # an example of a tunable hyperparameter

    def fit(self, X, y):
        # X is the dict produced by KerasInputFormatter; infer input shapes from it
        input_one = Input(name='input_one', shape=X['input_one'].shape[1:])
        input_two = Input(name='input_two', shape=X['input_two'].shape[1:])
        output = ...  # define model here
        self.model = Model(inputs=[input_one, input_two], outputs=output)
        self.model.compile(self.optimizer, 'mse')
        self.model.fit(X, y)  # Keras accepts a dict keyed by input name
        return self

    def predict(self, X):
        return self.model.predict(X)
```
So in a pipeline:
```python
make_pipeline(keras_input, KerasModel())
```
caveat: it assumes that `.fit` on each of the transformers mutates the transformer itself. That does seem to be the de facto standard in sklearn, but I can't find it written down anywhere. You may also want to inherit from `RegressorMixin` or `ClassifierMixin` for your `KerasModel` if your code relies on those interfaces. `RandomizedSearchCV` seems to be perfectly happy with the above, however.
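For example, a hyperparameter search over the pipeline might look like this (`X_train`/`y_train` are hypothetical, and the `kerasmodel__` prefix follows make_pipeline's lowercased-class-name step naming):

```python
from sklearn.model_selection import RandomizedSearchCV
from sklearn.pipeline import make_pipeline

pipe = make_pipeline(keras_input, KerasModel())
search = RandomizedSearchCV(
    pipe,
    param_distributions={'kerasmodel__optimizer': ['sgd', 'adam', 'rmsprop']},
    n_iter=3,
    cv=3,
    # explicit scoring, since KerasModel defines no .score of its own
    # (inheriting RegressorMixin would provide a default one)
    scoring='neg_mean_squared_error',
)
# CV splits the raw 2-D X_train by rows; the dict of named inputs is only
# built inside each fold when KerasInputFormatter.transform runs
search.fit(X_train, y_train)
```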
Also the same as closed issue #2748.
I would very much like this functionality.
Also have an interest in this, and would like to know if anybody has done relevant work in a forked keras or keras-contrib.
I have the same issue. Any solution on this?
This issue is critical, especially when using Siamese networks.
Facing the same issue with multi-modal models.
Here is a simpler workaround: https://stackoverflow.com/questions/56824968/grid-search-for-keras-with-multiple-inputs/62512554#62512554
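The gist of that kind of workaround (a rough sketch under my own assumptions, not a verbatim copy of the linked answer): keep X as one 2-D array so the stock wrapper and scikit-learn's row-wise CV splitting stay happy, then slice it back into the model's inputs inside the wrapper:

```python
from keras.wrappers.scikit_learn import KerasRegressor

class MultiInputKerasRegressor(KerasRegressor):
    # hypothetical: the first 4 columns belong to input_one, the rest to input_two
    split_at = 4

    def _split(self, X):
        return [X[:, :self.split_at], X[:, self.split_at:]]

    def fit(self, X, y, **kwargs):
        # the stock wrapper passes x straight through to Model.fit,
        # so a list of arrays is accepted here
        return super().fit(self._split(X), y, **kwargs)

    def predict(self, X, **kwargs):
        return super().predict(self._split(X), **kwargs)
```

GridSearchCV then only ever sees the single concatenated array and can index it by rows as usual.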