I am a newbie in Keras. My topmost layer is an _(n, 1)_ vector, and I am trying to get all of its 2-way interactions. I attempt to do so by multiplying the layer by its transpose using a `Lambda` layer:
```python
Lambda(lambda x: K.dot(K.transpose(x), x), output_shape=(n, n))
```
Unfortunately, this doesn't work because at runtime (surprisingly, to me) the first dimension of the tensor my layer receives is the batch of datapoints, so `x` is actually `(batch_size, n)` rather than `(n, 1)`.
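For example, a quick check in the backend (with hypothetical sizes) shows what I mean:

```python
import numpy as np
from keras import backend as K

batch_size, n = 4, 3                           # hypothetical sizes
x = K.variable(np.random.rand(batch_size, n))  # what the Lambda actually receives
y = K.dot(K.transpose(x), x)
print(K.int_shape(y))  # (3, 3): a single matrix contracted over the batch, not one per datapoint
```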
I am working with a Sequential model, but I would be happy to switch to the functional API if that helps. I can share more code if needed.
Thank you for your patience and help.
Try `(n, n)` -> `(, n, n)`.
It is an interesting idea to measure the first-order interactions of two vectors. You are building some prior knowledge into the model: I take it you have independent objects with n features each that might have interaction effects on the final output.
But why don't you put them into one 2n vector? Maybe because the number of degrees of freedom would be too large to train? Usually, if your DOF is too large compared to your available datapoints, the training process is never going to go well.
Thank you for your suggestion, @pswpswpsw. Sadly, it didn't work.
The snippet I shared executes but doesn't return the correct values, and with your change the code raises a syntax error.
(Re: your question, note that I'm trying to get the _n^2_ interactions, not just the _2n_ features... I'm working on an implementation of matrix factorization :) )
I couldn't do it with Keras' backend functions because I wasn't able to specify axes correctly using `K.dot`, and `K.transpose` doesn't take an `axes` argument. In case anybody cares, this is the solution:
```python
import tensorflow as tf
from keras.layers import Lambda

def multiply(x, n):
    x_prime = tf.reshape(x, (-1, n, 1))                  # (batch, n, 1) column vectors
    x_transpose = tf.transpose(x_prime, perm=[0, 2, 1])  # (batch, 1, n) row vectors
    # batched outer product x x^T -> (batch, n, n); tf.matmul replaces the deprecated tf.batch_matmul
    return tf.matmul(x_prime, x_transpose)

Lambda(lambda x: multiply(x, n), output_shape=(n, n))
```
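For completeness, here is a minimal sketch of wiring this into a model, reusing the `multiply` helper above; the upstream `Dense` layer and the sizes are just hypothetical placeholders:

```python
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Lambda

n = 3
model = Sequential([
    Dense(n, input_shape=(8,)),  # hypothetical upstream layer emitting n features
    Lambda(lambda x: multiply(x, n), output_shape=(n, n)),
])
print(model.predict(np.random.rand(2, 8)).shape)  # (2, 3, 3): one interaction matrix per sample
```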
Have you tried `K.batch_dot`?
I tried, but I wasn't able to make it work :( ...
Assuming the input tensor shape is `(batch_size, n, 1)` and the required output shape is `(batch_size, n, n)`:

```python
Lambda(lambda x: K.batch_dot(x, x, axes=(2, 2)), output_shape=lambda s: (s[0], s[1], s[1]))
```

P.S. Not tested.
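For anyone who wants to verify it, a quick sanity check along these lines should do (hypothetical sizes; `out` should match the per-sample outer products):

```python
import numpy as np
from keras import backend as K

a = np.random.rand(2, 3, 1)  # (batch_size, n, 1) with hypothetical sizes
out = K.eval(K.batch_dot(K.variable(a), K.variable(a), axes=(2, 2)))
print(out.shape)                                   # expect (2, 3, 3)
print(np.allclose(out, a @ a.transpose(0, 2, 1)))  # expect True
```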