Quick question: (tensorflow 1.4/Keras 2.2)
Is there any difference between tf.matmul and keras dot function?
Seems to me that the dot function needs a specific axis, while the matmul function only
needs the two matrices. Is there any other difference? Any examples?
Thanks!
They actually both work without specifying an axis. TensorFlow uses float64 by default, whereas Keras uses float32. Using `np.random.seed(42)` and `r = np.random.rand(4, 5)`,

```python
x = tf.constant(r)
cov = tf.matmul(x, tf.transpose(x))
```

returns `Tensor("MatMul:0", shape=(4, 4), dtype=float64)`, while

```python
x = K.constant(r)
cov = K.dot(x, K.transpose(x))
```

returns `Tensor("MatMul_1:0", shape=(4, 4), dtype=float32)`.
Note that the two results differ by about 1e-7, as pointed out in https://stackoverflow.com/a/44100246/6647539
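That ~1e-7 gap can be reproduced without TensorFlow at all; a minimal numpy sketch of where the difference between a float64 and a float32 product comes from:

```python
import numpy as np

np.random.seed(42)
r = np.random.rand(4, 5)

# Same product in float64 (tf.matmul's default here) and in float32
# (K.constant / K.dot's default):
cov64 = r @ r.T
cov32 = r.astype(np.float32) @ r.astype(np.float32).T

# The two results agree only up to single precision:
delta = np.abs(cov64 - cov32.astype(np.float64)).max()
print(delta)  # on the order of 1e-7
```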
Thanks for the info! Why I'm asking: my model takes the following:

```python
input_T = Reshape((3, 3))(x)
# forward net
# K.batch_dot(x[0], BatchM, axes=[1, 2])
g = K.batch_dot(input, input_T, axes=[1, 2])
g = Convolution1D(64, 1, input_shape=(p.numPoints, 3), activation='relu')(g)
```

So I want to multiply `input` (28800, 3) with `input_T` (3, 3), but I'm receiving the error:

```
ValueError: Dimensions must be equal, but are 28800 and 3 for 'MatMul_3' (op: 'BatchMatMul') with input shapes: [?,28800,3], [?,3,3].
```
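The error comes from the `axes` argument: `axes=[1, 2]` asks `batch_dot` to contract axis 1 of the first tensor (length 28800) with axis 2 of the second (length 3), which don't match. A minimal numpy sketch of the contraction you presumably want (axis 2 of the first against axis 1 of the second, i.e. `axes=[2, 1]`), assuming a batch size of 1 for illustration:

```python
import numpy as np

batch, n = 1, 28800
inp = np.random.rand(batch, n, 3)    # like the [?, 28800, 3] input
inp_T = np.random.rand(batch, 3, 3)  # like the [?, 3, 3] input_T

# Contract the last axis of inp (length 3) with axis 1 of inp_T
# (length 3) -- what K.batch_dot(input, input_T, axes=[2, 1]) computes:
g = np.matmul(inp, inp_T)
print(g.shape)  # (1, 28800, 3)
```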
while

```python
def cov_tf(x_val, y_val):
    x = tf.constant(x_val)
    y = tf.constant(y_val)
    cov = tf.matmul(x, y)
    return cov.eval(session=tf.Session())

def cov_keras(x_val, y_val):
    x = K.constant(x_val)
    y = K.constant(y_val)
    cov = K.dot(x, y)
    return cov.eval(session=tf.Session())

if __name__ == '__main__':
    x = np.random.rand(28800, 3)
    y = np.random.rand(3, 3)
    print(x.shape)
    print(cov_tf(x, y).shape)
    print(cov_keras(x, y).shape)
    delta = np.abs(cov_tf(x, y) - cov_keras(x, y)).max()
    print('Maximum absolute difference:', delta)
```

works:

```
(28800, 3)
```
Higher dimensions?

```python
a.shape                 # (125, 125, 1, 3)
b.shape                 # (125, 125, 3, 1)
np.matmul(a, b).shape   # (125, 125, 1, 1)
K.dot(a, b).shape       # (125, 125, 1, 125, 125, 1)
```

What is Keras' equivalent?
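For the `np.matmul` behaviour on >2-D tensors, `K.batch_dot` (or `tf.matmul`, which treats leading dimensions as batch dimensions) is the equivalent; `K.dot` on nD tensors instead behaves like a tensordot over the last axis of the first tensor and the second-to-last of the second, which is where the large outer shape above comes from. A numpy sketch of the two behaviours, using smaller stand-in sizes so the tensordot result stays small:

```python
import numpy as np

# Smaller stand-ins for the (125, 125, 1, 3) / (125, 125, 3, 1) tensors:
a = np.random.rand(4, 4, 1, 3)
b = np.random.rand(4, 4, 3, 1)

# Batched matmul: the leading (4, 4) dims are treated as batch dims.
# This is what np.matmul and tf.matmul do for >2-D inputs.
batched = np.matmul(a, b)
print(batched.shape)  # (4, 4, 1, 1)

# K.dot on nD tensors behaves like a tensordot over the last axis of
# the first tensor and the second-to-last of the second, yielding the
# "outer" shape seen with K.dot(a, b):
outer = np.tensordot(a, b, axes=([3], [2]))
print(outer.shape)  # (4, 4, 1, 4, 4, 1)
```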
I ended up using `tf.matmul` by importing tf.
It doesn't matter; either way we are doing tensor operations, and that's completely fine.
I'm facing the same issue as @zekedran: higher dimensions are a problem in Keras. Clarifying proper use would be appreciated.