I have two NumPy arrays, x with shape (m, i) and y with shape (m, j) (so the number of rows is the same). I would like to multiply each column of x with each column of y element-wise, so that the result has shape (m, i*j).
Example:
import numpy as np
np.random.seed(1)
x = np.random.randint(0, 2, (10, 3))
y = np.random.randint(0, 2, (10, 2))
This creates the following two arrays, x:
array([[1, 1, 0],
       [0, 1, 1],
       [1, 1, 1],
       [0, 0, 1],
       [0, 1, 1],
       [0, 0, 1],
       [0, 0, 0],
       [1, 0, 0],
       [1, 0, 0],
       [0, 1, 0]])
and y:
array([[0, 0],
       [1, 1],
       [1, 1],
       [1, 0],
       [0, 0],
       [1, 1],
       [1, 1],
       [1, 1],
       [0, 1],
       [1, 0]])
Now the result should be:
array([[0, 0, 0, 0, 0, 0],
       [0, 0, 1, 1, 1, 1],
       [1, 1, 1, 1, 1, 1],
       [0, 0, 0, 0, 1, 0],
       [0, 0, 0, 0, 0, 0],
       [0, 0, 0, 0, 1, 1],
       [0, 0, 0, 0, 0, 0],
       [1, 1, 0, 0, 0, 0],
       [0, 1, 0, 0, 0, 0],
       [0, 0, 1, 0, 0, 0]])
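To spell out the ordering: column a*j + b of the result is the element-wise product x[:, a] * y[:, b]. For example, with the arrays above:
# column index 4 corresponds to a=2, b=0, i.e. x's third column times y's first
print(x[:, 2] * y[:, 0])  # [0 1 1 1 0 1 0 0 0 0], matching the fifth column of the result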
Currently, I've implemented this operation with two nested loops over the columns of x and y:
def _mult(x, y):
    # Multiply every column of x with every column of y element-wise.
    r = []
    for xc in x.T:  # columns of x
        for yc in y.T:  # columns of y
            r.append(xc * yc)
    return np.array(r).T  # stack products as rows, transpose to shape (m, i*j)
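For reference, this is how I call it; it reproduces the result shown above:
result = _mult(x, y)
print(result.shape)  # (10, 6), i.e. (m, i*j)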
However, I'm pretty sure that there must be a more elegant solution that I can't seem to come up with.