for i in range(len(X1)): X[i, :] = X1[i, :] - X2  # per-row loop that broadcasting replaces
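A minimal NumPy sketch (with hypothetical small arrays for X1 and X2) showing that broadcasting gives the same result as the row loop above:

```python
import numpy as np

X1 = np.array([[1.0, 2.0, 3.0],
               [4.0, 5.0, 6.0]])   # hypothetical (2, 3) data
X2 = np.array([0.5, 1.5, 2.5])     # hypothetical (3,) row to subtract

# Explicit loop, as above
X_loop = np.empty_like(X1)
for i in range(len(X1)):
    X_loop[i, :] = X1[i, :] - X2

# Broadcasting: X2 is stretched across the rows of X1
X_bcast = X1 - X2

print(np.array_equal(X_loop, X_bcast))  # True
```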
However, the rule of broadcasting is: two arrays are compatible when, for each pair of dimensions (aligned from the trailing end), either
1. the dimensions are equal, or
2. one of the dimensions is 1.
Dimensions with size 1 are stretched, or "copied", to match the other.
A = np.random.rand(8, 1, 6, 1)
B = np.random.rand(7, 1, 5)
print((A * B).shape)  # (8, 7, 6, 5)
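When some dimension pair satisfies neither rule, NumPy refuses to broadcast. A quick sketch of both cases (the array C is hypothetical, chosen so every size-1 axis of A stretches):

```python
import numpy as np

A = np.random.rand(8, 1, 6, 1)
C = np.random.rand(8, 4, 6, 3)   # size-1 axes of A stretch to 4 and 3
print((A * C).shape)             # (8, 4, 6, 3)

# Dimensions that are neither equal nor 1 cannot broadcast
try:
    np.random.rand(4, 3) * np.random.rand(4, 2)
except ValueError as e:
    print("broadcast error:", e)
```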
theano variables and shared variables
By default, the dimensions of tensor types such as matrix/tensor3/tensor4 are not broadcastable. Likewise, none of a Theano shared variable's dimensions are broadcastable, since the shape of a shared variable can change.
theano.shared accepts a broadcastable pattern, one boolean per dimension:
b = theano.shared(bval, broadcastable=(False, True))
but it did not work for me. Instead, I tried
b = T.addbroadcast(b, 1)
and it works.
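Theano follows NumPy's broadcasting semantics, where any axis of length 1 broadcasts automatically; that automatic behavior is what addbroadcast declares explicitly for a Theano variable. A NumPy sketch with a hypothetical (2, 1) column bval:

```python
import numpy as np

bval = np.array([[10.0], [100.0]])   # shape (2, 1): the size-1 axis broadcasts
x = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])      # shape (2, 3)

# The length-1 second axis of bval is stretched across the 3 columns,
# so row i of x is multiplied by bval[i, 0]
print(x * bval)   # [[ 10.  20.  30.], [400. 500. 600.]]
```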
We can also set the broadcastable pattern with T.TensorType:
tensor.TensorType('float32', broadcastable=[False, True])
vector broadcasting in theano
A vector can be broadcast when computing with a matrix:
import numpy as np
import theano

xval = np.array([[1, 2, 3], [4, 5, 6]])
bval = np.array([10, 20, 30])
xshared = theano.shared(xval)
bshared = theano.shared(bval)
zvar = xshared * 1.0 / bshared
- a vector is broadcast against a matrix only when its length matches the number of columns
- we can use b.dimshuffle(0, 'x') or c2 = b[:, np.newaxis] to turn a vector into a 2-d array with shape (d, 1)
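The np.newaxis trick from the last bullet, sketched in NumPy: turning a length-d vector into a (d, 1) column so it broadcasts down the rows instead of across the columns (the arrays here are hypothetical):

```python
import numpy as np

x = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])   # shape (2, 3)
b = np.array([10.0, 100.0])       # shape (2,): matches #rows, not #columns

c2 = b[:, np.newaxis]             # shape (2, 1), like b.dimshuffle(0, 'x') in Theano
print(c2.shape)                   # (2, 1)
print(x * c2)                     # each row i of x scaled by b[i]
```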