How do I implement the optimization function in TensorFlow?


min_C Σ_i ( ‖x_i − X c_i‖² + λ‖c_i‖ ),

s.t. c_ii = 0,

where X is a matrix of shape d × n, C is of shape n × n, and x_i and c_i denote the i-th columns of X and C, respectively.

Here X is known, and we want to find C based on it.
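
For concreteness, the setup can be sketched in plain numpy like this (the sizes, names, and the value of λ are illustrative, not from the question):

import numpy as np

d, n = 5, 8                      # illustrative sizes
lam = 0.1                        # illustrative value of lambda
X = np.random.randn(d, n)        # known data: one point x_i per column
C = np.random.randn(n, n)        # coefficients to find: one c_i per column
np.fill_diagonal(C, 0.0)         # the constraint c_ii = 0

# the objective term for a single column i
i = 0
term_i = np.sum((X[:, i] - X @ C[:, i]) ** 2) + lam * np.linalg.norm(C[:, i])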

1 Answer
forever°为你锁心 · 2019-03-04 08:31

Usually with a loss like this you want to vectorize it instead of working column by column:

loss = X - tf.matmul(X, C)                   # residual X - XC for all columns at once
loss = tf.reduce_sum(tf.square(loss))        # sum_i ||x_i - X c_i||^2 (squared Frobenius norm)

reg_loss = tf.reduce_sum(tf.square(C), 0)    # squared L2 norm of each column of C
reg_loss = tf.reduce_sum(tf.sqrt(reg_loss))  # sum_i ||c_i||_2

total_loss = loss + lambd * reg_loss
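
As a quick sanity check (not part of the original answer), the vectorized form above reproduces the column-wise sum from the question; a small numpy verification with random data:

import numpy as np

d, n, lam = 5, 8, 0.1
X_np = np.random.randn(d, n)
C_np = np.random.randn(n, n)

# vectorized form, mirroring the TensorFlow code above
vectorized = np.sum((X_np - X_np @ C_np) ** 2) \
             + lam * np.sum(np.sqrt(np.sum(C_np ** 2, axis=0)))

# column-wise form from the question
column_wise = sum(
    np.sum((X_np[:, i] - X_np @ C_np[:, i]) ** 2) + lam * np.linalg.norm(C_np[:, i])
    for i in range(n)
)

assert np.isclose(vectorized, column_wise)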

To implement the zero constraint on the diagonal of C, the best way is to add a penalty for it to the loss, weighted by another constant lambd2:

reg_loss2 = tf.trace(tf.square(C))           # sum of the squared diagonal entries of C
total_loss = total_loss + lambd2 * reg_loss2
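
Putting it together, a minimal end-to-end sketch (assuming TensorFlow 1.x graph mode, which tf.trace and the style above suggest; X_data, the penalty weights, and the Adam optimizer are illustrative choices, not from the answer):

import numpy as np
import tensorflow as tf

d, n = 5, 8
X_data = np.random.randn(d, n).astype(np.float32)       # the known matrix X
lambd, lambd2 = 0.1, 10.0                                # illustrative penalty weights

X = tf.constant(X_data)
C = tf.Variable(tf.random_normal([n, n], stddev=0.01))   # coefficients to learn

loss = tf.reduce_sum(tf.square(X - tf.matmul(X, C)))             # sum_i ||x_i - X c_i||^2
col_norms = tf.sqrt(tf.reduce_sum(tf.square(C), 0) + 1e-12)      # ||c_i||_2 (eps for stable gradients)
reg_loss = tf.reduce_sum(col_norms)                              # sum_i ||c_i||_2
reg_loss2 = tf.trace(tf.square(C))                               # penalty pushing diag(C) to zero

total_loss = loss + lambd * reg_loss + lambd2 * reg_loss2
train_op = tf.train.AdamOptimizer(1e-2).minimize(total_loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(1000):
        _, current = sess.run([train_op, total_loss])
    print("final loss:", current)

Note that this treats c_ii = 0 as a soft penalty: a larger lambd2 pushes the diagonal of C closer to zero but does not make it exactly zero.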