Caffe, setting custom weights in layer

Posted 2019-07-11 13:49

I have a network in which, at one point, I want to use a Concat layer, as shown in the picture below.

Unfortunately, the network doesn't train. To understand why, I want to change the weights feeding into the concat, so that at the beginning all values coming from FC4096 are multiplied by 1 and all values coming from FC16000 are multiplied by 0.

I know that FC4096 alone gets me 57% accuracy, so with a learning rate of 10^-6 I should be able to see why the layers after the concatenation don't learn.

The question is, how can I set all values from FC4096 to 1 and all values from FC16000 to 0?

1 Answer
成全新的幸福
2019-07-11 14:48

You can add a "Scale" layer on top of FC16000 and initialize it to 0:

layer {
  name: "scale16000"
  type: "Scale"
  bottom: "fc16000"
  top: "fc16000"  # not 100% sure this layer can work in-place, worth trying though.
  scale_param {
    bias_term: false
    filler { type: "constant" value: 0 }
  }
  param { lr_mult: 0 decay_mult: 0 }  # set lr_mult to non-zero if you want this scale to be learned
}
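To see what this zero-initialized Scale achieves, here is a minimal NumPy sketch (the array names `fc4096` and `fc16000` are placeholders standing in for the actual blob contents, not real Caffe objects): with the FC16000 branch scaled by 0, the concatenated blob initially carries only FC4096's activations, so the downstream layers start out training on the 57%-accuracy signal alone.

```python
import numpy as np

rng = np.random.default_rng(0)

# Placeholder activations standing in for the two FC blobs (batch of 2).
fc4096 = rng.standard_normal((2, 4096))
fc16000 = rng.standard_normal((2, 16000))

# Per-channel Scale weights as the constant filler above would set them:
# the FC4096 branch passes through unchanged (implicitly scaled by 1),
# the FC16000 branch is multiplied element-wise by 0.
scale16000 = np.zeros(16000)

concat = np.concatenate([fc4096, fc16000 * scale16000], axis=1)

# At iteration 0 the layers after the Concat see only FC4096's signal.
print(np.array_equal(concat[:, :4096], fc4096))  # True
print(np.all(concat[:, 4096:] == 0))             # True
```

With `lr_mult: 0` the scale stays frozen at 0 forever; setting it non-zero lets the solver gradually "open" the FC16000 branch during training.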