Hi, I would like to know how to use the SPP layer in a proto file. Maybe someone could also explain to me how to read the Caffe docs, as they are sometimes hard for me to understand directly.
My attempt is based on this proto file, but I think it differs from the current version.
I defined the layer like this:
layers {
  name: "spatial_pyramid_pooling"
  type: "SPP"
  bottom: "conv2"
  top: "spatial_pyramid_pooling"
  spatial_pyramid_pooling_param {
    pool: MAX
    spatial_bin: 1
    spatial_bin: 2
    spatial_bin: 3
    spatial_bin: 6
    scale: 1
  }
}
When I try to start training I get the following error message:
[libprotobuf ERROR google/protobuf/text_format.cc:287] Error parsing text-format caffe.NetParameter: 137:9: Expected integer or identifier, got: "SPP"
F0714 13:25:38.782958 2061316096 upgrade_proto.cpp:88] Check failed: ReadProtoFromTextFile(param_file, param) Failed to parse NetParameter file:
Full proto file (LeNet with batch normalization and SPP):
name: "TessDigitMean"
layer {
name: "input"
type: "Data"
top: "data"
top: "label"
include {
phase: TRAIN
}
transform_param {
scale: 0.00390625
}
data_param {
source: "/Users/rvaldez/Documents/Datasets/Digits/SeperatedProviderV3_1020_batchnormalizedV2AndSPP/1/caffe/train_lmdb"
batch_size: 64
backend: LMDB
}
}
layer {
name: "input"
type: "Data"
top: "data"
top: "label"
include {
phase: TEST
}
transform_param {
scale: 0.00390625
}
data_param {
source: "/Users/rvaldez/Documents/Datasets/Digits/SeperatedProviderV3_1020_batchnormalizedV2AndSPP/1/caffe/test_lmdb"
batch_size: 10
backend: LMDB
}
}
layer {
name: "conv1"
type: "Convolution"
bottom: "data"
top: "conv1"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 20
kernel_size: 5
stride: 1
weight_filler {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "pool1"
type: "Pooling"
bottom: "conv1"
top: "pool1"
pooling_param {
pool: MAX
kernel_size: 2
stride: 2
}
}
layer {
name: "bn1"
type: "BatchNorm"
bottom: "pool1"
top: "bn1"
batch_norm_param {
use_global_stats: false
}
param {
lr_mult: 0
}
param {
lr_mult: 0
}
param {
lr_mult: 0
}
include {
phase: TRAIN
}
}
layer {
name: "bn1"
type: "BatchNorm"
bottom: "pool1"
top: "bn1"
batch_norm_param {
use_global_stats: true
}
param {
lr_mult: 0
}
param {
lr_mult: 0
}
param {
lr_mult: 0
}
include {
phase: TEST
}
}
layer {
name: "conv2"
type: "Convolution"
bottom: "bn1"
top: "conv2"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
convolution_param {
num_output: 50
kernel_size: 5
stride: 1
weight_filler {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layers {
name: "spatial_pyramid_pooling"
type: "SPP"
bottom: "conv2"
top: "spatial_pyramid_pooling"
spatial_pyramid_pooling_param {
pool: MAX
spatial_bin: 1
spatial_bin: 2
spatial_bin: 3
spatial_bin: 6
scale: 1
}
}
layer {
name: "bn2"
type: "BatchNorm"
bottom: "spatial_pyramid_pooling"
top: "bn2"
batch_norm_param {
use_global_stats: false
}
param {
lr_mult: 0
}
param {
lr_mult: 0
}
param {
lr_mult: 0
}
include {
phase: TRAIN
}
}
layer {
name: "bn2"
type: "BatchNorm"
bottom: "pool2"
top: "bn2"
batch_norm_param {
use_global_stats: true
}
param {
lr_mult: 0
}
param {
lr_mult: 0
}
param {
lr_mult: 0
}
include {
phase: TEST
}
}
layer {
name: "ip1"
type: "InnerProduct"
bottom: "bn2"
top: "ip1"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
inner_product_param {
num_output: 500
weight_filler {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "relu1"
type: "ReLU"
bottom: "ip1"
top: "ip1"
}
layer {
name: "ip2"
type: "InnerProduct"
bottom: "ip1"
top: "ip2"
param {
lr_mult: 1
}
param {
lr_mult: 2
}
inner_product_param {
num_output: 10
weight_filler {
type: "xavier"
}
bias_filler {
type: "constant"
}
}
}
layer {
name: "accuracy"
type: "Accuracy"
bottom: "ip2"
bottom: "label"
top: "accuracy"
include {
phase: TEST
}
}
layer {
name: "loss"
type: "SoftmaxWithLoss"
bottom: "ip2"
bottom: "label"
top: "loss"
}
Ok, I found it.
The correct way to define an SPP layer is like this:
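What follows is a minimal sketch based on my earlier attempt; the pyramid_height value is just an example, pick one that fits your network.

layer {
  name: "spatial_pyramid_pooling"
  type: "SPP"
  bottom: "conv2"
  top: "spatial_pyramid_pooling"
  spp_param {
    pool: MAX
    pyramid_height: 2   # example value; replaces the spatial_bin entries from my first try
  }
}

As far as I understand it, the pyramid height determines how many pooling levels are stacked, instead of listing the bin sizes one by one.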
Note that I had previously written layers instead of layer. Furthermore, you can specify parameters for this layer inside spp_param {}. The official version of Caffe does not offer bins as an option, but a pyramid height instead, so the version my first attempt was based on is incorrect.
Some notes for myself and anyone who is new to Caffe and a bit confused by the style of the docs.
Docs:
...
Notes:
The layer type defines the keyword used to declare the type of a layer in the proto file (kind of logical once you know it).
Enums in such a definition list the possible values you can assign to a parameter.
Parameters cannot be defined on the same level as type or name. Instead you have to wrap them inside a layer-specific parameter keyword (spp_param). This keyword is built from the layer type as <layertype>_param {}, with the type written in lowercase (e.g. InnerProduct becomes inner_product_param, SPP becomes spp_param); see the annotated snippet below.
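To make those rules concrete, here is the pool1 layer from the net above again, annotated accordingly (nothing new, just the values already used there):

layer {
  name: "pool1"
  type: "Pooling"       # the layer type keyword from the docs
  bottom: "conv1"
  top: "pool1"
  pooling_param {       # <layertype>_param in lowercase: Pooling -> pooling_param
    pool: MAX           # MAX is one of the enum values defined for the pooling method
    kernel_size: 2
    stride: 2
  }
}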