Trying to train LeNet on my own dataset. I generated an HDF5 file from my long 1D vector dataset and created the HDF5 data layer as follows (I named the top blobs the same as when I generated my HDF5):
name: "Test_net"
layer {
  name: "data"
  type: "HDF5Data"
  top: "Inputdata"
  top: "label"
  hdf5_data_param {
    source: "~/*_hdf5_train.txt"
    batch_size: 32
  }
  include { phase: TRAIN }
}
layer {
  name: "data2"
  type: "HDF5Data"
  top: "Inputdata"
  top: "label"
  hdf5_data_param {
    source: "~/*_hdf5_test.txt"
    batch_size: 32
  }
  include { phase: TEST }
}
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  param { lr_mult: 1 }
  param { lr_mult: 2 }
  convolution_param {
    num_output: 20
    kernel_h: 1
    kernel_w: 5
    stride_h: 1
    stride_w: 1
    weight_filler {
      type: "xavier"
    }
    bias_filler {
      type: "xavier"
    }
  }
}
layer {
  name: "pool1"
  type: "Pooling"
  bottom: "conv1"
  top: "pool1"
  pooling_param {
    pool: MAX
    kernel_h: 1
    kernel_w: 2
    stride_h: 1
    stride_w: 2
  }
}
# more layers here...
layer {
  name: "loss"
  type: "SigmoidCrossEntropyLoss"
  bottom: "ip2"
  bottom: "label"
  top: "loss"
}
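For reference, I write the HDF5 file with h5py roughly like this — the dataset names inside the file must match the "top" names of the HDF5Data layer. The file names, shapes, and label values below are simplified placeholders, not my actual data:

```python
import h5py
import numpy as np

# Caffe's HDF5Data layer looks up datasets by the layer's "top" names --
# "Inputdata" and "label" here. Shapes and values are placeholders.
n, length = 100, 500
data = np.random.randn(n, 1, 1, length).astype(np.float32)      # N x C x H x W
labels = np.random.randint(0, 2, size=(n, 1)).astype(np.float32)

with h5py.File("train.h5", "w") as f:
    f.create_dataset("Inputdata", data=data)
    f.create_dataset("label", data=labels)

# The layer's "source" is a text file listing HDF5 file paths, one per line.
with open("hdf5_train.txt", "w") as f:
    f.write("train.h5\n")
```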
But when I try to train, I get the following error from insert_splits.cpp:

insert_splits.cpp:29] Unknown bottom blob 'data' (layer 'conv1', bottom index 0)
*** Check failure stack trace: ***
    @     0x7f19d7e735cd  google::LogMessage::Fail()
    @     0x7f19d7e75433  google::LogMessage::SendToLog()
    @     0x7f19d7e7315b  google::LogMessage::Flush()
    @     0x7f19d7e75e1e  google::LogMessageFatal::~LogMessageFatal()
    @     0x7f19d82684dc  caffe::InsertSplits()
    @     0x7f19d8230d5e  caffe::Net<>::Init()
    @     0x7f19d8233f21  caffe::Net<>::Net()
    @     0x7f19d829c68a  caffe::Solver<>::InitTrainNet()
    @     0x7f19d829d9f7  caffe::Solver<>::Init()
    @     0x7f19d829dd9a  caffe::Solver<>::Solver()
    @     0x7f19d8211683  caffe::Creator_SGDSolver<>()
    @           0x40a6c9  train()
    @           0x4071c0  main
    @     0x7f19d6dc8830  __libc_start_main
    @           0x4079e9  _start
    @              (nil)  (unknown)
Aborted (core dumped)
What did I do wrong?
Cheers,
Your data layer outputs two "blobs": "label" and "Inputdata". Your "conv1" layer expects as input a "blob" named "data". Caffe does not know that you meant "Inputdata" and "data" to be the same blob...

Now, since you already saved the hdf5 files with the "Inputdata" name, you cannot change this name in the "HDF5Data" layer; what you can do is change "data" to "Inputdata" in the "bottom" of the "conv1" layer.

PS, your loss layer requires two "bottom"s, "ip2" and "label", which you forgot to feed.
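In other words, only the "bottom" of "conv1" needs to change; everything else in the layer stays as you posted it:

```
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "Inputdata"   # was "data" -- must match the HDF5Data "top" name
  top: "conv1"
  ...
}
```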