I have to make a simple 3-layer neural network in MATLAB (2-10-2).
I have worked on a convolutional neural network in MATLAB and want to compare it with this simple neural network architecture.
I have 14000 images of each class; there are two classes at the input and there will be two classes at the output. The image size at the input is 56x56 = 3136.
1) How do I make a 2-10-2 NN architecture?
2) Also, the images I have are RGB, so each one is 56x56x3 and the input vector will have 9408 values?
Regarding the input x: with two classes, if x1 for the first class has size 9408x700 and x2 for the second class has size 9408x700, will the final input x have size 9408x1400 and the labels 1x1400?
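For example (just to illustrate the sizes, assuming the RGB images of each class are stored in 56x56x3x700 arrays called imgs1 and imgs2):

x1 = reshape(imgs1, 56*56*3, []);      % 9408x700, one column per image
x2 = reshape(imgs2, 56*56*3, []);      % 9408x700
x  = [x1 x2];                          % 9408x1400
labels = [ones(1,700) 2*ones(1,700)];  % 1x1400 class indices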
If you look at the feedforwardnet MATLAB help page there is this example:
[x,t] = simplefit_dataset;
net = feedforwardnet(10);
net = train(net,x,t);
view(net)
y = net(x);
perf = perform(net,y,t)
This is almost what you want. The feedforwardnet can take an array of different hidden layer sizes, so we can do:
net = feedforwardnet([2 10 2]);
to get the architecture you want. You don't need to worry about the input or output layer sizes. Those default to 0 and are automatically set to the right size based on the inputs and targets you provide to the network (net in the example) during training. In your case, you can reshape each 56x56 image into a 3136x1 vector:
x = reshape(x,3136,1);
So, following the above example, make sure your class/target labels are in t and your corresponding inputs are in x, and you're good to go.
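If your labels are currently a 1xN row vector of class indices (1 or 2), one way to get the 2xN target matrix that a two-output network expects is ind2vec (this sketch assumes a label vector like the one described in the question):

t = full(ind2vec(labels));   % 2x1400 one-hot targets from 1x1400 class indices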
That being said, I would not use one of these networks to classify images. ConvNets are generally the way to go.
To split the input data (x and t) into training, validation and test sets, and have the training function automatically take care of generalization (early stopping based on the validation set), do this before training:
net.divideFcn = 'dividerand';
net.divideParam.trainRatio = 0.7;
net.divideParam.valRatio = 0.15;
net.divideParam.testRatio = 0.15;
Putting it together, we have:
[x,t] = simplefit_dataset;
net = feedforwardnet(10);
net.divideFcn = 'dividerand';
net.divideParam.trainRatio = 0.7;
net.divideParam.valRatio = 0.15;
net.divideParam.testRatio = 0.15;
net = train(net,x,t);
view(net)
y = net(x);
perf = perform(net,y,t)
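If you also want to check performance on just the held-out test split, one option (a sketch reusing the same x and t) is to capture the training record that train returns as its second output:

[net,tr] = train(net,x,t);              % tr records the train/val/test indices
yTest = net(x(:,tr.testInd));           % network outputs on the test samples only
testPerf = perform(net,t(:,tr.testInd),yTest)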
For a two-class classification problem, a single logistic output neuron is enough, so a (2-10-1) NN architecture would suffice.
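A minimal sketch of that single-output variant, assuming the inputs are in x and the targets are a 1xN row vector of 0/1 labels (here called tBinary, a made-up name), and keeping the earlier convention that the bracketed sizes are the hidden layers:

net = feedforwardnet([2 10]);            % hidden layers of 2 and 10; the single output comes from the 1-row target
net.layers{end}.transferFcn = 'logsig';  % logistic output neuron
net = train(net,x,tBinary);
yProb = net(x);                          % outputs between 0 and 1
yClass = yProb > 0.5;                    % threshold at 0.5 for the predicted class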