Suppose I have a very big training set, so that MATLAB hangs while training or there is not enough memory to hold the training set.
Is it possible to split the training set into parts and train the network part by part?
Is it possible to train the network one sample at a time (one by one)?
You can manually divide the dataset into batches and train on them one after another:
for bn = 1:num_batches
    inputs = <get batch bn inputs>;
    targets = <get batch bn targets>;
    net = train(net, inputs, targets);
end
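For concreteness, here is a minimal sketch of that loop, assuming your data sits in matrices `X` (inputs) and `T` (targets) with one sample per column, and that `net` was already created (e.g. with `feedforwardnet`); the batch size of 1000 is an arbitrary choice:

```matlab
% Sketch: incremental training over column-wise batches.
% Assumes X (inputs) and T (targets) hold one sample per column.
batch_size = 1000;                      % hypothetical batch size
num_samples = size(X, 2);
num_batches = ceil(num_samples / batch_size);
for bn = 1:num_batches
    idx = (bn-1)*batch_size + 1 : min(bn*batch_size, num_samples);
    % Each call resumes from the weights left by the previous batch.
    net = train(net, X(:, idx), T(:, idx));
end
```

Note that training batch by batch is not mathematically identical to training on the whole set at once, since each `train` call runs its stopping criteria on that batch alone.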
The batch size should be greater than 1, but either way this reduces the memory consumed during training.
In case of the trainlm training algorithm, the net.efficiency.memoryReduction option could help.
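Setting it is a one-line change (sketch; the factor of 2 is just an example value, and higher values trade speed for memory):

```matlab
% Sketch: reduce trainlm's memory footprint by splitting the Jacobian
% computation into chunks. memoryReduction = N uses roughly 1/N the memory
% at the cost of extra computation time.
net.trainFcn = 'trainlm';
net.efficiency.memoryReduction = 2;  % example value; try larger if still short on memory
net = train(net, X, T);              % X, T: your inputs and targets
```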
Also, instead of the default trainlm algorithm, you can try less memory-hungry ones like trainrp.
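Switching the training function is also a one-line change (sketch):

```matlab
% Sketch: trainrp (resilient backpropagation) avoids trainlm's Jacobian
% storage, so it needs far less memory on large training sets.
net.trainFcn = 'trainrp';
net = train(net, X, T);  % X, T: your inputs and targets
```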
For details on the training algorithms, check the MATLAB documentation.
I assumed above that you are using the corresponding MATLAB toolbox for neural networks (Neural Network Toolbox).
Regarding training one sample at a time: you could try googling for the stochastic gradient descent algorithm, but it does not look like it is in the toolbox's default set of training algorithms.
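That said, the toolbox's adapt function performs incremental (per-sample) weight updates when the data is presented as a sequence, which is close in spirit to stochastic gradient descent. A hedged sketch, again assuming `X` and `T` hold one sample per column:

```matlab
% Sketch: per-sample (incremental) updates via adapt.
% con2seq converts matrix columns into a cell-array sequence, which makes
% adapt update the weights after every sample rather than once per batch.
Xseq = con2seq(X);
Tseq = con2seq(T);
net = adapt(net, Xseq, Tseq);  % one incremental pass over the data
```

Whether this behaves well depends on the network's adapt/learning functions, so treat it as a starting point rather than a drop-in SGD replacement.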