Suppose I have a very big training set, so that Matlab hangs while training or there is insufficient memory to hold the training set.
Is it possible to split the training set into parts and train the network part by part?
Is it possible to train the network with one sample at a time (one by one)?
You can manually divide the dataset into batches and train on them one after another.
The batch size should be greater than 1, but this should still reduce memory consumption during training.
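A minimal sketch of the batch-by-batch approach, assuming your inputs `X` and targets `T` are arranged column-wise and the network/layer sizes are illustrative:

```matlab
% Sketch: train on the dataset in chunks instead of all at once.
% X (inputs) and T (targets) are assumed to be column-per-sample matrices;
% feedforwardnet(10) and batchSize are illustrative choices.
net = feedforwardnet(10);
batchSize = 10000;
numSamples = size(X, 2);
for k = 1:batchSize:numSamples
    idx = k:min(k + batchSize - 1, numSamples);
    % each call to train continues from the network's current weights,
    % so the batches accumulate rather than overwrite each other
    net = train(net, X(:, idx), T(:, idx));
end
```

Note that `train` starts from the network's current weights unless you re-initialize it, which is what makes this incremental scheme work.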
In case of the `trainlm` training algorithm, the `net.efficiency.memoryReduction` option could help. Also, instead of the default `trainlm` algorithm, you can try less memory-consuming ones like `trainrp`. For details on the training algorithms, check the Matlab documentation. I assumed above that you are using the corresponding Matlab toolbox for neural networks.

Regarding training one sample at a time, you could try googling for the stochastic gradient descent algorithm. But it looks like it is not in the default set of training algorithms in the toolbox.