Setting up the input on an RNN in Keras

Posted 2019-06-12 23:19

So I have a specific question about setting up the input in Keras.

I understand that the sequence length refers to the window length of the longest sequence that you are looking to model with the rest being padded by 0's.

However, how do I set up something that is already in a time series array?

For example, right now I have an array that is 550k x 28. So there are 550k rows, each with 28 columns (27 features and 1 target). Do I have to manually split the array into (550k - sequence length) different arrays and feed all of those to the network?

Assuming that I want the first layer to be sized to the number of features per row, and that I am looking at the past 50 rows, how do I size the input layer?

Is that simply input_shape = (50, 27)? And again, do I have to manually split the dataset up, or will Keras do that for me automatically?

1 Answer

我欲成王,谁敢阻挡 · 2019-06-12 23:29

RNN inputs are like: (NumberOfSequences, TimeSteps, ElementsPerStep)

  • Each sequence is one entry along the first axis of your input array. The number of sequences is also called the "batch size", number of examples, samples, etc.

  • Time steps are the number of steps in each sequence

  • Elements per step is how much information you have in each step of a sequence
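
For instance, a toy sketch with made-up numbers, just to show the layout: a batch of 32 sequences, each 50 steps long with 27 values per step, is a 3D array like this:

import numpy as np

batch = np.zeros((32, 50, 27)) #(NumberOfSequences, TimeSteps, ElementsPerStep)
print(batch.shape)             #(32, 50, 27)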

I'm assuming the 27 features are inputs and correspond to ElementsPerStep, while the 1 target is the expected output, with 1 output per step. So I'm also assuming that your output is a sequence with 550k steps as well.

Shaping the array:

Since you have only one sequence in the array, and this sequence has 550k steps, you must reshape your array like this:

(1, 550000, 28) 
    #1 sequence
    #550000 steps per sequence    
    #28 data elements per step

#PS: this sequence is very long; if it creates memory problems for you, it may be a good idea to use a `stateful=True` RNN, but I'm explaining the non-stateful method first.
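A minimal sketch of that reshape with NumPy, assuming your original 550k x 28 array is called `data` (the name is purely for illustration):

import numpy as np

data = np.random.rand(550000, 28)       #stand-in for your real 550k x 28 array
thisArray = data.reshape(1, 550000, 28) #add the leading "number of sequences" axis
print(thisArray.shape)                  #(1, 550000, 28)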

Now you must split this array for inputs and targets:

X_train = thisArray[:, :, :27] #inputs
Y_train = thisArray[:, :, 27:] #targets (keep the last axis so the shape is (1, 550000, 1))

Shaping the Keras layers:

Keras layers will ignore the batch size (number of sequences) when you define them, so you will use input_shape=(550000,27).

Since your desired result is a sequence of the same length, we will use return_sequences=True. (Otherwise, you'd get only one result per sequence.)

 LSTM(numberOfCells, input_shape=(550000,27), return_sequences=True)

This will output a shape of (BatchSize, 550000, numberOfCells)

You may use a single layer with 1 cell to achieve your output, or you could stack more layers, considering that the last one should have 1 cell to match the shape of your output. (If you're using only recurrent layers, of course)
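For concreteness, a minimal sketch of such a model, assuming the tensorflow.keras imports and a made-up numberOfCells of 32:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM

model = Sequential()
model.add(LSTM(32, input_shape=(550000, 27), return_sequences=True)) #hidden recurrent layer
model.add(LSTM(1, return_sequences=True)) #1 cell so the output matches Y_train: (1, 550000, 1)
model.compile(optimizer='adam', loss='mse')

model.fit(X_train, Y_train, epochs=10)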

stateful = True:

When you have sequences so long that your memory can't handle them well, you must define the layer with stateful=True.

In that case, you will have to divide X_train into shorter sequences*. The system will understand that every new batch is a continuation of the previous batches.

Then you will need to define batch_input_shape=(BatchSize, ReducedTimeSteps, Elements). In this case, the batch size is not ignored as it is in the other case.

* With stateful=True you do have to divide your array into consecutive chunks yourself and feed them in order; Keras will not split it internally. You also need to call model.reset_states() whenever a new independent sequence starts.
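A rough sketch of that workflow, with a hypothetical chunk length of 1000 steps and the same tensorflow.keras imports as above:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM

steps_per_chunk = 1000               #hypothetical ReducedTimeSteps
n_chunks = 550000 // steps_per_chunk #550 consecutive chunks of the single long sequence

model = Sequential()
model.add(LSTM(32, batch_input_shape=(1, steps_per_chunk, 27),
               return_sequences=True, stateful=True))
model.add(LSTM(1, return_sequences=True, stateful=True))
model.compile(optimizer='adam', loss='mse')

for epoch in range(10):
    for i in range(n_chunks):
        sl = slice(i * steps_per_chunk, (i + 1) * steps_per_chunk)
        #train_on_batch keeps the LSTM state between consecutive chunks
        model.train_on_batch(X_train[:, sl, :], Y_train[:, sl, :])
    model.reset_states() #the full 550k-step sequence is done, reset before the next epoch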


The sliding window case:

In this case, what I often see is people dividing the input data like this:

From the 550k steps, get smaller arrays with 50 steps:

X = []

for i in range(550000 - 49):
    X.append(originalX[i:i+50]) #originalX: the 27 feature columns; take care of the 28th column (the target) separately

Y = originalY[49:] #the targets: it seems you just exclude the first 49 ones from the original
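To answer the original question directly, here is a small end-to-end sketch of this windowed setup, assuming a hypothetical `data` array of shape (550000, 28) and an arbitrary hidden size of 32 (a Dense output layer is used here instead of a 1-cell recurrent layer; either works):

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

window = 50
data = np.random.rand(550000, 28).astype('float32') #stand-in for your real 550k x 28 array
originalX = data[:, :27] #the 27 feature columns
originalY = data[:, 27:] #the target column

X = np.array([originalX[i:i+window] for i in range(len(data) - window + 1)])
Y = originalY[window - 1:] #one target per window, aligned with the window's last step

print(X.shape, Y.shape) #(549951, 50, 27) (549951, 1)

model = Sequential()
model.add(LSTM(32, input_shape=(window, 27))) #this is the input_shape=(50, 27) from the question
model.add(Dense(1)) #one prediction per 50-step window
model.compile(optimizer='adam', loss='mse')
model.fit(X, Y, batch_size=256, epochs=10)

Note that materializing all the windows duplicates a lot of data in memory; in practice people often feed the windows from a generator instead of building the full X array up front.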