I want my neural network to convert a negative value into a positive value. Theoretically this can be done with a ReLU activation and a single node that learns an input weight of -1 (so a negative input multiplied by -1 becomes positive).
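To make that concrete, here is a minimal sketch of the solution I expect the network to find, with the weight set to -1 by hand instead of learned (this is separate from my actual training script below):

import numpy as np
from keras.models import Sequential
from keras.layers import Dense

# Hand-built version of the solution I have in mind: one node, weight -1, ReLU on top.
check = Sequential()
check.add(Dense(1, input_dim=1, activation='relu'))
check.set_weights([np.array([[-1.0]]), np.array([0.0])])  # kernel = -1, bias = 0
print(check.predict(np.asarray([[-1]])))  # relu(-1 * -1) = 1, i.e. abs(-1)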
My actual model, though, just keeps outputting 0. Code below. I used -1 as every input value to see if it could learn on at least a single case.
I tried adding more layers, but that didn't help (see the edits below for what eventually did).
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, LeakyReLU

train_input = np.asarray([[-1]] * 10000)  # input array of -1s
train_output = np.abs(train_input)        # expected outputs: the absolute values

# Define the model
model = Sequential()
model.add(Dense(1, input_dim=1, kernel_initializer='normal', activation='linear'))
model.add(LeakyReLU(alpha=.001))
model.compile(loss='mean_squared_error', optimizer='adam', metrics=['accuracy'])

# Train and evaluate
model.fit(train_input, train_output, epochs=10, batch_size=10, verbose=0)

test_input = np.asarray([[-1]])   # a single negative test value
test_output = np.abs(test_input)  # expected output: 1
test_model_output = model.predict(test_input)
print(str(test_input[0][0]) + " " + str(test_output[0][0]) + " " + str(test_model_output[0][0]))
The output I get is below (first value is the input, second the expected output, third the model output):
-1 1 0.0
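For what it's worth, this is roughly how one can look at what the single weight ends up as after the 10-epoch run above (the variable names here are just illustrative):

# Inspect the learned parameters of the Dense layer (the LeakyReLU layer has no weights).
w, b = model.get_weights()
print("weight:", w[0][0], "bias:", b[0])
# If the weight is still positive, the pre-activation for a -1 input is negative,
# so the LeakyReLU output is ~0 -- which matches the 0.0 I'm seeing above.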
EDIT: I tried using the random uniform initialiser so that it can initialise negative weights, and it works. I get why this makes it easier for the network to learn, but I don't get why it's necessary.
from keras.initializers import RandomUniform
model.add(Dense(1, input_dim=1, kernel_initializer=RandomUniform(minval=-0.05, maxval=0.05, seed=None), activation='linear'))
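A quick, purely illustrative check of what sign the weight starts with under this initialiser, before any training (same imports as above; the name init_model is just made up here):

# Build the same first layer and print its weight before any training.
init_model = Sequential()
init_model.add(Dense(1, input_dim=1,
                     kernel_initializer=RandomUniform(minval=-0.05, maxval=0.05, seed=None),
                     activation='linear'))
print(init_model.get_weights()[0][0][0])  # roughly uniform in [-0.05, 0.05], so negative about half the time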
EDIT 2: Someone mentioned I wasn't training for long enough. At first I thought that using 10x more data with batches 10x smaller (i.e. more weight updates per epoch) would be equivalent. It wasn't, but adding 10x more epochs (100 in total) did work. So it just takes a long time to push a positively initialised weight over to negative.
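In other words, the only change relative to the original script is the number of epochs, roughly:

# Same model and data as the original script, just trained for 10x more epochs.
model.fit(train_input, train_output, epochs=100, batch_size=10, verbose=0)
print(model.predict(np.asarray([[-1]])))  # with the extra epochs this comes out close to 1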