Neural Network for regression

Posted 2019-08-11 00:11

The way I understand regression with neural networks, weights are applied to each x-input from the dataset. I want something slightly different.

I want the weights applied inside the function that computes each x-input; the inputs to that function I'll call s-inputs.

The function that computes the x-inputs is a summation over all the s-inputs, and I want each s-input to have its own weight.
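Concretely, I mean something like the sketch below (the numbers and names are just placeholders I made up to illustrate): one row of s-inputs per x-value, and a single shared weight vector with one entry per s-input.

```python
import numpy as np

# One row of s-inputs per x-value we want to compute.
S = np.array([[0.2, 1.5, 3.0],   # s-inputs that produce x1
              [0.7, 0.1, 2.2]])  # s-inputs that produce x2

w = np.array([0.5, 0.5, 0.5])    # one weight per s-input, shared across rows

x = S @ w                        # x[i] = sum_j w[j] * S[i, j]
```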

So I say regression because I want the end result to be a beautiful, continuous function mapping x -> y...

...but that is accomplished by training the function that computes the x-inputs.

It's baffling me because as we train the weights to compute, say, x1, we are also training the weights to compute x2, since both use the same summation function. Because the function that computes the x-inputs is trained simultaneously across all x-inputs, the plot of x -> y will keep morphing. I need it to morph into something continuous.

You can think of it like this: the y-value is the ground truth, but we are adding weights to the function that computes the x-value from the s-inputs.
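To make the question concrete, here is a rough sketch of the setup I have in mind, written with PyTorch purely for illustration. The layer sizes, toy data, and names (`SummedInputRegressor`, `s_weights`, etc.) are all assumptions of mine, not anything fixed; the point is only that the shared summation weights and the regression head are trained together.

```python
import torch
import torch.nn as nn

class SummedInputRegressor(nn.Module):
    def __init__(self, n_s_inputs: int, hidden: int = 16):
        super().__init__()
        # One learnable weight per s-input, shared across every x we compute.
        self.s_weights = nn.Parameter(torch.ones(n_s_inputs))
        # A small regression head mapping the computed x to y.
        self.head = nn.Sequential(
            nn.Linear(1, hidden), nn.Tanh(), nn.Linear(hidden, 1)
        )

    def forward(self, s: torch.Tensor) -> torch.Tensor:
        # s has shape (batch, n_s_inputs); the weighted sum gives x with shape (batch, 1).
        x = (s * self.s_weights).sum(dim=1, keepdim=True)
        return self.head(x)

# Toy data: 100 samples, 3 s-inputs each, with a made-up target.
s = torch.randn(100, 3)
y = torch.sin(s.sum(dim=1, keepdim=True))

model = SummedInputRegressor(n_s_inputs=3)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for step in range(500):
    optimizer.zero_grad()
    loss = loss_fn(model(s), y)
    loss.backward()   # gradients flow into both the head and the shared s-weights
    optimizer.step()
```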

Can this be done? If so, where should I start?
