Suppose I have this RDD:
val r = sc.parallelize(Array(1, 4, 2, 3))
What I want to do is create a mapping, e.g.:
r.map(x => x + func(all other elements in r))
Is this even possible?
Spark already supports gradient descent (in MLlib). Maybe you can take a look at how they implemented it.
I don't know if there is a more efficient alternative, but I would first create some structure like:
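One possible such structure (this is my own sketch, not code from the original answer; `indexed` and `withOthers` are names I made up) pairs every element with the collection of all the other elements, using each element's index to tell duplicates apart:

```scala
// Pair each element with its index so duplicates are distinguishable.
val indexed = r.zipWithIndex()                    // RDD[(Int, Long)]

// Build all pairs, drop the self-pairs, and group: each key ends up
// with an Iterable of all the *other* elements of the RDD.
val withOthers = indexed.cartesian(indexed)
  .filter { case ((_, i), (_, j)) => i != j }     // remove (x, x) pairs
  .map { case ((x, i), (y, _)) => ((x, i), y) }
  .groupByKey()                                   // RDD[((Int, Long), Iterable[Int])]

// Now the desired mapping is straightforward (func is the user's function):
// withOthers.map { case ((x, _), others) => x + func(others) }
```

Note that `cartesian` is quadratic in the size of the RDD, so this only makes sense for small data.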
If you instead try to reference r itself inside the closure passed to r.map, it's very likely that you will get an exception: you would effectively be trying to serialize and broadcast the RDD to the executors, and nested RDD operations are not supported. To achieve this you would have to do something like this:
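A minimal sketch of that approach (assuming the RDD is small enough to collect to the driver; `func` here is a hypothetical stand-in for the user's function, shown as a simple sum):

```scala
// Hypothetical func: here, just the sum of the other elements.
def func(others: Seq[Int]): Int = others.sum

// Collect the RDD's contents on the driver and broadcast them,
// so every task can see "all other elements" without nesting RDDs.
val all = sc.broadcast(r.collect())               // Array(1, 4, 2, 3)

val result = r.zipWithIndex().map { case (x, i) =>
  // Every element of the broadcast array except the one at index i.
  val others = all.value.zipWithIndex
    .collect { case (y, j) if j != i => y }
  x + func(others)
}
// With the sum above, each element becomes x + (total - x) = total,
// so result.collect() yields Array(10, 10, 10, 10).
```

Using the index (rather than filtering on the value) matters when the RDD contains duplicate values, since filtering `_ != x` would drop all copies of x, not just the current one.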