I am new to PySpark and I am trying to understand how we can write multiple nested for loops in PySpark; a rough high-level example is below. Any help will be appreciated.
```python
for i in range(10):
    for j in range(10):
        for k in range(10):
            print("{0}.{1}.{2}".format(i, j, k))
```
In a non-distributed setting, for loops are typically rewritten using the `foreach` combinator, but because of Spark's distributed nature, `map` and `flatMap` are a better choice here:
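A minimal sketch of that pattern, assuming PySpark running in local mode (on a real cluster you would reuse your existing `SparkContext`; the helper name `unroll` is purely illustrative):

```python
from pyspark import SparkContext

sc = SparkContext("local[*]", "nested-loops")  # assumption: local mode

# Each flatMap pairs every existing tuple with the values 0..9,
# which unrolls one level of the nested loop.
unroll = lambda prefix: [prefix + (n,) for n in range(10)]

(sc.parallelize([(i,) for i in range(10)])  # outer loop over i
   .flatMap(unroll)                         # adds j
   .flatMap(unroll)                         # adds k
   .foreach(lambda t: print("{0}.{1}.{2}".format(*t))))
```

Note that `foreach` runs on the executors, so the printed lines appear in the workers' stdout, which is the driver console in local mode.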
The same result can also be produced with `itertools.product`:
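Again a sketch reusing the same `sc`; here the triples are generated on the driver and Spark simply distributes them:

```python
from itertools import product

# product(range(10), repeat=3) yields every (i, j, k) combination,
# replacing three nested loops with a single cartesian product.
(sc.parallelize(product(range(10), repeat=3))
   .foreach(lambda t: print("{0}.{1}.{2}".format(*t))))
```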