I came across this question in a programming challenge about a month ago, but the editorial was never released, so I am asking it here.
There is an array A of size N. For every K-length subsequence of A, compute (sum of the subsequence) * (GCD of the subsequence), and find the total over all such subsequences.
Example:
If A = [1, 2, 3] and K = 2,
{1, 2} = 3(sum) * 1(GCD) = 3
{1, 3} = 4(sum) * 1(GCD) = 4
{2, 3} = 5(sum) * 1(GCD) = 5
Ans => 3 + 4 + 5 = 12
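For reference, here is a tiny brute-force checker (my own sketch in Python, not part of the challenge) that applies the definition literally by enumerating every K-length subsequence:

```python
# Brute force: enumerate all K-length subsequences, multiply each one's sum by
# its GCD, and add everything up. Only practical for small N.
from itertools import combinations
from functools import reduce
from math import gcd

def brute_force(A, K):
    return sum(sum(sub) * reduce(gcd, sub) for sub in combinations(A, K))

print(brute_force([1, 2, 3], 2))  # prints 12, matching the example above
```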
Here is an approach from scratch (I haven't tested it thoroughly, though):
Let C[k, i, d] be the number of k-length subsequences of A[1..i] whose GCD equals d.
Then

C[k, i, d] = Sum(C[k - 1, i - 1, d']) + C[k, i - 1, d],

where the summation is over all d' such that gcd(A[i], d') = d.
The first term (the sum) corresponds to taking every (k - 1)-length subsequence of A[1..i-1] whose GCD is such a d' and appending A[i] to it; the last term covers the subsequences that do not include A[i].
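A minimal sketch of this counting recurrence (the function name extend_counts and the list-of-dicts layout are my own choices, not from the problem):

```python
# One step of the recurrence over i: given C_prev[k][d] = C[k, i - 1, d],
# build C_new[k][d] = C[k, i, d] after considering the element a = A[i].
from math import gcd
from collections import defaultdict

def extend_counts(C_prev, a, K):
    C_new = [defaultdict(int) for _ in range(K + 1)]
    # last term: subsequences that skip A[i]  ->  C[k, i, d] += C[k, i - 1, d]
    for k in range(K + 1):
        for d, cnt in C_prev[k].items():
            C_new[k][d] += cnt
    # first term: subsequences that end at A[i]
    #   C[k, i, gcd(A[i], d')] += C[k - 1, i - 1, d']
    for k in range(1, K + 1):
        for d_prev, cnt in C_prev[k - 1].items():
            C_new[k][gcd(a, d_prev)] += cnt
    return C_new
```

For k = 1 this relies on seeding C_prev[0] with a single entry for the empty subsequence under the convention that its GCD is 0, so that gcd(A[i], 0) = A[i]; that seeding choice is mine and not spelled out above.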
Let S[k, i, d] be the total sum of all k-length subsequences of A[1..i] whose GCD equals d.
Then

S[k, i, d] = Sum(C[k - 1, i - 1, d'] * A[i] + S[k - 1, i - 1, d']) + S[k, i - 1, d],

where the summation is over all d' such that gcd(A[i], d') = d.
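Putting both recurrences together, here is an untested sketch of the whole DP (the name gcd_sum_dp, the rolled-away index i, and the [count, sum] pairs stored per GCD value are my own choices):

```python
# dp[k] maps a GCD value d to [C, S]: the count and the total sum of all
# k-length subsequences (of the elements processed so far) whose GCD is d.
from math import gcd
from collections import defaultdict

def gcd_sum_dp(A, K):
    dp = [defaultdict(lambda: [0, 0]) for _ in range(K + 1)]
    dp[0][0] = [1, 0]  # empty subsequence; its GCD is taken as 0, so gcd(a, 0) = a
    for a in A:  # a plays the role of A[i]
        # walk k downwards so dp[k - 1] is still the state for A[1..i-1];
        # entries already sitting in dp[k] are the "skip A[i]" terms
        # C[k, i - 1, d] and S[k, i - 1, d].
        for k in range(K, 0, -1):
            for d_prev, (cnt, s) in dp[k - 1].items():
                g = gcd(a, d_prev)          # this d' contributes to d = gcd(A[i], d')
                dp[k][g][0] += cnt          # C[k, i, d] += C[k - 1, i - 1, d']
                dp[k][g][1] += s + cnt * a  # S[k, i, d] += S[k - 1, i - 1, d'] + C[k - 1, i - 1, d'] * A[i]
    return dp[K]  # maps d to [C[K, N, d], S[K, N, d]]
```

Updating the tables in place and iterating k downwards replaces the explicit index i from the recurrences above.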
Then, having S[K, N, d] for every possible GCD value d, we can calculate the required value as the sum of d * S[K, N, d] over all d.
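For example, using the gcd_sum_dp sketch from above, the final aggregation could look like this:

```python
def answer(A, K):
    # multiply each bucket's total sum by its GCD and accumulate
    return sum(d * s for d, (cnt, s) in gcd_sum_dp(A, K).items())

print(answer([1, 2, 3], 2))  # 12, matching the example in the question
```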