I just don't get it: why is the time complexity O(n^2) instead of O(n*log n)?
The second loop increments by 2 each time, so isn't it O(log n)?
#include <stdio.h>
#include <stdlib.h>

void f3(int n){
    int i, j, s = 100;
    int* ar = (int*)malloc(s * sizeof(int));
    for(i = 0; i < n; i++){
        s = 0;
        for(j = 0; j < n; j += 2){
            s += j;
            printf("%d\n", s);
        }
    }
    free(ar);  /* free once, after the loops; freeing inside the outer loop would be a double free */
}
By incrementing by two rather than one, you're doing N*N*(1/2) iterations. With big-O notation you don't care about the constant factor, so it's still O(N*N). This is because big-O notation describes how the cost of an algorithm grows, not its exact operation count.
The outer loop executes n times, and for each outer iteration the inner loop executes n/2 times because of j += 2.
Order = n * n/2 = n^2/2 = O(n^2), because a constant factor doesn't affect the growth rate for large n.
Since the increment is 2, the inner loop runs n/2 times, so the total is n * n/2 = O(n^2). It would only be logarithmic if the increment grew in geometric progression, e.g. j *= 2 instead of j += 2.