The following code checks whether a number is prime. I want to know why we use the i <= n/2 condition in the loop.
C Program:
#include <stdio.h>

int main()
{
    int n, i, flag = 0;
    printf("Enter a positive integer: ");
    scanf("%d", &n);
    for (i = 2; i <= n / 2; ++i)
    {
        // condition for non-prime number
        if (n % i == 0)
        {
            flag = 1;
            break;
        }
    }
    if (flag == 0)
        printf("%d is a prime number.", n);
    else
        printf("%d is not a prime number.", n);
    return 0;
}
Although this is a C program, the prime-number logic is the same in both C and Java.
Prime number
A prime number is a natural number greater than 1 that is divisible only by 1 and itself. The first (smallest) prime number is 2.
For example, suppose we want to test whether 100 is a prime number. We can use trial division to test the primality of 100.
Let's look at all the divisors of 100 other than 1 and 100 itself:
2, 4, 5, 10, 20, 25, 50
Here we see that the largest such divisor is 100/2 = 50. This holds for every n: any divisor of n other than n itself is at most n/2, because if d divides n and d > n/2, then n/d < 2, which forces n/d = 1 and hence d = n.
So the condition i <= n/2 is correct, since we only need to test divisors up to n/2.
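To make this concrete, here is a minimal C sketch (not part of the original program; the helper name print_divisors is just for illustration) that lists every divisor of 100 between 2 and 100/2. The largest one it finds is 50.

#include <stdio.h>

/* Print every divisor of n between 2 and n/2 and report the largest one. */
void print_divisors(int n)
{
    int largest = 1;
    for (int i = 2; i <= n / 2; ++i)
    {
        if (n % i == 0)
        {
            printf("%d ", i);
            largest = i;   /* remember the biggest divisor seen so far */
        }
    }
    printf("\nlargest divisor found: %d (never more than %d/2)\n", largest, n);
}

int main(void)
{
    print_divisors(100);   /* prints 2 4 5 10 20 25 50, largest = 50 */
    return 0;
}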
See the Wikipedia article on primality testing for more detail:
https://en.wikipedia.org/wiki/Primality_test
Second example
Similarly, for n = 11 the loop runs while i <= 11/2, which is 5 with integer division, so it checks i = 2, 3, 4 and 5. None of them divides 11, so 11 is reported as prime.
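As a quick check, this small C snippet (a sketch, not taken from the original post) traces the loop for n = 11 and shows exactly which values of i are tested:

#include <stdio.h>

int main(void)
{
    int n = 11, flag = 0;
    for (int i = 2; i <= n / 2; ++i)      /* n/2 is 5 with integer division */
    {
        printf("testing i = %d\n", i);    /* prints i = 2, 3, 4, 5 */
        if (n % i == 0)
        {
            flag = 1;
            break;
        }
    }
    printf("%d is %sa prime number.\n", n, flag ? "not " : "");
    return 0;
}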
To find out whether a number is prime, why is checking up to n/2 enough? What is the reason for skipping the numbers in the second half, between n/2 and n?
The largest factor of any number n, other than n itself, must be <= n/2, so there is no need to check the larger numbers.
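If you want to convince yourself by brute force, this illustrative C sketch (not the original code) finds the largest proper divisor of each n in a small range and confirms it never exceeds n/2:

#include <stdio.h>

int main(void)
{
    /* For each n, find the largest divisor d with 1 < d < n by trial
       division; it is always at most n/2 (for primes none is found). */
    for (int n = 4; n <= 30; ++n)
    {
        int largest = 1;
        for (int d = 2; d < n; ++d)
            if (n % d == 0)
                largest = d;
        printf("n = %2d  largest proper divisor = %2d  (n/2 = %2d)\n",
               n, largest, n / 2);
    }
    return 0;
}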