Is there any published O(log b) algorithm to determine whether a b-bit number is the square of an integer?
(I apologize if this question is beyond the scope of this site and would happily retract it if so.)
Update: I realize that my question as posed is unreasonable, so let me amend it by asking for any algorithm that takes sub-polynomial operations in b (not necessarily O(log^k b) for a constant k) and has "reasonable" space complexity. Operations are defined in the usual sense: whatever is reasonable for the task at hand (e.g. add, negate, xor, and, or, etc.).
Postscript: Now I realize that my question is nonsense: the cost of computing the floor square root of an n-bit number is no more than the cost of multiplying two n-bit numbers.
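For concreteness, the floor-square-root route the postscript alludes to can be sketched like this (a minimal sketch, not the asker's code; `isqrt64` and `is_perfect_square` are names I made up, and the `std::sqrt` seed assumes inputs below 2^62 so the correction step cannot overflow):

```cpp
#include <cmath>
#include <cstdint>

// Floor square root for x < 2^62 (so (r+1)*(r+1) cannot overflow):
// seed with the double-precision sqrt, then repair any rounding error.
uint64_t isqrt64(uint64_t x)
{
    uint64_t r = (uint64_t)std::sqrt((double)x);
    while (r > 0 && r * r > x) --r;      // correct overshoot
    while ((r + 1) * (r + 1) <= x) ++r;  // correct undershoot
    return r;
}

// x is a perfect square exactly when its floor square root squares back to x.
bool is_perfect_square(uint64_t x)
{
    uint64_t r = isqrt64(x);
    return r * r == x;
}
```

The two correction loops each run at most a couple of iterations, since a 53-bit double seed is off by only a few units even for 62-bit inputs.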
- If `b` is small enough, then using a `sqrt` lookup table the complexity is `O(1)`.
- If not, then using bit-by-bit approximation the complexity is `O(b/2) = O(log n)`.
- Using a table of perfect squares the complexity is also `O(b/2) = O(log n)`, but with much faster operations (just compares and bit operations): a `b`-bit number can only be the perfect square of a number of at most `b/2` bits, so the table has `2^(b/2)` entries of `b`-bit numbers, and an approximate index search (similar to binary search) takes `b/2` steps.
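The perfect-square-table bullet can be sketched like this for 32-bit inputs (my own naming, not code from the answer): the table holds the squares of all 2^16 possible 16-bit roots (about 256 KB), and `std::binary_search` plays the role of the approximate index search.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Precomputed table of all squares of 16-bit numbers; since a 32-bit value
// can only be the square of a value of at most 16 bits, membership in this
// sorted table decides "perfect square" with ~16 compares per query.
struct SquareTable
{
    std::vector<uint32_t> squares;
    SquareTable()
    {
        squares.reserve(0x10000);
        for (uint32_t r = 0; r <= 0xFFFF; ++r)
            squares.push_back(r * r);     // ascending, so already sorted
    }
    bool is_square(uint32_t x) const
    {
        return std::binary_search(squares.begin(), squares.end(), x);
    }
};
```

The one-time table build is `O(2^(b/2))` space and time, which is why this only pays off when the same table answers many queries.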
Some improvement can be done with approximation:

- create an approximation function `y = approx_sqrt(x)` and compute `y`
- now just check the values in `< y-c , y+c >`, which has runtime `~T(2c)`, where `c` is a constant dependent on the approximation accuracy (1, 2, 3, ...)

Most approximations are off more on the bigger values, so you can make `c = log(b)` and your complexity is suddenly `O(log b) = O(log log n)`, which is what you are searching for, I think.
[Edit]

Well, after the down-vote I found that Markdown hid part of my text (it assumed some formatting or whatever by `< interval >`), so I added a few spaces to un-hide it. I also found that bullet #5 was wrong, so I deleted it. If that was the reason for the down-vote, then thanks for it; I overlooked it ... I was doing something with primes recently, so my brain copied it there without thinking it through.

As usual, without code and a Wiki page some of you do not believe and just down-vote, so here is my tested implementation of the above:
//---------------------------------------------------------------------------
int is_square(int x,int &cnt) // returns sqrt(x) if x is a perfect square, otherwise 0; cnt = number of cycles ~ runtime
    {
    int y,yy;
    if (x<0) { cnt=0; return 0; }       // negative numbers are never squares
    // y = approx_sqrt(x)
    for (y=x,yy=x;yy;y>>=1,yy>>=2);     // halve the bit count
    if (y) y=(y+(x/y))>>1;              // Babylonian (Newton) refinement
    if (y) y=(y+(x/y))>>1;
    if (y) y=(y+(x/y))>>1;
    // check the estimated y and nearby values
    cnt=1;
    yy=y*y; if (yy==x) return y;
    if (yy<x) for (;;)
        {
        cnt++;
        y++;
        yy=y*y;
        if (yy==x) return y;
        if (yy> x) return 0;
        }
    else for (;;)
        {
        cnt++;
        y--;
        yy=y*y;
        if (yy==x) return y;
        if (yy< x) return 0;
        }
    }
//---------------------------------------------------------------------------
I added `cnt` for you so you can test the complexity yourself. My approximation needs a good start value, so I halve the bit count, which is obviously O(log b); but for bignums and `float` values the exponent/bit count is already known, so that converts to a single bit/word/base/exponent shift, O(1). BTW that is the IEEE float magic done by most of the approximations for `sqrt` or `log` functions.
My measurements are better than my first estimates (even with the imprecise Babylonian approximation):
/*------------------
|     N      | T |
------------------
| 1000000000 | 6 |
|  100000000 | 4 |
|   10000000 | 2 |
|    1000000 | 2 |
------------------*/
where `N` is the loop bound on which it was tested and `T` is the max `cnt` value over the tested numbers in `< 0 , N >`. For a different approximation (better suited to your needs) you can look here.
So my answer to your question is YES, there does exist an algorithm faster than `O(log n)` for determining whether `n` is a perfect square (for instance mine above, which also computes the `sqrt`), but if you are also counting the complexity of the basic functions then I am afraid the answer is NO, because even bit operations are `O(log n)` on bignums!!!
BTW the binary-search `sqrt` can also be done without multiplication.
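For example, the classic digit-by-digit (binary restoring) integer square root does the whole job with only shifts, adds and compares; this is a standard textbook routine, sketched here for 32-bit inputs:

```cpp
#include <cstdint>

// Digit-by-digit (restoring) integer square root: b/2 iterations,
// each using only shift, add/subtract and compare -- no multiplication.
uint32_t isqrt_no_mul(uint32_t n)
{
    uint32_t res = 0;
    uint32_t bit = 1u << 30;        // highest power of 4 that fits in 32 bits
    while (bit > n) bit >>= 2;      // align with the input's magnitude
    while (bit != 0)
    {
        if (n >= res + bit)         // can this result bit be set?
        {
            n -= res + bit;
            res = (res >> 1) + bit;
        }
        else res >>= 1;
        bit >>= 2;
    }
    return res;                     // floor(sqrt(original n))
}
```

Combined with the perfect-square test (`res*res == n`), this gives a multiplication-free pipeline except for the final verification, which can itself be replaced by tracking the remainder `n` (it ends at zero exactly for perfect squares).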
Hope it helps.