The N-Queens Problem:
This problem states: given a chess board of size N by N, find the different permutations in which N queens can be placed on the board so that no queen threatens another.
My question is:
What is the maximum value of N for which a program can calculate the answer in a reasonable amount of time? Or, what is the largest N we have seen solved so far?
Here is my program in CLPFD(Prolog):
:- use_module(library(clpfd)).

% generate(+Vars, +N): constrain every variable in the list to the domain 1..N.
generate([],_).
generate([H|T],N) :-
    H in 1..N,
    generate(T,N).
% lenlist(+L, ?N): N is the length of list L (accumulator version).
lenlist(L,N) :-
    lenlist(L,0,N).

lenlist([],N,N).
lenlist([_|T],P,N) :-
    P1 is P+1,
    lenlist(T,P1,N).
% queens(+N, ?L): the I-th element of L is the row of the queen in column I.
queens(N,L) :-
    generate(L,N), lenlist(L,N),
    safe(L),
    !,                      % don't backtrack into generate/2 for longer lists
    labeling([ffc],L).
% notattack(+X, +Xs): X shares no row and no diagonal with any queen in Xs;
% the counter is the column distance to the queen currently being checked.
notattack(X,Xs) :-
    notattack(X,Xs,1).

notattack(_,[],_).
notattack(X,[Y|Ys],N) :-
    X #\= Y,
    X #\= Y - N,
    X #\= Y + N,
    N1 is N + 1,
    notattack(X,Ys,N1).
% safe(+L): every queen is compatible with all the queens to its right.
safe([]).
safe([F|T]) :-
    notattack(F,T),
    safe(T).
This program works just fine, but the time it takes keeps increasing with N. Here is a sample execution:
?- queens(4,L).
L = [2, 4, 1, 3] ;
L = [3, 1, 4, 2] ;
No
This means you place the 4 queens at Row 2 in Column 1, Row 4 in Column 2, Row 1 in Column 3, and Row 3 in Column 4 (on a 4-by-4 chess board).
Now let's see how this program performs (time taken to compute the first permutation):
For N = 4, 5, ..., 10 it computes within a second.
For N = 11-30 it takes between 1 and 3 seconds.
For N = 40..50 it still computes within a minute.
At N = 60 it runs out of global stack (the search space being enormous).
This was a past homework problem. (The original problem was just to code N-Queens.)
I am also interested in seeing alternate implementations in other languages (which perform better than my implementation), or whether there is room for improvement in my algorithm/program.
This fascinating lack of predictability in backtrack-complexity for different board sizes was the part of this puzzle that most interested me. For years I've been building a list of the 'counts' of algorithm steps needed to find the first solution for each board size - using the simple and well known depth-first algorithm, in a recursive C++ function.
Here's a list of all those 'counts' for boards up to N=49 ... minus N=46 and N=48 which are still work-in-progress:
http://queens.cspea.co.uk/csp-q-allplaced.html
(I've got that listed in the On-Line Encyclopedia of Integer Sequences (OEIS) as A140450.)
That page includes a link to a list of the matching first solutions.
(My list of First Solutions is OEIS Sequence number A141843)
I don't primarily record how much processing time each solution demands; rather, I record how many failed queen placements were needed before discovering each board's algorithmically-first solution. Of course the rate of queen placements depends on CPU performance, but given a quick test run on a particular CPU and a particular board size, it's an easy matter to calculate how long it took to find one of these 'found' solutions.
For example, on an Intel Pentium D 3.4GHz CPU, using a single CPU thread -
My current 2.8GHz i7-860 is thrashing through about 28.6 million queens per second, trying to find the first solution for N=48. So far it has taken over 550 days (theoretically, if it had never been interrupted) to UNsuccessfully place 1,369,331,731,000,000 (and rapidly climbing) queens.
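(For instance, 1,369,331,731,000,000 placements at 28.6 million per second works out to about 4.8 × 10^7 seconds, i.e. roughly 554 days of continuous computation, which matches that figure.)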
My web site doesn't (yet) show any C++ code, but I do give a link on that web page to my simple illustration of each one of the 15 algorithm steps needed to solve the N=5 board.
It's a delicious puzzle indeed!
There is no limit. That is, checking for the validity of a solution is more costly than constructing one solution plus seven symmetrical ones.
See Wikipedia: "Explicit solutions exist for placing n queens on an n × n board for all n ≥ 4, requiring no combinatorial search whatsoever."
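For illustration, here is a minimal sketch in Python (the name staircase is mine) of the simplest of those explicit patterns, the well-known "staircase" placement, which works whenever N mod 6 is not 2 or 3; the remaining residues need the modified constructions described on that page.

# Simplest explicit construction: for N mod 6 not in {2, 3}, listing the even
# columns 2, 4, ... and then the odd columns 1, 3, ... row by row gives a
# valid placement with no search at all.
def staircase(n):
    if n < 4 or n % 6 in (2, 3):
        raise ValueError("this simple pattern needs the modified constructions")
    if n % 2 == 0:
        return list(range(2, n + 1, 2)) + list(range(1, n, 2))
    # Odd N: solve the (N-1) x (N-1) board and add one queen in the corner.
    return staircase(n - 1) + [n]

print(staircase(6))   # [2, 4, 6, 1, 3, 5]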
I dragged out an old Delphi program that counted the number of solutions for any given board size, did a quick modification to make it stop after one hit and I'm seeing an odd pattern in the data:
The first board that took over 1 second to solve was n = 20; 21 solved in 62 milliseconds, though. (Note: timings are based on Delphi's Now, not any high-precision timer.) 22 took 10 seconds, something not repeated until 28.
I don't know how good the optimization is, as this was originally a highly optimized routine from back when the rules of optimization were very different. I did do one thing very differently from most implementations, though: it has no board. Rather, I track which columns and diagonals are attacked and add one queen per row. This means 3 array lookups per cell tested and no multiplication at all. (As I said, from when the rules were very different.)
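A rough sketch of that board-less idea in Python (this is not the original Delphi routine): one queen per row, with an "attacked" flag kept per column, per "/" diagonal (indexed by row + col) and per "\" diagonal (indexed by row - col), so each candidate square costs three array lookups.

# Board-less backtracking: one queen per row, attacked columns and diagonals
# tracked in three flag arrays; no board and no multiplication.
def first_solution(n):
    cols  = [False] * n            # attacked columns
    diag1 = [False] * (2 * n - 1)  # "/" diagonals, indexed by row + col
    diag2 = [False] * (2 * n - 1)  # "\" diagonals, indexed by row - col + n - 1
    queens = []                    # queens[row] = column of that row's queen

    def place(row):
        if row == n:
            return True
        for col in range(n):
            if cols[col] or diag1[row + col] or diag2[row - col + n - 1]:
                continue
            cols[col] = diag1[row + col] = diag2[row - col + n - 1] = True
            queens.append(col)
            if place(row + 1):
                return True
            queens.pop()
            cols[col] = diag1[row + col] = diag2[row - col + n - 1] = False
        return False

    return queens if place(0) else None

print(first_solution(8))   # first solution found for an 8 x 8 board (0-indexed columns)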
Now for some real insanity: 29 took 9 seconds. 30 took almost 6 minutes!
A short solution was presented by Raymond Hettinger at PyCon: Easy AI in Python. Computing all permutations is not scalable, though (O(n!)).
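For reference, the permutation-based brute force looks roughly like this (a sketch, not Hettinger's code); a permutation already guarantees distinct rows and columns, so only the diagonals need checking:

from itertools import permutations

# Try every one of the n! column permutations until a diagonal-free one is found.
def brute_force(n):
    for perm in permutations(range(n)):
        if all(abs(perm[i] - perm[j]) != j - i
               for i in range(n) for j in range(i + 1, n)):
            return perm
    return None

print(brute_force(8))   # feasible only for small n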
As to what is the largest N solved by computers, there are references in the literature in which a solution for N around 3×10^6 has been found using a conflict-repair algorithm (i.e. local search). See for example the classical paper of [Sosic and Gu].
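A minimal sketch of that conflict-repair idea (plain min-conflicts local search; this is not the Sosic and Gu algorithm itself, just the general scheme):

import random

# Min-conflicts local search: start from a random permutation and repeatedly
# move some conflicted queen (one queen per column) to its least-conflicted row.
def min_conflicts(n, max_steps=100000):
    rows = list(range(n))
    random.shuffle(rows)           # rows[col] = row of the queen in column col

    def conflicts(row, col):
        return sum(1 for c in range(n)
                   if c != col and (rows[c] == row or
                                    abs(rows[c] - row) == abs(c - col)))

    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(rows[c], c) > 0]
        if not conflicted:
            return rows            # no queen is attacked: a valid placement
        col = random.choice(conflicted)
        rows[col] = min(range(n),
                        key=lambda r: (conflicts(r, col), random.random()))
    return None                    # give up after max_steps; restart in practice

print(min_conflicts(100))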
As to exact solving with backtracking, there exist some clever branching heuristics which achieve correct configurations with almost no backtracking. These heuristics can also be used to find the first k solutions to the problem: after finding an initial correct configuration, the search backtracks to find other valid configurations in the vicinity.
References for these almost-perfect heuristics are [Kale 90] and [San Segundo 2011].
If you only want one solution, it can be found greedily in linear time, O(N); my code for this is in Python. Printing the board, however, takes O(N^2) time, and Python being a slower language, anyone can try implementing it in other languages like C/C++ or Java. But even with Python it will get the first solution for n = 1000 within 1 or 2 seconds.