Any alternative to a (very slow) deepcopy in a DFS?

Asked 2019-02-25 19:15

I have a graph (stored as a list) containing 81 vertices, each an instance of a Vertex class. Each vertex has 20 neighbors and a set of possible values (from 1 to 9); given the initial constraints of the problem, each vertex has on average 4 or 5 possible values. I implemented a simple DFS on this graph: it picks the vertex with the fewest possible values, and for each of those values it builds a deep-copied graph in which that vertex is fixed to that single value, then passes the copied graph back into the DFS recursively. The issue is speed: cProfiling my code showed that 635 of the 641 seconds my Mac needs to solve the problem are spent inside copy.deepcopy. Are there any workarounds for this? Here is my DFS:

import copy, sys, time #imports this snippet relies on

def dfs(graph):
    global initial_time_counter

    if all(len(i.possible_values)==1 for i in graph):
        sys.exit("Done in: %s" % (time.time()-initial_time_counter))

    #find the unsolved vertex with the fewest possible values
    min_vertex=sorted(filter(lambda x: len(x.possible_values)>1,graph),
                      key=lambda x: len(x.possible_values))[0]

    for value in min_vertex.possible_values:

        sorted_graph_copy=sorted(copy.deepcopy(graph), key=lambda x: len(x.possible_values))
        min_vertex_copy=filter(lambda x: len(x.possible_values)>1,sorted_graph_copy)[0]
        sorted_graph_copy.remove(min_vertex_copy)

        if min_vertex_copy.try_value(value): #Can this vertex accept value -> value?
            min_vertex_copy.set_value(value) #Yes, set it.
            sorted_graph_copy.append(min_vertex_copy) #Append it to the graph.
            dfs(sorted_graph_copy) #Run the DFS again.
    return False

P.S. As the sharpest of you will have realized, this problem is usually called sudoku. Note that I'm not looking for sudoku-specific answers; please analyze the problem in an abstract way.

[Edit]

The same problem, approached with plain built-in representations of the vertices (nested lists of candidate values instead of Vertex objects), is solved in under 0.75 s. I'm posting the whole code for reference, in case anyone runs into a similar problem in the future:

import sys,time

def srange():
    #all (x, y) coordinates of the 9x9 grid
    return [[x,y] for x in range(9) for y in range(9)]

def represent_sudoku(sudoku):
    #print the grid, one row per line, cells separated by "|"
    print "\n".join(["|".join([str(elem) for elem in line]) for line in sudoku])

#Hard sudoku
sudoku=[[4, 0, 0, 0, 0, 0, 8, 0, 5], [0, 3, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 7, 0, 0, 0, 0, 0], [0, 2, 0, 0, 0, 0, 0, 6, 0], [0, 0, 0, 0, 8, 0, 4, 0, 0], [0, 0, 0, 0, 1, 0, 0, 0, 0], [0, 0, 0, 6, 0, 3, 0, 7, 0], [5, 0, 0, 2, 0, 0, 0, 0, 0], [1, 0, 4, 0, 0, 0, 0, 0, 0]]

represent_sudoku(sudoku)

def get_nbs(x,y,sudoku,also_incomplete=False):
    #collect the values already fixed in the same row, column and 3x3 box;
    #with also_incomplete=True, unresolved candidate lists are included as well
    line_nbs=sum([elem for elem in sudoku[y] if ((elem!=[0] and len(elem)==1) or also_incomplete)],[])

    column_nbs=sum([sudoku[xline][x] for xline in range(9) if ((sudoku[xline][x]!=[0] and len(sudoku[xline][x])==1) or also_incomplete)],[])

    area_nbs=[[j for j in i[(x/3)*3:(x/3)*3+3] if ((j!=[0] and len(j)==1) or also_incomplete)] for i in sudoku[(y/3)*3:(y/3)*3+3]]
    area_nbs=sum(sum(area_nbs,[]),[])

    if not also_incomplete:
        return list(set(line_nbs+column_nbs+area_nbs))

    return line_nbs+column_nbs+area_nbs

#wrap every cell into a list so that a cell can hold several candidate values
for x,y in srange():
    sudoku[y][x]=[sudoku[y][x]]

def base_cleanup(sudoku):
    #constraint propagation: strip the neighbours' fixed values from every
    #unsolved cell, and repeat until a full pass changes nothing
    while 1:
        something_changed=False
        for x,y in srange():
            if sudoku[y][x]==[0] or len(sudoku[y][x])>1:
                possible_values=range(1,10) if sudoku[y][x]==[0] else sudoku[y][x]
                sudoku[y][x]=list(set(possible_values)-set(get_nbs(x,y,sudoku)))
                if sudoku[y][x]==[]:
                    return False #contradiction: no candidate left for this cell
                if possible_values!=sudoku[y][x]:
                    something_changed=True #only set, never reset, within a pass
        if not something_changed:
            break
    return sudoku


def dfs(graph):
    global s

    if graph==False:
        return False

    if all(sum([[len(elem)==1 for elem in line] for line in graph],[])):
        represent_sudoku(graph)
        sys.exit("Done in: %s" % (time.time()-s))

    #pick the unsolved cell with the fewest remaining candidates
    enumerated_filtered_sudoku=filter(lambda x: len(x[1])>1, enumerate(sum(graph,[])))
    sorted_enumerated_sudoku=sorted(enumerated_filtered_sudoku,key=lambda x: len(x[1]))
    min_vertex=sorted_enumerated_sudoku[0]

    possible_values=[value for value in min_vertex[1]]

    for value in possible_values:        
        #cheap manual copy: the cell lists are only ever re-bound, never mutated
        #in place, so the rows can be rebuilt without copying each candidate list
        graph_copy=[[elem for elem in line] for line in graph]

        y,x=elements_position[min_vertex[0]]

        if not any(value==i for i in get_nbs(x,y,graph_copy)):
            graph_copy[y][x]=[value]
            graph_copy=base_cleanup(graph_copy) #propagate once; returns False on contradiction
            if graph_copy:
                dfs(graph_copy)

    return False

sudoku = base_cleanup(sudoku)

elements_position = {i:srange()[i] for i in range(81)} #flat cell index used by dfs -> its coordinate pair
s = time.time()

dfs(sudoku)

4 Answers
虎瘦雄心在 · 2019-02-25 19:37

Deepcopy can be a lot slower than simply copying the same amount of data, presumably because of all the effort that goes into detecting loops. If you copy your graph yourself in a way that avoids loops (easy, since you know the network topology) instead of delegating to deepcopy, it's likely to give you a serious speed-up. I got a 50% speedup for copying a very simple data structure element-wise (using comprehensions), and it wouldn't surprise me if complex ones gave even bigger savings.
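
For illustration, a minimal sketch of such a hand-rolled copy, assuming a Vertex that stores its candidate values and the indices of its neighbours (the real Vertex class isn't shown in the question, so these fields are guesses):

class Vertex(object):
    #hypothetical stand-in for the question's Vertex class
    def __init__(self, possible_values, neighbour_ids):
        self.possible_values = list(possible_values)
        self.neighbour_ids = list(neighbour_ids)  #indices into the graph list

def copy_graph(graph):
    #element-wise copy: one new Vertex per vertex, with none of the memo
    #bookkeeping for cycles and shared objects that copy.deepcopy performs
    return [Vertex(v.possible_values, v.neighbour_ids) for v in graph]

If your vertices hold direct references to their neighbour objects rather than indices, you would rebuild those links in a second pass over the copied list.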

Of course there's a bigger speedup to be gained if you can avoid making a complete copy of your entire state at each step. For example, since you're searching depth first, you could switch your approach to maintaining an undo stack: just record each list of choices you selected from, and restore them as you backtrack.
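
A rough sketch of the undo-stack idea on a simplified representation (the dict of candidate lists and the precomputed conflicts map are assumptions made here, not the question's Vertex graph):

def solve(cells, conflicts, trail):
    #cells: dict pos -> list of candidate values
    #conflicts: dict pos -> positions that must not hold the same value as pos
    if all(len(v) == 1 for v in cells.values()):
        return True
    pos = min((p for p in cells if len(cells[p]) > 1), key=lambda p: len(cells[p]))
    for value in cells[pos]:
        if any(cells[q] == [value] for q in conflicts[pos]):
            continue                        #value already taken by a conflicting cell
        mark = len(trail)
        trail.append((pos, cells[pos]))     #remember the old candidate list
        cells[pos] = [value]                #modify the one shared state in place
        if solve(cells, conflicts, trail):
            return True
        while len(trail) > mark:            #backtrack: undo instead of copying
            p, old = trail.pop()
            cells[p] = old
    return False

If you add constraint propagation, every candidate list it narrows gets pushed onto the trail as well, so the same while loop restores all of a failed branch's changes.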

Luminary・发光体 · 2019-02-25 19:50

cPickle is faster than deepcopy:

Line #      Hits         Time  Per Hit   % Time  Line Contents
==============================================================
    15                                            @profile
    16                                            def test():
    17       100       967139   9671.4     95.0       b = deepcopy(a)
    18                                                 #c = copy(a)
    19                                                 #d = ujson.loads(ujson.dumps(a))
    20                                                 #e = json.loads(json.dumps(a))
    21                                                 #f = pickle.loads(pickle.dumps(a, -1))
    22       100        50422    504.2      5.0       g = cPickle.loads(cPickle.dumps(a, -1))
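
Based on that measurement, a drop-in replacement for copy.deepcopy might look roughly like this (fast_copy is a name made up here; it only works for objects the pickle module can serialize, which plain Vertex instances normally are):

try:
    import cPickle as pickle   #Python 2: the C implementation
except ImportError:
    import pickle              #Python 3: pickle is already C-accelerated

def fast_copy(obj):
    #round-trip through the highest pickle protocol instead of copy.deepcopy;
    #shared references inside obj are preserved, just as deepcopy would do
    return pickle.loads(pickle.dumps(obj, -1))

In the question's dfs this would replace the copy.deepcopy(graph) call, e.g. sorted_graph_copy=sorted(fast_copy(graph), key=lambda x: len(x.possible_values)).
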
爱情/是我丢掉的垃圾 · 2019-02-25 19:50

Do you need to copy the whole graph? Typically you only modify a small part of it at any step of the search, so it may be more efficient to use immutable data structures and only rebuild what is needed. This doesn't work for structures with loops, but I suspect your graph is a list?

I solved a similar problem in Clojure, which has native support for immutable structures, and managed to get reasonable efficiency this way. But I don't know of any immutable data-structure libraries for Python (there is a copy-on-write list package somewhere; would that be sufficient?).

[Just to clarify with a simple example: you're right that tuples are immutable, so if you had a tree made of tuples like this, (1, (2, 3), (4, (5, 6))), then you could generate a new tree like (1, (2, 99), (4, (5, 6))) by creating just two tuples; you can reuse (4, (5, 6)) as-is instead of deep-copying it. What Clojure has, and what I don't know of for Python, are much more complex structures (like hash tables) that follow the same principle. It makes DFS really easy, because you don't have to worry about changing values "upstream" at all.]
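
A small sketch of that idea in plain Python, using nothing but tuples (the tree shape and the set_in helper are illustrative, not from the answer):

def set_in(tree, path, value):
    #return a new tuple tree with the element at 'path' replaced;
    #untouched subtrees are shared with the original, not copied
    if not path:
        return value
    i = path[0]
    return tree[:i] + (set_in(tree[i], path[1:], value),) + tree[i+1:]

old = (1, (2, 3), (4, (5, 6)))
new = set_in(old, (1, 1), 99)   #(1, (2, 99), (4, (5, 6)))
print(new[2] is old[2])         #True: the (4, (5, 6)) subtree is shared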

PS: it was only by doing what you are doing, and seeing the costs involved, that I came to appreciate the elegance of Norvig's approach...

叛逆 · 2019-02-25 19:57

I'm not fully cognizant of the sudoku specifics, but what if you used a single copy of your graph and gave each node a class that includes:

1) the list of adjacent vertices, and
2) a "visited" flag that keeps track of what has already been seen and what hasn't?

Your depth-first search can still be recursive, but instead of packaging up modified copies of your graph, you just mark a node as "visited" and keep going until there's nowhere left to go. If your graph is connected, it seems like it should work...
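
A generic sketch of that layout (the Node class is made up for illustration; note that in the question's solver the candidate lists, not just a visited flag, are the state that changes, so they would still need to be restored on backtracking):

class Node(object):
    def __init__(self, value):
        self.value = value
        self.neighbours = []   #direct references to adjacent Node objects
        self.visited = False   #search state lives on the node itself

def dfs(node):
    #walk one shared graph: mark nodes instead of copying anything
    node.visited = True
    for nb in node.neighbours:
        if not nb.visited:
            dfs(nb)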
