Very fast sampling from a set with a fixed number of elements

Published 2019-06-17 18:55

Question:

I need to sample a number uniformly at random from a fixed-size set, do some calculation on it, and put the resulting number back into the set. (The number of samples needed is very large.)

I've tried to store the numbers in a list and use random.choice() to pick an element, remove it, and then append the new element. But that's way too slow!

I'm thinking of storing the numbers in a NumPy array, sampling a list of indices, and performing the calculation for each index.

  • Is there a faster way of doing this?
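The NumPy idea from the question can be sketched roughly as below. The sizes and the calculation are placeholders, not from the original; note one caveat: if the same index is drawn more than once in a batch, a vectorized update applies the calculation to the old value only once, rather than chaining it, so this only matches the sequential loop when duplicate draws are rare or the calculation order doesn't matter.

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical fixed-size pool of numbers
pool = rng.integers(0, 1 << 20, size=10_000).astype(np.float64)

# sample many indices at once (with replacement, i.e. uniform each draw)
idx = rng.integers(0, len(pool), size=100_000)

# placeholder calculation -- replace with the real one;
# duplicated indices are each updated from the OLD value, last write wins
pool[idx] = np.sqrt(pool[idx]) + 1.0
```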

Answer 1:

Python lists are implemented internally as arrays (like Java ArrayLists, C++ std::vectors, etc.), so removing an element from the middle is relatively slow: all subsequent elements have to be reindexed. (See http://www.laurentluce.com/posts/python-list-implementation/ for more on this.) Since the order of elements doesn't seem to be relevant to you, I'd recommend you just use random.randint(0, len(L) - 1) to choose an index i, then use L[i] = calculation(L[i]) to update the ith element.
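The cost difference can be seen with a quick (illustrative, not rigorous) timing sketch: popping from a random position in a list shifts all subsequent elements, while assigning to an index does not. The list size, iteration counts, and the `x + 1` calculation here are arbitrary assumptions.

```python
import random
import timeit

L = list(range(100_000))

def remove_append():
    i = random.randrange(len(L))
    x = L.pop(i)       # O(n): every element after i is shifted down
    L.append(x + 1)    # placeholder for the real calculation

def in_place():
    i = random.randrange(len(L))
    L[i] = L[i] + 1    # O(1): no shifting at all

t1 = timeit.timeit(remove_append, number=10_000)
t2 = timeit.timeit(in_place, number=10_000)
print(f"pop+append: {t1:.3f}s   in-place: {t2:.3f}s")
```

On lists this size the in-place version is typically faster by well over an order of magnitude.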



Answer 2:

I need to sample uniformly at random a number from a set with fixed size, do some calculation, and put the new number back into the set.

from random import randrange

s = list(someset)           # store the set as a list
while True:
    i = randrange(len(s))   # choose a random index
    x = s[i]
    y = your_calculation(x) # do some calculation
    s[i] = y                # put the new number back into the set


Answer 3:

random.sample(a set or list or NumPy array, Nsample) is very fast, but it's not clear to me if you want anything like this:

import random

Setsize = 10000
Samplesize = 100
Max = 1 << 20
bigset = set( random.sample( range(Max), Setsize ))  # initial subset of 0 .. Max

def calc( aset ):
    return set( x + 1 for x in aset )  # << your code here

# sample, calc a new subset of bigset, add it --
# (random.sample on a set is deprecated since Python 3.9, so convert to a list first)
for _ in range(3):
    asample = random.sample( list(bigset), Samplesize )
    newset = calc( asample )  # new subset of 0 .. Max
    bigset |= newset

You could use Numpy arrays or bitarray instead of set, but I'd expect the time in calc() to dominate.

What are your Setsize and Samplesize, roughly?