I am working on a wireless project that involves simulating the 802.11 MAC protocol. I used a random generator in it. The problem is that we are getting wriggly, not smooth, graphs. I believe the bug is caused by the random generator. To test it, I ran the following code, which generates 100 random numbers between 0 and 19. If you look carefully at the output, there are several consecutive numbers that are identical or very close (e.g. 17, 15, 16, ... or 1, 1, ...). This causes collisions in our simulation, and the throughput falls at those points (hence the wriggly shape). In this case, increasing the simulation run time does not help much.
Can anyone help me figure out how to generate n random numbers in a loop in Java that are really random (without that kind of pattern)?
This is the code to try:
import java.util.Random;

public class RandomTest {
    public static void main(String[] args) {
        int[] counter = new int[20];         // tally of how often each value appears
        Random generator = new Random();
        int randomIndex = 0;
        for (int i = 0; i < 100; i++) {
            randomIndex = generator.nextInt(20); // uniform in [0, 19]
            counter[randomIndex]++;
            System.out.println(randomIndex);
        }
    }
}
Random doesn't mean different every time. Occasional repeated or similar values are expected, especially when you are choosing numbers from such a small range (20 values).
If you do want each number to be different from the preceding number, then you have to program that yourself. One of the simplest ways (though not the most efficient) is to reject any random number that is within distance x of the previous one and draw again, repeating until you get a number you are happy with, as in the sketch below.
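A minimal sketch of that rejection loop, assuming the question's range of 20 and an illustrative minimum distance x = 3:

import java.util.Random;

public class MinDistanceRandom {
    public static void main(String[] args) {
        Random generator = new Random();
        int x = 3;             // minimum allowed distance from the previous value (illustrative)
        int previous = -100;   // sentinel so the first draw is always accepted
        for (int i = 0; i < 100; i++) {
            int candidate;
            do {
                candidate = generator.nextInt(20);
            } while (Math.abs(candidate - previous) < x); // reject values too close to the last one
            previous = candidate;
            System.out.println(candidate);
        }
    }
}

The loop always terminates as long as x is small relative to the range, since values at distance x or more from the previous one always exist.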
The simplest way to avoid duplicates is to use Collections.shuffle():

import java.util.*;

List<Integer> ints = new ArrayList<Integer>();
for (int i = 0; i < 20; i++) ints.add(i); // the values 0-19, once each
Collections.shuffle(ints);                // random order, no duplicates

Similarly, if you want the values 0-19 to appear exactly 5 times each, fill the list this way and then shuffle as before:

for (int i = 0; i < 100; i++) ints.add(i / 5); // 0,0,0,0,0,1,1,... up to 19
Collections.shuffle(ints);
I couldn't say whether the results you have are normal or not. But if you are not satisfied with java.util.Random, have a look at java.security.SecureRandom.
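Since SecureRandom extends java.util.Random, it is a drop-in replacement in the test program; a quick sketch (it improves the statistical quality of the stream, but will not eliminate occasional repeats):

import java.security.SecureRandom;
import java.util.Random;

public class SecureRandomTest {
    public static void main(String[] args) {
        Random generator = new SecureRandom(); // drop-in for new Random()
        for (int i = 0; i < 100; i++) {
            System.out.println(generator.nextInt(20)); // same API as before
        }
    }
}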
Your algorithm looks a bit strange to me. You create 20 ints and then increment a randomly selected one, repeating this 100 times. Why don't you do something like this:
import java.util.Random;

public static int[] randomNumbers(int n, int max) {
    Random r = new Random();
    int[] rndNums = new int[n];
    for (int i = 0; i < n; i++) {
        rndNums[i] = r.nextInt(max); // each entry uniform in [0, max)
    }
    return rndNums;
}
Here is a method that avoids duplicates... (there are more efficient solutions for the cases where n is large and approaches max)
import java.util.HashSet;
import java.util.Random;
import java.util.Set;

public static int[] randomNumbers(int n, int max) {
    Random r = new Random();
    Set<Integer> taken = new HashSet<Integer>();
    int[] arr = new int[n];
    for (int rnd, i = 0; i < n; i++) {
        // keep drawing until we get a value not seen before;
        // Set.add returns false when the value is already present
        while (!taken.add(rnd = r.nextInt(max)));
        arr[i] = rnd;
    }
    return arr;
}
Also, note that the numbers provided by Random are pseudo-random. Check out True random generation in Java for "true" randomness.
There's no guarantee that a sequence of random numbers won't have duplicates; in fact, for small ranges such as the one you're using, it's very likely. The real question (and mathematicians have others) is: are the values evenly distributed over a long sequence?
"Random" does not mean "Evenly distributed." A real random sequence will in fact have occasional clumps of similar numbers, or repeat the same number a few times. If you roll a die three times, won't you once in a while roll 1 three times in a row?
What do you actually want the distribution to look like? You could use a progressing sequence and perturb the number it returns with a random value. This way you get a shaped result with some random interference.
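For example, a sketch assuming an upward linear trend with bounded noise is what you're after:

import java.util.Random;

public class ShapedSequence {
    public static void main(String[] args) {
        Random r = new Random();
        for (int i = 0; i < 100; i++) {
            double base = 0.2 * i;               // the deterministic, progressing part
            double noise = r.nextDouble() - 0.5; // random interference in [-0.5, 0.5)
            System.out.println(base + noise);    // shaped result with jitter
        }
    }
}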
100 samples in a range of just 0 to 19 are almost certain to include some consecutive repeats or near-repeats. If you want successive numbers to all be at least k apart, you could try something like this:
randomIndex = (randomIndex + k + generator.nextInt(20 - 2*k)) % 20;
The - 2*k in the range is to prevent the added random amount from 'wrapping around' to within k units of the current value.
Be aware that, strictly speaking, the values you get this way aren't as random as the raw values from the RNG, but it sounds like pure randomness isn't exactly what you're looking for.
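Put together as a runnable sketch (k = 3 is illustrative):

import java.util.Random;

public class SpacedRandom {
    public static void main(String[] args) {
        Random generator = new Random();
        int k = 3;                               // minimum circular distance between successive values
        int randomIndex = generator.nextInt(20); // the first value is unconstrained
        for (int i = 0; i < 100; i++) {
            System.out.println(randomIndex);
            // jump ahead by at least k and at most 20 - k - 1 (mod 20), so the
            // new value is never within k of the previous one in either direction
            randomIndex = (randomIndex + k + generator.nextInt(20 - 2 * k)) % 20;
        }
    }
}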
Random creates pseudo-randomness that very closely emulates real randomness. I believe the problem behind the question lies in not understanding real randomness, and possibly in not understanding the statistical nature of the wireless problem the simulation is built for.
In using Random to get integers from 0 to 19, here is what one can expect. The probability of getting any particular number at any particular time is 1 in 20 and IS NOT dependent on any previous number. This is an essential property of randomness. The probability that the next number matches the one just obtained is therefore 1 in 20. That means, on average, two of the same numbers in a row occur once out of every 20 draws. That's just the average. Statistically, the occurrences will follow a Poisson distribution. Sometimes matching pairs will happen twice in a row or more; other times, none will appear for long stretches of numbers.
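You can check this empirically; a quick sketch that counts immediate repeats over a million draws (expect roughly 1 in 20 of them, i.e. about 50,000):

import java.util.Random;

public class RepeatFrequency {
    public static void main(String[] args) {
        Random generator = new Random();
        int draws = 1_000_000;
        int repeats = 0;
        int previous = generator.nextInt(20);
        for (int i = 1; i < draws; i++) {
            int current = generator.nextInt(20);
            if (current == previous) repeats++; // immediate repeat of the previous value
            previous = current;
        }
        System.out.println(repeats + " immediate repeats in " + draws + " draws");
    }
}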
The question states that not only is the same number twice in a row a problem, but a number that is too close in value is also a problem. A supposedly random number generator that won't produce the same number twice in a row is definitely NOT random; it is significantly biased and does not resemble randomness at all. A supposedly random number generator that won't produce successive numbers closer together than some predetermined value is biased even more significantly than one that won't produce the same number in succession.
If the 802.11 MAC throughput fails, or the graphs are wiggly rather than smooth, because random numbers are the same or too close in value, perhaps a more helpful question is whether the flaw lies in the wireless design or in the model.
This latter part of the answer is based on what can be gleaned from the sparse details of the question, especially as it describes aspects of 802.11 MAC transmission collisions, random post-collision re-transmission wait times, transmission detection latency and ultimate throughput failures.
A Random object that randomizes wait-time increments for colliding transmitters will give exactly the kind of near-randomness needed to analyze collision behavior. Throughput at this level of analysis may not be smooth; it is wiggly because of the real nature of random behavior by 802.11 MAC stations. If throughput failures occur in the model because of collision issues using this kind of randomness, then most likely there are other difficulties in the model or in the 802.11 MAC design. At this point it might be useful to discuss rudimentary 802.11 MAC operations relating to collisions.
In 802.11 MAC operations, two stations attempting to transmit at the same time is a common occurrence and is what is usually meant by a collision. Neither station gets use of the communication channel. There is a level of attempted channel use that results in what is called congestion failure. The classic strategy for such failures on wired networks is CSMA/CD, meaning Carrier Sense Multiple Access with Collision Detection; 802.11 itself uses the related CSMA/CA (Collision Avoidance), since a wireless station generally cannot detect a collision while it is transmitting. There are QoS (Quality of Service) complexities that can be added to further enhance collision avoidance, but I won't attempt to describe them here.
CSMA/CD works as follows. Stations listen before transmitting and begin transmission only when no other station is transmitting. A transmitting station detects whether another station collides with it. If a collision is detected, the transmitting station ceases transmission and waits a random period of time before reattempting. Collisions occur because of the latency that always exists between any two transmitters, so it is always possible for two well-behaved transmitters to collide.
There are two contributors to latency. The first is the time it takes the transmission medium to carry a signal from one transmitter to another. The second is the time it takes a transmitter, once it detects that no other station is transmitting, to begin its own transmission. This total latency simultaneously affects all possible transmitters. Another frequent occurrence is two stations randomly waiting the same amount of time before attempting transmission again; in that case, if some other station hasn't already begun transmitting and been detected, they will collide again, and will continue colliding until one waits a sufficiently different time from the other for a collision not to recur.
A situation called congestive failure occurs if too many stations are all attempting to transmit at the same time and then randomly waiting to attempt retransmission. Random wait times have to be sufficiently longer than the latency for differing wait times to be distinguishable, so that one station can sense the carrier of another before attempting a retransmission. For instance, wait-time increments that are half the length of the latency effectively cut the pool of usable wait times in half. It therefore makes no sense whatsoever to have a wait-time increment shorter than the latency. Furthermore, the pool of wait times from which to choose should span a sufficient range of values that wait-time collisions occur infrequently, as in the sketch below.
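To make the wait-time pool concrete, here is a minimal sketch of slotted binary exponential backoff in the style CSMA MACs use; the contention-window bounds are illustrative rather than taken from the 802.11 standard's PHY tables. The essential constraint from the discussion above is that one slot must be longer than the total latency, so that stations drawing different slot counts actually hear each other in time.

import java.util.Random;

public class BackoffSketch {
    static final int CW_MIN = 15;   // illustrative contention-window bounds;
    static final int CW_MAX = 1023; // real values are PHY-specific in 802.11

    private final Random r = new Random();
    private int cw = CW_MIN;

    // Pick a random wait, in whole slots, uniformly from [0, cw].
    int backoffSlots() {
        return r.nextInt(cw + 1);
    }

    // On a collision, double the window: a larger pool of wait times
    // makes a repeat wait-time collision less likely.
    void onCollision() {
        cw = Math.min(2 * cw + 1, CW_MAX);
    }

    // On a successful transmission, reset the window.
    void onSuccess() {
        cw = CW_MIN;
    }
}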
I hope this clarifies several things -- the nature of Random, the nature of random sequences and the nature of collisions in 802.11 MAC type networks.
http://en.wikipedia.org/wiki/Cryptographically_secure_pseudorandom_number_generator