I've stayed up far too long today, so I just had one of my crazy ideas. I'd like to see how many guesses a random number generator needs, on average, to hit the correct number when it plays the "higher or lower" game, and how that changes with the size of the starting range. An example:
I choose the number 63, out of the numbers 1 through 100.
I set the RNG to pick a random number from 1 to 100.
It outputs 41.
I set it to pick a number from 42 to 100.
It outputs 88.
I set it to pick a number from 42 to 87.
It outputs 55.
I set it to pick a number between 56 and 87.
It outputs 85.
I set it to pick a number between 56 and 84.
It outputs 63.
It took the RNG five guesses to output the correct number.
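For anyone who'd rather let a script do the guessing, here's a minimal Python sketch of one round of the game described above. It's just my own take: `random.randint` stands in for random.org, and `play_round` is a name I made up.

```python
import random

def play_round(target, low, high):
    """Randomly guess within [low, high], narrowing the range after
    each guess, until the target is hit; return the guess count."""
    guesses = 0
    while True:
        guess = random.randint(low, high)  # inclusive on both ends
        guesses += 1
        if guess == target:
            return guesses
        if guess < target:
            low = guess + 1   # too low: raise the floor
        else:
            high = guess - 1  # too high: lower the ceiling

# One round like the example above: target 63, range 1-100
print(play_round(63, 1, 100))
```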
I'd like to do this experiment a large number of times with several different ranges of numbers, to see if there is a correlation between the size of the range and the number of guesses the RNG takes. The hypothesis is that there is: the larger the range, the more guesses it should take.
So, a list of ranges to test:
1-10
1-25
1-50
1-100
1-250
1-500
1-1000
1-2500
1-5000
1-10000
Each range would be tested a large number of times (any idea how many?), and then the guess counts would be averaged for each range.
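If anyone would rather script the whole run than click through random.org by hand, here's a rough Python sketch. The 10,000 trials per range is just my own stab at "a large number," and `guesses_needed` is a helper I made up:

```python
import random

RANGES = [10, 25, 50, 100, 250, 500, 1000, 2500, 5000, 10000]
TRIALS = 10_000  # my guess at "a large number"; raise it for smoother averages

def guesses_needed(target, low, high):
    """Count random guesses, narrowing the range each time, until target is hit."""
    count = 0
    while True:
        guess = random.randint(low, high)
        count += 1
        if guess == target:
            return count
        if guess < target:
            low = guess + 1
        else:
            high = guess - 1

for top in RANGES:
    # Pick a fresh secret number for every trial, then average the guess counts
    total = sum(guesses_needed(random.randint(1, top), 1, top)
                for _ in range(TRIALS))
    print(f"1-{top}: average {total / TRIALS:.2f} guesses")
```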
This is purely for fun, so there is one "rule." Researching is forbidden until all results are in. Even if you know what the results should look like before they are in, keep them a secret. Think of it as not spoiling the ending of a movie for someone who hasn't seen it.
So, if you'd like to take part in this experiment, just pick one of the ranges and begin. Use http://www.random.org/ and post your results here.
Thanks, and have fun!