On Mon, Mar 19, 2007 at 08:09:59PM -0300, Luiz Henrique de Figueiredo wrote:
> Anyway, the problem seems to be that when the seeds differ very little
> the first values generated by BSD rand() also differ very little. This
> difference is lost when Lua converts the integer returned by rand() into
> a real number, effectively preserving only the high bits in the result.
> When you call math.random(1,100) from Lua, the low-bit difference vanishes
> and you see the same integer result.
> 
> The Windows code is probably different but not in any essential way: rand
> probably uses a classical linear congruential generator and the same loss
> of bits will occur.
> 
> Lua relies on rand from libc. It must not be too clever about what it does
> with the result of rand lest it break its randomness (which may be weak
> already). So I think the fix for Lua is to call rand() a couple of times
> whenever the seed changes.

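For concreteness, the fix Luiz describes would look something like this in the
seeding path (the function name is illustrative, not the actual lmathlib.c
code):

	#include <stdlib.h>

	/* Sketch of the suggested fix: after reseeding, pull and discard a
	   few values so that nearly identical seeds diverge before the
	   first result reaches the script. */
	void seed_with_warmup(unsigned int seed) {
		int i;
		srand(seed);
		for (i = 0; i < 3; i++)
			(void)rand();  /* discard the start of the run */
	}
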
I notice rand(3) on Linux says:

	The versions of rand() and srand() in the Linux C Library use the same
	random number generator as random() and srandom(), so the lower-order
	bits should be as random as the higher-order bits. However, on older
	rand() implementations, and on current implementations on different
	systems, the lower-order bits are much less random than the higher-order
	bits. Do not use this function in applications intended to be portable
	when good randomness is needed.

	In Numerical Recipes in C: The Art of Scientific Computing (William H.
	Press, Brian P. Flannery, Saul A. Teukolsky, William T. Vetterling; New
	York: Cambridge University Press, 1992 (2nd ed., p. 277)), the following
	comments are made:

              "If you want to generate a random integer between 1 and 10, you should always do it by using high-order bits, as in

                     j = 1 + (int) (10.0 * (rand() / (RAND_MAX + 1.0)));

              and never by anything resembling

                     j = 1 + (rand() % 10);

              (which uses lower-order bits)."

	Random-number generation is a complex topic.  The Numerical Recipes in C
	book (see reference above) provides an excellent discussion of practical
	random-number generation issues in Chapter 7 (Random Numbers).


This seems to describe what's happening: the variation in the return value is
being lost when requesting random values in an interval.
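
A quick way to see the arithmetic (made-up values, assuming glibc's RAND_MAX of
2^31 - 1): two rand() results that differ only in the lowest bit collapse to
the same integer once scaled into a small interval:

	#include <stdio.h>
	#include <stdlib.h>

	int main(void) {
		/* two made-up rand() outputs differing only in the low bit */
		long r1 = 1234567890L;
		long r2 = 1234567891L;
		/* division keeps only the high-order bits, so the low-bit
		   difference disappears in the scaled result */
		int j1 = 1 + (int)(100.0 * (r1 / (RAND_MAX + 1.0)));
		int j2 = 1 + (int)(100.0 * (r2 / (RAND_MAX + 1.0)));
		printf("j1=%d j2=%d\n", j1, j2);  /* both print 58 here */
		return 0;
	}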

I wonder if some variant of this advice could be followed to make fuller use of
the return value, without having to resort to tricks like calling rand() a few
times to discard the start of the run?
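
One variant that occurs to me (purely a sketch; the multiplier below is the
common golden-ratio hashing constant from Knuth, and none of this is from the
Lua sources) would be to pass rand()'s output through a cheap integer mixer
before the scaling step, so that variation in the low-order bits reaches the
high-order bits that survive the conversion:

	#include <stdlib.h>

	/* Sketch only: decorrelate nearby rand() outputs before scaling to
	   [0,1). Because the multiplier is odd, multiply-and-mask is a
	   bijection on [0, RAND_MAX] when RAND_MAX is 2^31 - 1, so no
	   values are lost or duplicated. */
	double rand_unit_mixed(void) {
		unsigned long r = (unsigned long)rand();
		r = (r * 2654435761UL) & (unsigned long)RAND_MAX;
		return r / (RAND_MAX + 1.0);
	}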

Cheers,
Sam