

Note that this also applies to other cases where one wants reproducibility, notably in multiplayer games: if the PRNG is used to generate a map or a sequence of events that all players in the same party will share, the game server does not need to host the maps or serve all the events itself, since each client will see the same party under the same conditions (the game server or the clients may still impose challenges to assert that other players are not "cheating" with a modified sequence of PRNG-generated events or maps with collectable objects).
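(A minimal sketch in Lua of that shared-seed idea: the server sends only a seed, and each client regenerates the identical map locally. It assumes the host's math.random behaves identically for a fixed seed on every client, which is precisely the guarantee the standard library does not make, hence the advice later in this thread to ship your own PRNG.)

-- The server distributes only `shared_seed`; every client rebuilds the
-- same 16x16 tile map from it (assuming identical math.random everywhere).
local shared_seed = 123456789           -- received once from the game server
math.randomseed(shared_seed)
local map = {}
for y = 1, 16 do
  map[y] = {}
  for x = 1, 16 do
    map[y][x] = math.random(0, 3)       -- tile id, identical on every client
  end
end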

Reproducible PRNGs are quite common and wanted in many games... but they are always undesired in security solutions for proofs of authentication, non-repudiation, and all uses of PKI and strong cryptography (notably the generation of public/private key pairs). A strong RNG must not be easily reproducible and must act as a totally opaque oracle or black box with unpredictable responses, just like thermal noise in large masses of fluid with a very high number of particles and uncountable zillions of degrees of freedom, or like disintegration events at quantum scales, where all that can be measured is some statistics, possibly with a non-flat distribution of probabilities, which an RNG can accommodate given a sufficiently large number of measurement types. Strong security requires a minimum level of "entropy", commonly measured in bits per second; really strong, military-grade RNGs use entropy sources involving a number of new free particles on the order of a "mole per second", i.e. on the order of the Avogadro constant each second.

--

Recall: the Avogadro constant is 6.02214076×10^23 per mole. In the past it was measured, but it was recently **fixed as a constant** as part of the SI redefinition that also redefined the "kilogram" (now derived from the fixed Planck constant) instead of defining it from a reference cylinder of metal and a few copies. Those references proved very difficult to keep in a stable state: they drifted by about 15 to 50 micrograms per century, with no way to determine which reference was the best or why some copies had gained, or sometimes lost, mass. The redefinition adopted at the international conference in Paris, based on recent measurements, comparisons and averaging of the copies, means that all SI units are no longer based on physical artifacts but on universal physical constants. Now the reference pieces of metal will themselves be measured, to study why their mass varies over time; there are complex reasons, including effects on the measurement devices of variations in gravity, temperature and environmental gases, natural oxidation, and electromagnetic fields (including radio) acting on their surface. The adoption of the new universal kilogram is a very significant step for increasing the precision of measurements in all branches of physics by several orders of magnitude, and it was required for advances in fundamental particle physics and in cosmology, notably research on black holes, dark matter, dark energy, and all aspects of quantum physics: can we see other measurable and predictable quantities, or find new interactions, in what was previously considered "unpredictable random noise"?

If fundamental physics discovers new laws of interaction, maybe we'll need more entropy sources in the future to reach the future "military grade", but for now an entropy source of one mole per second is sufficient to generate about 79 bit/s (with the currently known physics applicable to everything we can see at all measurable scales, from the very large size of our visible universe down to the very small Planck length, both being our current "horizons" beyond which we still can't see anything with the interactions we know and can currently observe and measure). If new interactions are discovered, new SI units will be adopted for them (several proposed models suggest that our universe has more than 4 dimensions, and that at least 3 of them are hidden from our eyes; they could explain dark matter or dark energy, why our universe still seems to inflate at an accelerating rate, and why "black holes" emit "flashes" of energy in very tight directions; the rationalisation of the SI will allow such discoveries of missing fundamental laws and units, by first allowing all our measurement devices to improve their precision).
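(That 79 bit/s figure appears to be just the base-2 logarithm of the Avogadro constant: one selection per second among N_A equally likely outcomes carries log2(N_A) bits per second. A quick check in Lua:)

-- Reproducing the ~79 bit/s figure as log2 of the Avogadro constant.
local N_A = 6.02214076e23
print(math.log(N_A, 2))   --> 78.99..., i.e. about 79 bits per second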

So a military-grade RNG would generate at least 79 bit/s of entropy from a flow of 1 mol/s of fluid, i.e. 16 grams of (atomic) oxygen per second, so this does not require a large installation. You can even collect more bits if you use different fluids featuring measurable electromagnetic effects, and you can reduce the size of the device by using a liquid instead of a gas: 18 grams of water per second is like flowing a small cup of coffee each second. To reduce the device even more, you may use the thermal noise in some complex solids with a "heavy" molar mass, like iron (the most stable metal), or even better lead, gold, platinum or uranium (which also has a measurable radioactivity, with "random" decay events at a sufficient frequency)...

Inside any electronic chip working with the thermal noise of electrons switching an NP gate on and off in a single transistor, this requires a flow of just 6.02214076×10^23 electrons per second, and this current (measured in amperes) is not very huge: it is easy to include a strong RNG in any chip. If you don't get enough bits, or the current is too strong because of thermal constraints in densely designed chips where gates approach the size of a few atoms (where such a current would rapidly damage the NP gate and the counter), we can just integrate more NP gates in the chip: today's chips have millions of transistors, CPUs and RAM devices have billions, and CPUs now integrate thermal sensors spread everywhere on their design to avoid damage and locally adapt the processing speed... All modern CPUs should be able to collect these sources of entropy to build a strong RNG, featured directly in their instruction set (however, PRNGs implemented in software should not be based on the "high-performance" counters inside CPUs: those were not designed to measure any "random" noise, only predictable events)!
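(A concrete illustration of the bit-collection step: the classic von Neumann extractor turns biased but independent raw bits from such a noise source into unbiased output bits. This is a standard technique, not anything specified by a CPU vendor; read_raw_bit below is a hypothetical sampling function, not a real API.)

-- Von Neumann extractor: sample raw bits in pairs; a (0,1) pair emits 0,
-- a (1,0) pair emits 1, and (0,0)/(1,1) pairs are discarded. The output
-- is unbiased as long as the raw bits are independent.
local function debias(read_raw_bit, nbits)
  local out = {}
  while #out < nbits do
    local a, b = read_raw_bit(), read_raw_bit()
    if a ~= b then out[#out + 1] = a end
  end
  return out
end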

You also don't need to use a radioactive source measuring the variable time between disintegrations. Such a source generates a very large entropy per second, even from a material with a modest activity level (measured in becquerels, i.e. decays per second), such as natural carbon; however its mean decay period is too long, so even if it generates enough bit/s overall, most new bits come in large chunks at too-rare instants. Military-grade RNGs preferably use more active materials such as natural uranium or caesium, but this requires heavy and costly shielding to protect the health of the device's human operators while still keeping the device mobile and easy to transport.
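(For illustration, a classic way to turn decay timings into bits compares consecutive inter-arrival intervals: a shorter-then-longer pair emits one bit value, longer-then-shorter the other, and ties are discarded. next_decay_time below is a hypothetical detector-reading function, not a real API.)

-- Decay-timing extractor: compare consecutive inter-arrival intervals.
local function decay_bits(next_decay_time, nbits)
  local bits = {}
  local t0 = next_decay_time()
  while #bits < nbits do
    local t1 = next_decay_time()
    local t2 = next_decay_time()
    local d1, d2 = t1 - t0, t2 - t1
    if d1 < d2 then bits[#bits + 1] = 0
    elseif d1 > d2 then bits[#bits + 1] = 1 end   -- equal intervals discarded
    t0 = t2
  end
  return bits
end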

In combustion engines, you can also use the thermal noise of explosions or of the flow of ejected gas and particles, but the problem is the quality of the sensors and their rapid degradation. Acoustic noise in free air (e.g. in thermal regulators) is also used sometimes, but microphones collect it poorly, it is difficult to isolate the RNG in a small black box as needed for an opaque oracle, and this source is a bit sensitive to the internal data that applications want to protect. Some smartphones use the thermal noise of their camera sensors between captured snapshots, when the sensor is shuttered or discharged: the electrons randomly collected on the CCD sensor and instantly dissipated when not capturing an image are very good sources of entropy, provided there is a small counter of these small discharges. It is the same noise you can easily see in the darkest areas of a camera snapshot, before the raw image is numerically processed to remove it with a Gaussian-like filter; the filter does not eliminate the noise completely in the darkest areas, as doing so would also mask a lot of detail needed for photographic quality everywhere else.



On Sun, May 24, 2020 at 15:40, Philippe Verdy <verdyp@gmail.com> wrote:
One solution: remove any assertion of quality from the reference manual; put it instead in the release notes or documentation of the provided default implementation, adding all the information developers need to choose that implementation or tune their application as needed, either using the provided implementation or writing their own.

At least this way it is documented somewhere in the appropriate place. The reference will be updated only when there's agreement about the minimum level that implementations must conform to.

And an application that has strict requirements on random generators, or that depends on a specific algorithm (for long-term reproducibility, e.g. in puzzle-like games where each challenge is fed by an initial seed that can be reproduced at any time and anywhere), should provide its own RNG (if it wants higher security than the minimum set in the specification) or its own PRNG (if it wants reproducibility), as sketched below.
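(A sketch of that "provide your own PRNG" option for Lua 5.3+, whose native 64-bit integers make this easy: a self-contained xorshift64* generator whose sequence depends only on the seed, so it reproduces the same challenge everywhere regardless of the host's math.random. The algorithm choice here is purely illustrative.)

-- Self-contained, reproducible PRNG (xorshift64*), independent of the
-- host's math.random. Needs Lua 5.3+ integer bitwise operators.
local function new_prng(seed)
  local state = seed
  if state == 0 then state = 0x9E3779B97F4A7C15 end  -- avoid the all-zero state
  return function(m, n)                  -- returns an integer in [m, n]
    state = state ~ (state >> 12)
    state = state ~ (state << 25)
    state = state ~ (state >> 27)
    local r = (state * 0x2545F4914F6CDD1D) & 0x7FFFFFFFFFFFFFFF
    return m + r % (n - m + 1)           -- slight modulo bias, fine for a sketch
  end
end

-- Two clients seeded with the same challenge number get the same sequence:
local rand = new_prng(2020)
for i = 1, 5 do io.write(rand(1, 100), " ") end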


On Wed, May 20, 2020 at 16:37, Lorenzo Donati <lorenzodonatibz@tiscali.it> wrote:
On 20/05/2020 15:24, Ahmed Charles wrote:
>
>
>> On May 19, 2020, at 5:09 AM, Lorenzo Donati
>> <lorenzodonatibz@tiscali.it> wrote:

[snip]

>>
>> Otherwise a user could not tell if the built-in generator is good
>> for their application, bar knowing it's no good for crypto stuff.
>
> I think that if you’re wanting to display a random quote from a list
> (like fortune), the current info is just fine. And if what you want
> is more complex than that, you probably want to pick a specific
> solution or at least read the source. I don’t think Lua benefits much
> from over specifying the implementation details of random number
> generation.
>
> Ideally the source would have comments about the algorithm with links
> to more info, but the manual is better off being vague here, in my
> opinion.
>

The problem here is that I think the manual is giving either too much or
too little information.

If you want to completely hide implementation details, you don't state:

"The results from this function have good statistical qualities,"

because this sentence has meaning only for people who /do/ want some
more info about the quality of implementation.

So, IMO, you either expunge that reference to quality altogether, or
provide at least some /precise/ reference. And I don't think everyone
/using/ Lua should have to search for that info in the C source.

BTW, Lua could be embedded in an application that only provides a link
to the user manual. An interested user (who might not even be a
programmer) should be able to "follow the link" and discover at least
the name of the algorithm used, without even having to think of
downloading the source just to see if they can find something somewhere
in a cryptic C file comment!

Otherwise, as I said, just avoid mentioning the quality issue at all,
because as it stands that sentence is useless except to a very narrow
category of people, i.e. Lua developers who are also expert C
programmers AND who have an interest in random number generator quality.

Anyway, I would much prefer that little bit of added information. After
all, it is the Lua /Reference/ Manual; it should give references to a
Lua programmer/user without forcing them to learn C.

-- Lorenzo