[onerng talk] review of RNGs

ianG iang at iang.org
Wed Jul 8 13:15:21 BST 2015


There are many difficulties here and it takes a while to unravel them. 
If you want to get deep into it, the go-to place is John Denker's stuff.

Entropy is the opposite of information.  What we know isn't entropy, and 
what we don't know is entropy.  We can measure information, but we can't 
by definition measure entropy, because if we look at it, it becomes 
information.  However we can *estimate* entropy by measuring the 
information content, e.g. the information found in a stream of bits.
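
To make that concrete, here's a rough sketch (Python) of the kind of 
first-order estimate that ent reports: the Shannon entropy of the byte 
histogram.  Note it measures only the information visible in the byte 
frequencies, nothing more.

    import math
    import os
    import random
    from collections import Counter

    def bits_per_byte(data):
        # ent-style first-order estimate: Shannon entropy of the
        # byte histogram, in bits per byte
        n = len(data)
        return -sum(c / n * math.log2(c / n) for c in Counter(data).values())

    # A deliberately biased "raw" source: only the low 4 bits of each
    # byte vary, so at most 4 bits of entropy per byte.  (random is a
    # simulation stand-in here, not a real entropy source.)
    raw = bytes(random.getrandbits(4) for _ in range(100000))

    print(bits_per_byte(raw))                 # ~4.0: the bias is visible
    print(bits_per_byte(os.urandom(100000)))  # ~8.0: looks fully random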

So, when you put an RNG into a test like ent, the information 
measurement goes up and down depending on what the test can see.  Adding 
a CRC16 or any deterministic hash-style function on the end will reduce 
the amount of information that the test can extract.  Hence its estimate 
of entropy goes up.
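
You can see the trick in miniature: push that same biased stream through 
a CRC16 and the estimate jumps, though no entropy has been added.  (A 
sketch reusing bits_per_byte() and raw from the previous sketch; this is 
a plain CCITT CRC16, not necessarily OneRNG's exact whitener.)

    def crc16_step(byte, crc):
        # one byte's worth of a plain CCITT CRC16 (poly 0x1021)
        crc ^= byte << 8
        for _ in range(8):
            crc = ((crc << 1) ^ 0x1021) if crc & 0x8000 else (crc << 1)
            crc &= 0xFFFF
        return crc

    # emit one whitened byte per raw byte: the low byte of a running CRC
    crc = 0xFFFF
    whitened = bytearray()
    for b in raw:
        crc = crc16_step(b, crc)
        whitened.append(crc & 0xFF)

    # ~8.0 bits/byte -- yet the true entropy is still ~4 bits per input
    # byte.  The estimate went up; the entropy didn't.
    print(bits_per_byte(whitened))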

But it's important to realise that the tool is being tricked.  It can't 
see past the whitening to the real RNG behind it; it simply sees less 
information.  So either you or it are fooled into reporting "more" 
entropy when actually it's extracting less information.

Which now lays bare the paucity of these measurement tools.  All they 
really tell us is the quality of the expansion algorithm at the end of 
the chain.  E.g., a "perfect" expansion algorithm is ChaCha20 keyed with 
the password "entropy": no tool can extract any information from its 
output, yet we know the entropy is zero.
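
To make that point concrete, a sketch using the pyca/cryptography 
package (an assumption on my part; any ChaCha20 implementation would 
do), with bits_per_byte() from the first sketch:

    import hashlib
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms

    key = hashlib.sha256(b"entropy").digest()   # 32-byte key everyone knows
    nonce = b"\x00" * 16                        # this API takes a 16-byte nonce
    encryptor = Cipher(algorithms.ChaCha20(key, nonce), mode=None).encryptor()
    stream = encryptor.update(b"\x00" * 1000000)  # keystream = encrypted zeros

    # ~7.9998 bits/byte: statistically "perfect" output, actual entropy zero
    print(bits_per_byte(stream))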

This is why we (me?) originally wanted the device's output to be 
unwhitened / unexpanded.  E.g., Intel's RNG came under a lot of 
criticism because we couldn't see past the expander: it was "perfect 
zero information" and we had no idea whether it was RC6("NSA-secret") or 
good physical entropy.

Unfortunately, the market for RNGs grew up on a diet of steampunk ideas: 
that we want nothing but absolutely perfect entropy, always, for 
passwords and RSA keys.  When in fact what we want is enemy-surprisal: 
bits of data that the enemy can't predict.  It turns out that working 
with pure entropy in large quantities is very hard ... whereas working 
with a small amount of entropy and a good expansion function which 
delivers enemy-surprisal is tractable.
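
The tractable pattern looks something like this sketch: a toy 
hash-counter expander, just to show the shape (not NIST SP 800-90A, and 
os.urandom standing in for a slowly-gathered true-entropy seed).

    import hashlib
    import os

    def expand(seed, nbytes):
        # toy counter-mode expander: SHA-256(seed || counter), concatenated
        out = bytearray()
        counter = 0
        while len(out) < nbytes:
            out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
            counter += 1
        return bytes(out[:nbytes])

    seed = os.urandom(32)            # 256 bits of real entropy, gathered slowly
    stream = expand(seed, 1000000)   # megabytes of enemy-surprisal, cheaply

Every bit of that stream is unpredictable to the enemy for exactly as 
long as the seed stays secret.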

In the OneRNG case, we need good entropy (low information), at low 
speed, pumped into say /dev/random.  Then the output of /dev/random is 
fine.
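
On Linux the pumping-and-crediting step is the RNDADDENTROPY ioctl: 
plain writes to /dev/random mix the bytes in but don't credit any 
entropy.  A sketch (needs root; the device path and the credit figure 
are placeholders):

    import fcntl
    import struct

    RNDADDENTROPY = 0x40085203  # _IOW('R', 0x03, int[2]) on x86/ARM Linux

    raw = open("/dev/ttyACM0", "rb").read(64)  # placeholder device path
    credit_bits = len(raw) * 4                 # claim 4 bits/byte: conservative

    # struct rand_pool_info { int entropy_count; int buf_size; __u32 buf[]; }
    req = struct.pack("ii", credit_bits, len(raw)) + raw

    with open("/dev/random", "wb") as dev:
        fcntl.ioctl(dev, RNDADDENTROPY, req)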

Dieharder and ent are actually false sirens.  But they are marketing 
tools nonetheless; we're captured by our own myths.



iang


On 8/07/2015 11:30 am, Paul Campbell wrote:
> On Wed, 08 Jul 2015 21:47:44 phred53 wrote:
>> I don't pretend to really understand, but I couldn't see how CRC16, or for
>> that matter any deterministic algorithm, "post processing" of the random
>> data could _increase_ entropy, but ent said it did.
>
> This is essentially the same observation I was making - I think it sort of
> calls into question the entropy number
>
> On the other hand, when I changed the sampling algorithm in OneRNG to pull
> more sampled bits/byte and to sample each bit more in time, I did see ent's
> measure of our generated entropy pop up from ~7.7 to ~7.9 bits per byte - so
> it increases as I expected
>
>> In addition I can't
>> help but be suspicious of any "improvements" in dieharder test results
>> after whitening; didn't seem to be the case for OneRNG.
>
> On the other hand this is a different issue - running entropy data through a
> CSPRNG will make something that's more likely to pass dieharder, or pass it
> better - whitening has lots of forms; ours (CRC16, which makes an OK 16-bit
> RNG, as OK as you can get in 16 bits) has this effect, just not as strongly
>
> (I like to think of CRC16 as smearing the entropy together in time, kind of
> smoothing it out)
>
>> which appears to indicate that entropy is really about the physics of the
>> device; measuring a bit stream doesn't tell one the true entropy, and I would
>> add particularly if whitened.
>
> Well, think about a software RNG with 100 bits of internal state initialised
> at start time - you can extract 1GB of data and dieharder may love it, but it
> will still only have 100 bits of entropy in this sense - OneRNG is sort of the
> opposite: lots of real entropy, but maybe not the perfect RNG
>
> 	Paul
>


