[onerng talk] review of RNGs

Bill Cox waywardgeek at gmail.com
Wed Jul 8 18:43:22 BST 2015


On Wed, Jul 8, 2015 at 2:47 AM, phred53 <phred53 at hotmail.com> wrote:

> I don't pretend to really understand but I couldn't see how CRC16, or for
> that matter any deterministic algorithm, "post processing" of the random
> data could _increase_ entropy, but ent said it did. In addition I can't
> help but be suspicious of any "improvements" in dieharder test results
> after whitening; didn't seem to be the case for OneRNG.
>

There is no way to increase entropy as a post-process.  However, you can
squeeze all the predictability out of a TRNG-generated bit stream if you
fully compress it knowing how the TRNG works, creating a truly random
stream.  Cryptographic whitening with a CPRNG is then needed only to
confound attackers who may find correlations in your compressed data that
you did not know about.  Since we need cryptographic whitening anyway, we
might as well skip the compression step and let the CPRNG compress the
stream at the right compression ratio for us.
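To make the "let the CPRNG compress for us" idea concrete, here is a minimal Python sketch (my own illustration, not the OneRNG driver) that conditions raw TRNG bytes with SHA-256, emitting a 32-byte block only after a block of input carrying at least 256 bits of estimated entropy has been absorbed:

```python
import hashlib

def whiten(raw: bytes, entropy_bits_per_byte: float) -> bytes:
    """Hypothetical sketch: hash enough raw input per 32-byte output
    that each input block carries at least 256 bits of estimated
    entropy, so the output rate never exceeds the entropy rate."""
    bytes_needed = int(256 / entropy_bits_per_byte) + 1
    out = bytearray()
    for i in range(0, len(raw) - bytes_needed + 1, bytes_needed):
        out += hashlib.sha256(raw[i:i + bytes_needed]).digest()
    return bytes(out)
```

With 4 bits of estimated entropy per raw byte, 130 raw bytes yield 64 whitened bytes; the compression ratio follows directly from the entropy estimate.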

In my driver, I output no more bits from the CPRNG than the amount of
entropy provably written to it, after dividing by a small safety margin
(1.03) to account for component tolerances (1%) and the small non-ideal
behavior of the TRNG (typically < 0.2%).  This compresses out all the
predictability while cryptographically whitening the stream.
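As a worked example of that accounting (using the 1.03 figure quoted above; the function name is mine):

```python
def safe_output_bits(entropy_bits_in: float, margin: float = 1.03) -> int:
    # Never release more bits than the entropy provably written to the
    # CPRNG, divided by a safety margin covering ~1% component
    # tolerance plus < 0.2% TRNG non-ideal behavior.
    return int(entropy_bits_in / margin)
```

For example, 8000 bits of credited input entropy permit at most 7766 output bits.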

Any TRNG that simply whitens a bit stream of n bits to generate n bits is
dangerous.  Any non-open-source TRNG probably does this, as it allows the
vendor to advertise a higher bit-rate of random data than they actually
generate, which means they make more money.  In fact, they can cheat and
have a PRNG output many times the entropy actually collected, and no one
will ever know the difference, except maybe an attacker who
reverse-engineers the device and learns how to PWN your cryptographic
keys.  This is why being open-source, like OneRNG, is so important.


> There is a Crypto StackExchange discussion about entropy here -
> http://crypto.stackexchange.com/questions/10404/estimating-bits-of-entropy
>
> which appears to indicate that entropy is really about the physics of the
> device; measuring a bit stream doesn't tell one the true entropy and I
> would add particularly if whitened.
>

The device will operate according to the physics that creates the entropy
in the first place.  I prefer thermal noise, since it is provably present,
with no way to get rid of it.  The more common approach is avalanche noise
from a zener diode, but the entropy generated varies from device to
device, and with both the temperature and age of the device.  This is one
reason why good zener TRNGs use two zener sources rather than one.

> My conclusion is that OneRNG's entropy is what you basically state and what
> Shannon entropy (ent) claims for raw mode bit streams. Assuming my capture
> utility set the mode and captured data correctly these are the Shannon
> entropy (ent) for 15MB files from the three raw modes -
>

Ent is probably pretty good at estimating randomness from the raw output
of one OneRNG zener source at a time.  It looks like you measured about
6.6 bits of estimated entropy per byte, with very low serial correlation.
This means the OneRNG is probably waiting long enough between samples for
Ent to do a decent job of estimating entropy.
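Ent's headline figure is the byte-wise Shannon entropy of the observed distribution, which can be reproduced in a few lines of Python (a sketch of that estimator, not ent itself):

```python
import math
from collections import Counter

def shannon_bits_per_byte(data: bytes) -> float:
    """Byte-wise Shannon entropy estimate, as ent reports it."""
    n = len(data)
    return -sum((c / n) * math.log2(c / n)
                for c in Counter(data).values())
```

Note this treats bytes as independent: it cannot see longer-range structure, which is exactly why it can over-estimate the true entropy of a correlated source.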

I would be interested in testing OneRNG output.  I use an entropy
estimator that does a much better job of compressing out correlations
between nearby bits, and so far, it always outperforms ent.  I've been
able to prove ent over-estimates entropy by as much as 15% for some TRNG
output.  Could you send me a copy of your raw data?

With a zener source like OneRNG, and ent's common over-estimation of
entropy, I'd use a safety margin on the compression rate in the CPRNG of at
least 25%, just to be safe.
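In concrete terms (my own arithmetic, assuming ent's 6.6 bits/byte figure and that 25% margin):

```python
import math

def cprng_input_bytes(output_bytes: int,
                      est_bits_per_byte: float = 6.6,
                      margin: float = 1.25) -> int:
    # Raw bytes to feed the CPRNG per block of output, crediting only
    # est_bits_per_byte / margin bits of entropy per raw byte in case
    # ent's estimate runs high.
    credited = est_bits_per_byte / margin
    return math.ceil(output_bytes * 8 / credited)
```

So a 32-byte (256-bit) output would consume 49 raw bytes, rather than the 39 a naive reading of ent's estimate would suggest.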

One thing I would _not_ do is use data from the OneRNG or any other TRNG
to generate cryptographic keys or nonces before cryptographically
whitening the output.  I'd use it to feed /dev/random instead.  With its
CRC mixing of two streams, each of which probably has at least 0.5 bits of
entropy per bit, the bits fed into /dev/random probably have close to 1
bit of entropy, and with cryptographic whitening, you should be in good
shape.
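CRC mixing itself is more involved, but the basic effect of combining two independent, partially-random bit streams can be illustrated with a plain XOR (a toy calculation of my own, not OneRNG's actual mixer):

```python
import math

def H(p: float) -> float:
    """Binary Shannon entropy, in bits, of a coin with P(1) = p."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def xor_p(p1: float, p2: float) -> float:
    """P(b1 XOR b2 = 1) for independent bits with P(1) = p1, p2."""
    return p1 * (1 - p2) + (1 - p1) * p2
```

A bit with P(1) ≈ 0.89 carries about 0.5 bits of entropy; XOR-ing two such independent bits gives P(1) ≈ 0.20, or roughly 0.71 bits per bit.  Combining moves each bit closer to the full 1 bit, and whitening finishes the job.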

Bill