[onerng talk] review of RNGs

Bill Cox waywardgeek at gmail.com
Wed Jul 8 14:38:43 BST 2015


For any given TRNG, if we understand its theoretical operation, and can
model the real devices well, we can accurately estimate the entropy coming
from the device.  This is key for monitoring its health.

There is no such thing as an unbiased TRNG.  Any TRNG that passes the
Dieharder tests directly is cheating by "whitening" bits before you see
them.  If the manufacturer does not specify how this is done, you should be
worried!

"Entropy" per bit is a term we abuse here, but unfortunately, it's
widespread.  It simply means the log base 2 of one over the probability of
being bit's value:

    E(bit) = log2(1/P(bit))

The entropy of a sequence is the sum of the entropies of the bits.  The
outcome of a fair coin has probability 1/2.  For a coin flip, the entropy
of heads is 1 bit, as is the entropy for tails, so it's always 1 bit.
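
Here's that definition as a couple of lines of Python, just as a sketch
(the name surprisal() is my own label for E(bit)):

    import math

    def surprisal(p):
        # E(bit) = log2(1/P(bit)): the entropy contributed by an outcome
        # that had probability p of occurring.
        return math.log2(1.0 / p)

    print(surprisal(0.5))   # fair coin flip: 1.0 bit, heads or tails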

Measuring bits of entropy of a sequence can be done accurately given an
accurate model for the probability of the next bit given all the previous
bits.  For a biased coin, we can do this easily.  For example, what do we
get when a biased coin has probability of coming up heads P(H) = 1/4, and
we see the sequence HH?

    P(H) = 1/4
    P(HH) = 1/4 * 1/4 = 1/16
    E(HH) = log2(1/(1/16)) = log2(16) = 4

The entropy in TT is lower:

    P(T) = 3/4
    P(TT) = 9/16
    E(TT) = log2(16/9) = 4 - 2*log2(3) ≈ 0.830
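
Reproducing both numbers in Python (a sketch, with P(H) = 1/4 as above):

    import math

    p_h, p_t = 0.25, 0.75
    print(math.log2(1 / (p_h * p_h)))   # E(HH) = log2(16)   = 4.0 bits
    print(math.log2(1 / (p_t * p_t)))   # E(TT) = log2(16/9) ~ 0.830 bits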

We can add bits of entropy for independent outcomes.  So, E(HH) = E(H) +
E(H).  For a fair coin, this is just 1 + 1 = 2.  For our biased coin above,
it's 2 + 2 = 4.  For the biased coin, E(TT) = E(T) + E(T) = log2(4/3) +
log2(4/3) = (2 - log2(3)) + (2 - log2(3)) = 4 - 2*log2(3) ≈ 0.830.
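
The same additivity, checked numerically (a sketch, same probabilities as
above):

    import math

    p_h, p_t = 0.25, 0.75
    # Independent flips: per-flip entropies simply add up.
    print(math.log2(1 / p_h) + math.log2(1 / p_h))   # 2 + 2 = 4.0 bits
    print(math.log2(1 / p_t) + math.log2(1 / p_t))   # ~0.830 bits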

We can use this same mathematical model for entropy for correlated
outcomes.  For example, if the probability of the next coin flip being the
same as the previous is 1/4, and the probability of heads or tails is even
for the first flip, then P(HH) = 1/2 * 1/4 = 1/8.  This would be 3 bits of
entropy.
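
In Python, the same arithmetic (a sketch with the probabilities from the
example):

    import math

    p_first_h = 0.5    # first flip is fair
    p_repeat  = 0.25   # probability the next flip repeats the previous one
    p_hh = p_first_h * p_repeat   # 1/2 * 1/4 = 1/8
    print(math.log2(1 / p_hh))    # E(HH) = log2(8) = 3.0 bits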

This model works _exactly_ for any sequence of bits from _any_ TRNG.  How
cool is that?  All you need is the _exact_ model of operation that
describes the probability of the next bit being 1, given all prior history
of output bits.
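
Here's a sketch of what that looks like in code: a generic accumulator
that only needs a model function returning P(next bit = 1 | history).  The
markov_model() below is a made-up illustration matching the correlated-coin
example, not a model of any real TRNG:

    import math

    def sequence_entropy(bits, model):
        # Sum log2(1/P(bit)) over the sequence, where model(history)
        # returns the probability that the next bit is a 1.
        total = 0.0
        for i, bit in enumerate(bits):
            p_one = model(bits[:i])
            p = p_one if bit == 1 else 1.0 - p_one
            total += math.log2(1.0 / p)
        return total

    def markov_model(history):
        # Illustration only: first bit is fair, then each bit repeats the
        # previous one with probability 1/4.
        if not history:
            return 0.5
        return 0.25 if history[-1] == 1 else 0.75

    print(sequence_entropy([1, 1], markov_model))   # 3.0 bits, as above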

For any given TRNG, if we understand its theoretical operation well, and
can model the real devices well, we can accurately estimate the entropy
coming from the device.  For a TRNG which is a true coin flip, the answer
is 1 bit per flip.  For an Infinite Noise Multiplier with a gain between 1
and 2, the entropy per bit is log2(gain).  This enables a simple way to
monitor the health of the TRNG.  If the estimated entropy deviates
significantly from the expected entropy per bit for a long sequence, then
there's probably something wrong.
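
A minimal sketch of that health check, assuming a known gain (the gain,
tolerance, and the estimate_entropy() hook are made-up placeholders; the
real model-based estimator is the device-specific part):

    import math

    GAIN = 1.82                          # example gain, between 1 and 2
    EXPECTED_PER_BIT = math.log2(GAIN)   # expected entropy per output bit
    TOLERANCE = 0.05                     # example allowed deviation

    def check_health(bits, estimate_entropy):
        # estimate_entropy(bits) stands in for the device-specific,
        # model-based entropy estimator described above.
        per_bit = estimate_entropy(bits) / len(bits)
        if abs(per_bit - EXPECTED_PER_BIT) > TOLERANCE:
            msg = "TRNG health check failed: %.3f bits/bit" % per_bit
            raise RuntimeError(msg)
        return per_bit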

A serious limitation of most other TRNGs is that the entropy per bit is
highly dependent on process variation, temperature, and the cycle of the
moon ;-)   However, with care, you can still do the health monitoring
trick.  For example, with the OneRNG, which uses finicky zener avalanche
noise, you get high correlation between outputs if they are measured with
low time between samples, but there should be little correlation if the
time between samples is long.  We can model the output for long delays
between samples as a biased coin, produce a bunch of bits with high delay,
and estimate its entropy using the biased coin model.  This number should
not change rapidly over time, which is something we can monitor, and we can
raise a red flag if there is a sudden change.
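
A sketch of that monitor, modelling the widely spaced samples as a biased
coin (window handling and the alarm threshold are made-up illustrations):

    import math

    def entropy_per_bit(bits):
        # Biased-coin model: average entropy per bit is
        # p*log2(1/p) + (1-p)*log2(1/(1-p)).
        p = sum(bits) / len(bits)
        if p in (0.0, 1.0):
            return 0.0
        return p * math.log2(1 / p) + (1 - p) * math.log2(1 / (1 - p))

    def monitor(windows, max_jump=0.1):
        # Raise a red flag if the estimate changes suddenly between windows.
        previous = None
        for window in windows:
            estimate = entropy_per_bit(window)
            if previous is not None and abs(estimate - previous) > max_jump:
                raise RuntimeError("sudden entropy change: %.3f -> %.3f"
                                   % (previous, estimate))
            previous = estimate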

As for post-process whitening, it's a very good thing.  For example, we can
convert a sequence of highly biased coin flips, where the coin flips
generate only 1/2 bit per flip, into a sequence of entirely random bits 1/2
the length.  Simply feed the sequence into a CPRNG, and wham - you get true
random bits on the output, at 1/2 the rate.
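
One common way to do this kind of conditioning (a sketch using a SHA-256
hash as the whitener, rather than any particular CPRNG) is to hash
fixed-size blocks of the raw stream and keep half as many bits:

    import hashlib

    def whiten(raw_bytes):
        # Compress raw bytes carrying roughly 1/2 bit of entropy per bit
        # into half as many bytes by hashing 64-byte blocks down to 32.
        out = b""
        block = 64
        for i in range(0, len(raw_bytes) - block + 1, block):
            out += hashlib.sha256(raw_bytes[i:i + block]).digest()
        return out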