[5152] in cryptography@c2.net mail archive

home help back first fref pref prev next nref lref last post

Re: depleting the random number generator

daemon@ATHENA.MIT.EDU (David Honig)
Mon Jul 19 22:59:51 1999

Date: Mon, 19 Jul 1999 09:05:58 -0700
To: bram <bram@gawth.com>, cryptography@c2.net
From: David Honig <honig@sprynet.com>
In-Reply-To: <Pine.LNX.4.04.9907181700580.902-100000@ultra.gawth.com>


Bram, the *nix /dev/random code *does* accumulate a pool of 'physical'
(interrupt, interrupt-timing) entropy and stirs and extracts bits via a
cryptographically secure hash (e.g., MD5 in FreeBSD).  And you can easily
enlarge this pool by modifying the code.  But the pool is always finite,
and therefore depletable.  Note that you can't tell whether you've run
out of entropy by examining the output of /dev/urandom, since it's
a strong PRNG.  Yarrow doesn't change this.
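The mix-and-extract scheme described above can be sketched in a few lines.
This is an illustrative toy, not the actual kernel code: the class name,
pool size, and bookkeeping are invented here, and MD5 stands in only
because the message mentions FreeBSD's use of it.

```python
import hashlib

class ToyEntropyPool:
    """Toy sketch of a /dev/random-style pool (hypothetical design,
    not the real kernel implementation)."""

    def __init__(self, size=64):
        self.pool = bytearray(size)
        self.estimated_bits = 0  # finite estimate -- hence depletable

    def mix(self, event: bytes, entropy_bits: int):
        # Stir the event into the pool via a hash (stand-in for the
        # kernel's mixing function).
        digest = hashlib.md5(bytes(self.pool) + event).digest()
        for i, b in enumerate(digest):
            self.pool[i % len(self.pool)] ^= b
        self.estimated_bits = min(self.estimated_bits + entropy_bits,
                                  len(self.pool) * 8)

    def extract(self, nbytes: int) -> bytes:
        # /dev/random-style behavior: refuse (block) once the entropy
        # estimate runs out; /dev/urandom would keep hashing regardless.
        if self.estimated_bits < nbytes * 8:
            raise BlockingIOError("pool depleted")
        out = b""
        counter = 0
        while len(out) < nbytes:
            out += hashlib.md5(bytes(self.pool)
                               + counter.to_bytes(4, "big")).digest()
            counter += 1
        self.estimated_bits -= nbytes * 8
        return out[:nbytes]
```

The point the toy makes: an observer of extract() output alone can't see
the estimate hit zero, which is exactly why /dev/urandom looks fine even
when the pool is empty.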

And yes, you should *never* use raw measurements as random bits
without conditioning.
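Conditioning in its simplest form just means running the biased raw
samples through a strong hash before crediting any entropy.  A minimal
sketch (the helper name and 8-byte sample encoding are assumptions for
illustration):

```python
import hashlib

def condition(raw_samples):
    """Condition biased raw measurements (e.g. interrupt timings)
    through a hash.  Hypothetical helper: credit far fewer entropy
    bits to the result than the raw input length suggests."""
    raw = b"".join(s.to_bytes(8, "big") for s in raw_samples)
    # 32 bytes out, however many samples went in; the hash whitens
    # bias but cannot create entropy that wasn't in the samples.
    return hashlib.sha256(raw).digest()
```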


At 05:08 PM 7/18/99 -0700, bram wrote:
>On Sun, 18 Jul 1999, Bill Stewart wrote:
>
>> /dev/urandom will give you pseudo-random bits if it's run out of entropy,
>> so you've got the security risks inherent in that.  
>> As David Honig points out, you can't avoid those alternatives,
>
>Yes you can. If there's a 'pool' of entropy in memory which contains a
>cryptographically large number of bits, and it's both mixed and extracted
>from in a cryptographically secure way, then the need for constant
>reseeding is eliminated, although it's still helpful. The paper on Yarrow
>explains the threat model pretty well -
>http://www.counterpane.com/yarrow.html
>
>> so if you need the high quality randomness, you need hardware randomizers.
>
>Those are helpful as well, but should still never be used in the raw -
>their entropy output should be estimated conservatively and fed into a
>reseedable PRNG.
>
>-Bram
>
>
>
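Bram's last point, a conservatively-estimated entropy source feeding a
reseedable PRNG, can be sketched as follows.  This is Yarrow-flavoured
but is not the actual Yarrow design (no entropy accumulators or fast/slow
reseed thresholds); the class and method names are invented here.

```python
import hashlib

class ReseedablePRNG:
    """Sketch of a hash-based reseedable generator (illustrative only,
    not the real Yarrow construction)."""

    def __init__(self, seed: bytes):
        self.key = hashlib.sha256(seed).digest()
        self.counter = 0

    def reseed(self, conditioned_entropy: bytes):
        # Fold fresh, already-conditioned entropy into the key so that
        # a past key compromise stops mattering after the reseed.
        self.key = hashlib.sha256(self.key + conditioned_entropy).digest()

    def random_bytes(self, n: int) -> bytes:
        # Counter-mode expansion of the key via the hash.
        out = b""
        while len(out) < n:
            out += hashlib.sha256(self.key
                                  + self.counter.to_bytes(8, "big")).digest()
            self.counter += 1
        return out[:n]
```

The hardware randomizer's output would go through conditioning first,
with its entropy estimated conservatively, and only then into reseed().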
