[136056] in cryptography@c2.net mail archive


Re: combining entropy

daemon@ATHENA.MIT.EDU (Stephan Neuhaus)
Fri Oct 24 15:41:16 2008

Cc: IanG <iang@systemics.com>, Cryptography <cryptography@metzdowd.com>
From: Stephan Neuhaus <neuhaus@st.cs.uni-sb.de>
To: John Denker <jsd@av8n.com>
In-Reply-To: <4901BFB0.2080403@av8n.com>
Date: Fri, 24 Oct 2008 15:37:45 +0200


On Oct 24, 2008, at 14:29, John Denker wrote:

> On 09/29/2008 05:13 AM, IanG wrote:
>> My assumptions are:
>>
>> * I trust no single source of Random Numbers.
>> * I trust at least one source of all the sources.
>> * no particular difficulty with lossy combination.
>
>
>> If I have N pools of entropy (all same size X) and I pool them
>> together with XOR, is that as good as it gets?
>
> Yes.
>
> The second assumption suffices to prove the result,
> since (random bit) XOR (anything) is random.

Ah, but for this to hold, you also have to assume that the N pools are independent.  If they are not, you cannot guarantee even a single bit of "entropy" (whatever that is).  For example, if N = 2, your trusted source is pool 1, and I can read pool 1 and control pool 2, then I simply set pool 2 = pool 1 and all you get is zeros.  That surely does not contain X bits of "entropy" under any reasonable definition of "entropy".

Fun,

Stephan

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo@metzdowd.com
