[107837] in Cypherpunks
Re: Redundancy and Randomness [Information Theory]
daemon@ATHENA.MIT.EDU (Mok-Kong Shen)
Mon Jan 25 08:26:28 1999
Date: Mon, 25 Jan 1999 14:05:30 +0100
From: Mok-Kong Shen <mok-kong.shen@stud.uni-muenchen.de>
To: mgraffam@idsi.net
CC: cypherpunks@toad.com
Reply-To: Mok-Kong Shen <mok-kong.shen@stud.uni-muenchen.de>
mgraffam@idsi.net wrote:
>
> I've been working on some stuff lately, and it occurred to me that
> it hinges on the assumption that if a message has 0 redundancy, then it
> consists of essentially random bits.
>
> I'm not comfortable with this assumption intuitively, so I was wondering
> if anyone knows of a proof that such a message must be random, or can
> think of an example where this would not be the case.
>
> I'm thinking of compression.
>
> An ideal compression algorithm that compressed a message down to its
> essential entropy would have 0 redundancy (by definition) .. and while
> it would not be 'philosophically random' (how could a deterministic
> algorithm get randomness from ordered bits?) it seems that such a
> compression function's output would be indistinguishable from true
> randomness.
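The effect described above can be seen in miniature with an
off-the-shelf compressor. The sketch below (added for illustration;
it is not part of the original message) measures byte-level Shannon
entropy before and after zlib compression. zlib is far from an ideal
compressor, and its header plus the small sample keep the measured
entropy below the 8 bits/byte of truly uniform bytes, but the
compressed stream is clearly "more random looking" than the input:

```python
# Illustrative sketch: per-byte Shannon entropy of redundant English
# text vs. its zlib-compressed form. Sample text is arbitrary.
import math
import zlib
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (8.0 for uniform random bytes)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Highly redundant input: the same sentence repeated 50 times.
text = b"the quick brown fox jumps over the lazy dog " * 50
compressed = zlib.compress(text, 9)

print(f"plain:      {byte_entropy(text):.3f} bits/byte")
print(f"compressed: {byte_entropy(compressed):.3f} bits/byte")
# The compressed stream has markedly higher entropy per byte, i.e.
# it is statistically much closer to random than the plain text.
```

An ideal compressor would push the output all the way to the source's
essential entropy; zlib only approximates that, which is why some
structure (and hence some redundancy) survives.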
I guess there can be no (practically sensible) answer to this
question, because there is no algorithm that can prove an arbitrarily
given string has 0 redundancy, and likewise no algorithm that can
prove an arbitrarily given string is truly random.
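The impossibility claim here is essentially the uncomputability of
Kolmogorov complexity; a standard sketch of the argument, stated in
modern notation as an aside (not part of the original exchange):

```latex
% Sketch: incompressibility is not certifiable by any algorithm
% (a Berry-paradox style argument).
Let $K(x)$ denote the length of the shortest program that outputs $x$;
``zero redundancy'' for an $n$-bit string $x$ means $K(x) \ge n$.
Suppose a procedure $R$ could certify this for arbitrary strings.
Define the program $P_n$: enumerate strings of length $n$ in order and
print the first one that $R$ certifies (one exists by counting, since
fewer than $2^n$ programs are shorter than $n$ bits). But $P_n$
describes its output using only $O(\log n)$ bits, so
\[
  K(x) \le O(\log n) < n \le K(x)
\]
for large $n$, a contradiction. Hence no such $R$ exists.
```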
M. K. Shen