

system reliability -- Re: titles

daemon@ATHENA.MIT.EDU (Ed Gerck)
Sat Aug 28 22:08:43 2004

X-Original-To: cryptography@metzdowd.com
Date: Fri, 27 Aug 2004 00:12:57 -0700
From: Ed Gerck <egerck@nma.com>
To: cryptography@metzdowd.com
In-Reply-To: <3.0.5.32.20040826170735.00973100@pop.west.cox.net>
X-Rcpt-To: <cryptography@metzdowd.com>


David Honig wrote:
> "Applications can't be any more secure than their
> operating system." -Bram Cohen

That sounds cute, but I believe it is incorrect. Example: error-
correcting codes. The theory of error-correcting codes allows
information to be encoded so that it can be recovered even after
significant corruption. This allows, for example, _secret-sharing_
across multiple systems, so that no single operating system
platform has enough information, or enough power, to enable a
compromise on its own. Such an application can be much more secure
than any of the operating systems supporting it.
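
To make this concrete, here is a minimal Python sketch of XOR-based
n-of-n secret splitting (Shamir's scheme generalizes this to k-of-n
thresholds); the key material and share count are just illustrative:

import os

def split_secret(secret: bytes, n: int) -> list:
    """Split `secret` into n shares; all n are needed to recover it."""
    shares = [os.urandom(len(secret)) for _ in range(n - 1)]
    last = secret
    for s in shares:
        last = bytes(a ^ b for a, b in zip(last, s))
    shares.append(last)
    return shares

def recover_secret(shares: list) -> bytes:
    out = bytes(len(shares[0]))
    for s in shares:
        out = bytes(a ^ b for a, b in zip(out, s))
    return out

# Each share alone is uniformly random, so an attacker who fully
# compromises any single host -- however weak its OS -- learns
# nothing about the secret.
shares = split_secret(b"master key material", 3)
assert recover_secret(shares) == b"master key material"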

RAID is another example of a reliable system built out of
unreliable parts.
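
In the same spirit, a small sketch of RAID-4/5-style XOR parity,
assuming equal-sized blocks and a single failed disk:

def parity_block(blocks):
    """Compute the XOR parity of equal-sized data blocks."""
    p = bytes(len(blocks[0]))
    for b in blocks:
        p = bytes(x ^ y for x, y in zip(p, b))
    return p

data = [b"disk0 blk", b"disk1 blk", b"disk2 blk"]
parity = parity_block(data)

# Simulate losing disk 1: XOR the surviving blocks with the parity
# block to reconstruct its contents exactly.
rebuilt = parity_block([data[0], data[2], parity])
assert rebuilt == data[1]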

The human application of these same principles is well-known in
information security, and it supports the examples above. Humans
are notorious for breaking security systems. Humans are the
wetware equivalent of an operating system. A common solution for
the risk presented by humans is also _secret-sharing_: No person
may have access to classified information unless the person has
the appropriate security clearance and a need-to-know.

What this means is that the search for the "perfect" operating
system as the solution to security is backwards.

Cheers,
Ed Gerck

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to majordomo@metzdowd.com
