[7563] in bugtraq


Re: Object tag crashes Internet Explorer 4.0

daemon@ATHENA.MIT.EDU (Roger Espel Llima)
Fri Aug 7 16:49:53 1998

Mail-Followup-To: Paul Leach <paulle@MICROSOFT.COM>, BUGTRAQ@netspace.org
Date: 	Thu, 6 Aug 1998 13:49:33 -0400
Reply-To: Roger Espel Llima <espel@IAGORA.COM>
From: Roger Espel Llima <espel@IAGORA.COM>
X-To:         Paul Leach <paulle@MICROSOFT.COM>
To: BUGTRAQ@NETSPACE.ORG
In-Reply-To:  <CB6657D3A5E0D111A97700805FFE65875D73F9@red-msg-51.dns.microsoft.com>; from Paul Leach on Thu, Aug 06,
              1998 at 01:53:25AM -0700

On Thu, Aug 06, 1998 at 01:53:25AM -0700, Paul Leach wrote:
> As a result, I just didn't care about the precise problem reported, and was
> commenting on the problem of "bad" web pages in general. If we started
> examining web pages to analyze them and catch "bad" ones before we executed
> them, it is indeed true we could catch many bad ones. However, every one we
> don't catch would be a "YET ANOTHER MAJOR MS SECURITY HOLE", and the theory
> tells us we can't catch all of them. So, we're just not going to start down
> that path. If a site has pages that cause your browser to restart, don't go
> there again; set your Zones to stop you if you really want. No serious site
> has any interest in allowing such pages to exist on its site, and about all
> you lose when the browser restarts is the history list, since it's about as
> stateless as you can get in an app (except for its config data, which isn't
> lost anyway).

It's nice to see that Microsoft has such a positive attitude to security.
If a site crashes your browser, don't go there again!  Is that anything
like "If a program crashes windows, don't run it!"?

A very reasonable message on bugtraq just a few hours ago explained it
clearly enough: web page contents need to be treated as _untrusted_
user input.  Web pages MUST be assumed to be hostile, because you have
no control over who wrote them or what their intentions were.

The whole point of the WWW is that anyone (not just "trusted" sites,
which can't really be trusted anyway: big established sites have been
hacked before) can publish content, and that content is _safe_ to view,
because it's basically text, graphics, sound, etc.  Text, graphics and
sound are not supposed to be able to bite you in any way.  You don't
need to trust a book's author to read it; you can drop the book if
you're not interested, but reading it can't harm you.

Now, browsers have been extended in many ways, including support for
javascript, cookies, client pull, server push, java, vbscript, etc,
allowing the server to have some control over the client, which is fine
and extremely useful, as long as the user retains _ultimate control_.

With enough untrusted features turned off, it should be possible to
view any site safely, knowing that the worst that can happen is that
you don't like the contents or can't make sense of them, and that
nothing on the page can disrupt your session in any way.

If you need to trust a site for it to be safe to view with some
browser, and if that is part of the browser's design as you seem to
imply (as opposed to a bug, which can happen to any software and can
always be fixed), then I will NEVER even consider using that browser.

--
Roger Espel Llima, espel@llaic.u-clermont1.fr                     -o)
http://www.eleves.ens.fr:8080/home/espel/index.html               /\\
                                                                 _\_v
