[4254] in WWW Security List Archive


RE: Sceptic about (Funds Transfer w/o PIN)

daemon@ATHENA.MIT.EDU (Brian Toole)
Wed Feb 5 19:59:05 1997

From: Brian Toole <btoole@oakmanor.com>
To: WWW-SECURITY@ns2.rutgers.edu
Date: Tue, 4 Feb 1997 20:25:09 -0500
Errors-To: owner-www-security@ns2.rutgers.edu

I don't think the real issue here is the
motivation of the hack. The real issue is that
it has been demonstrated (if only as a proof
of concept) that the mere presence of a piece
of signed code does not imply that an
application is secure.

The only "trick" here is to lure the user into 
downloading the application, and in this case, having
a certificate actually helps the process, rather
than hindering it. "Oooh. It's signed, so it
is safe to use."

In the Quicken example, consider any company
which wants to market something on the web.
It's a legitimate company, actually selling products.

It hires a contractor to do its web site.

The vendor knows squat about the web, or computers
for that matter. The contractor, on the other hand,
recognizes an opportunity to collect data and
trojans data-collection routines into the web site.

"Hey Mr. Vendor, your web site sucks. How 'bout 
you let me fix it for you. I'll give you a 
good price! Look, we can even enhance your security
by using Authenticode! All you have to do is fill
out this VeriSign form..." 

The contractor makes the order form an ActiveX applet,
and signs it with the vendor's certificate.

Once it's downloaded (and under the right conditions),
it can poke around and silently send back anything that
looks interesting, any time you're online. (And it
takes orders as well...)

(Hey, doodz, I've got lots of personal data on the 
CEO of company X, anybody want to trade some warez ?)

Far-fetched? Maybe. But can you be sure?
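The scenario above can be sketched out. This is an
illustration in Python for readability (a real ActiveX
control would be native code, with the same unrestricted
access to the machine); the function names are invented
here. The point is that with no sandbox, a control that
legitimately takes orders can also quietly collect
local files, and nothing in the signing process looks
at that code path.

```python
import glob
import os

def take_order(name, item):
    """The advertised, legitimate behavior: record an order."""
    return {"customer": name, "item": item}

def snoop(home_dir, patterns=("*.txt",)):
    """The hidden behavior: quietly gather local files.

    The signature on the control attests only to who
    published it; no one inspects what it actually does.
    """
    loot = []
    for pattern in patterns:
        for path in glob.glob(os.path.join(home_dir, pattern)):
            with open(path, "rb") as f:
                loot.append((path, f.read()))
    # A real trojan would ship this out the next time
    # the user is online.
    return loot
```

Both behaviors run with the full privileges of the user,
because nothing constrains the control after the
signature check passes.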

Digital signatures (Authenticode) provide little
real assurance of a "safe" application when the
certificate is not granted on the basis of the
object being certified. This is a real problem
when the browser executes the object outside of
any constrained environment (sandbox).

To quote Microsoft KB Article Q155444,

"Digital Code Signing is a standard used to help 
ensure that content and files you download from
the Internet have come from trusted sources 
and have not been altered since their creation. 
This also provides additional protection against
computer viruses. 

...

Digital Code Signing can be used for downloadable
files and ActiveX Web page content. 

When a file or an ActiveX page is downloaded,
a check is performed to determine if the file
or content was signed by the publisher and 
whether or not the certificate used to sign 
the code is valid. You are prompted to continue
downloading the file based on the information 
that is displayed."

Some more information can be found at
http://www.microsoft.com/kb/articles/q159/8/93.htm

Notice that it does NOT guarantee the app is
secure, well written, or non-malicious; it
merely says that the app has been written
by a company that passes the rules for
obtaining a SPC (Software Publishing Certificate).

As a matter of fact, 
from (http://microsoft.com/intdev/security/misf8-f.htm)

"Based on Microsoft code signing program criteria, 
VeriSign will attempt to verify that your company 
meets a minimum financial stability level using 
ratings from Dun & Bradstreet Financial Services. 
Your certificate will indicate if you have met this level.

Some software, such as the Microsoft Internet 
Explorer 3.0, offers end-users an option to 
bypass making an explicit choice to trust code
from each new software publisher. If an end-user 
checks an option to trust all software signed by 
vendors who have met the financial criteria, 
code signed by these vendors will be run without 
any user intervention."

The last time I checked, security had little to
do with such financial criteria.

There has been a lot of discussion here about
the abdication of security policy decisions to
end-users who are not really qualified to make
those decisions, and of the coarse granularity
(site or vendor) of the access controls.

Additionally, the information presented to the 
end-user to make this choice is not sufficiently 
detailed to make an informed selection. How the
heck is an average user supposed to know whether
this "blob" of stuff will do something "bad",
based solely on the financial rating of the
company that provided it?

And as some others have mentioned, what happens
when a trustworthy vendor "loses" their certificate
somehow?

What it boils down to:

1. EDUCATE. EDUCATE. EDUCATE. End users
must be aware of what can happen. Managers
need to understand the risks inherent in
the technology they are using.

A certificate means nothing by itself. It
offers no guarantee that the application is
not malicious; it merely says that its
author is coordinated enough to get the
application signed.

In the example I cited above, the owner
of the certificate may be unaware that their
applet is malicious or trojaned.

Malicious does not imply visibility. If
all I want is information off of your machine,
I never have to do anything that makes you
aware of this, such as deleting files or
altering your data in other ways.

A malicious applet may not be considered
malicious on its own, but could play a part
in a larger coordinated attack on a site or
individual.

2. Block ActiveX from the public network.
	
I don't see how anyone charged with a corporate
security policy could let ActiveX through their
border, and still feel comfortable about it,
especially into a population of W95 or WFW 
clients.

If someone wants to debate this, I'm more than
interested in hearing ways this could be made
safe enough to do in a publicly traded company,
where the stockholders can sue the pants off 
you for not taking "prudent measures" to protect
corporate information.

3. Review all internal ActiveX components.

ActiveX can be really useful on a corporate 
intranet, where you are essentially trusting
your own developers, or components that you
have purchased and tested, to be used in an
internal information system.

That being said, you DO have to test them,
including code reviews for internally developed
software. Disgruntled employees and cheesy
contractors could do a lot of damage. This
is nothing new, and we've always had to do this 
with mission critical apps. 

However, just because an ActiveX control is
designed to do one thing doesn't mean it
can't also do something else on the side,
or have unintended side effects.

--Brian






