
Re: Unix links subverting Web security

daemon@ATHENA.MIT.EDU (John Franks)
Thu Nov 2 17:59:43 1995

From: John Franks <john@math.nwu.edu>
To: zhul@cs.uregina.ca (Lianyi Zhu)
Date: Thu, 2 Nov 1995 07:54:17 -0600 (CST)
Cc: boyken@cs.uiowa.edu, www-security@ns2.rutgers.edu,
        lstein@genome.wi.mit.edu
In-Reply-To: <Pine.SGI.3.91.951101081938.10275A-100000@HERCULES.CS.UREGINA.CA> from "Lianyi Zhu" at Nov 1, 95 08:24:41 am
Errors-To: owner-www-security@ns2.rutgers.edu

  >>> Don't forget that remote users can view .htaccess with ease just by
  >>> asking for the URL!
  >>>
  >>>         http://your-site/.htaccess
  >>
  >> No, you have 2 different directories for documents (def: htdocs) and
  >> conf (def: conf)  -  at least with ncsa-httpd and derivatives
  >
  > Yes, this is the better way to do it, but a lot of people use the
  > alternate per-directory file method.
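
A quick way to check your own server for this is simply to request the
file.  Here is a minimal sketch (Python; the host name is a placeholder):

        # Hypothetical self-test: ask the server for /.htaccess directly
        # and see whether it answers 200 instead of 403/404.
        import http.client

        conn = http.client.HTTPConnection("your-site")
        conn.request("GET", "/.htaccess")
        resp = conn.getresponse()
        print(resp.status, resp.reason)
        if resp.status == 200:
            print(resp.read().decode("latin-1"))  # your access rules, exposed
        conn.close()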

The .htaccess exposure is really only a special case of a general
problem (at least I would consider it a problem): most servers grant
access by default to any file in the document hierarchy, so many files
which shouldn't be accessible often are.  When you see a URL like

	http://host/path/whatever.cgi

you can also try to retrieve http://host/path/whatever.cgi~
My guess is that this will often succeed in producing a (slightly old)
copy of the CGI source.  The problem is even worse if the server is set
up to display an index of every directory.  Then any file the maintainer
accidentally leaves around is accessible.
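
Probing for this sort of leftover is easy to script.  A minimal sketch
(Python; the URL is the placeholder above, and the suffix list is my own
guess at common editor backup names):

        # Hypothetical probe for backup copies left beside a CGI script.
        import urllib.request
        import urllib.error

        base = "http://host/path/whatever.cgi"
        for suffix in ("~", ".bak", ".orig"):
            url = base + suffix
            try:
                with urllib.request.urlopen(url) as resp:
                    print(url, "->", resp.status)   # 200: source is exposed
            except urllib.error.HTTPError as err:
                print(url, "->", err.code)          # 403/404 is what you want
            except urllib.error.URLError as err:
                print(url, "->", err.reason)        # host unreachable, etc.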

For a server whose default is to deny access, check out WN.

	http://hopf.math.nwu.edu
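
The principle is that nothing is served unless it has been explicitly
made servable.  A toy sketch of deny-by-default lookup (Python; the
allow list and paths are made up, and this is not WN's actual code):

        import os

        SERVABLE = {"/index.html", "/paper.html"}   # explicitly listed files

        def resolve(url_path, docroot="/usr/local/htdocs"):
            if url_path not in SERVABLE:            # default answer: deny
                return None
            full = os.path.normpath(os.path.join(docroot, url_path.lstrip("/")))
            if not full.startswith(docroot + os.sep):  # refuse ".." escapes
                return None
            return full

        print(resolve("/index.html"))      # served: it was listed
        print(resolve("/whatever.cgi~"))   # denied: never listed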



John Franks
