Web Server Security
CS 6204 - Java and the WWW
Marc Abrams
References:
- Lincoln D. Stein, How to Set Up and Maintain a WWW Site, Addison-Wesley, 1995, Ch. 3, 4
- Nancy J. Yeager and Robert E. McGrath, Web Server Technology, Morgan Kaufmann, 1996, Ch. 7
- The World Wide Web Security FAQ
Web Assets and Risks
Discussion summarizes Yeager and McGrath, section 7.1.
Assets to Protect
- Web documents - some are confidential
- Web browser, server, and proxy machine resources (CPU, memory, ...)
- Access to Web-accessible executable programs (i.e., scripts)
- Network bandwidth
Risks
Vulnerable points in the Web:
- Network access points (local area networks, dial-up lines, the Internet itself)
- Configuration and design of the operating system on browser and server machines
- Configuration and design of browsers and servers
Common Web Security Mechanisms
- OS and network security mechanisms (outside this course)
- Proper Web server configuration
- Authentication and authorization mechanisms for Web service
- Logging and monitoring
- Firewalls
- "Sandbox" model for Java applets
- "Taint" option for Perl scripts
- Physical isolation of an intranet
Threats to Internet Security
- Cracking passwords
- "Sniffing" passwords by eavesdropping on networks (particularly Ethernet) - possibly the number one threat to Internet hosts today
- "Spoofing" hosts
Threats to Web Server Software
Compare Web security to UNIX sendmail security.
sendmail...
- grants access to everyone
- is one large program of tens of thousands of lines
- is complex to configure; mistakes open security holes
- must run with superuser (root) privilege
In comparison, Web servers...
- grant access by default to all; access restrictions are optional. So Web servers are vulnerable to denial-of-service attacks due to a flood of requests from an attacker.
- are large, new, and untested programs. Servers can be 60,000 lines of code or larger, surely with undiscovered bugs. Example: NCSA's Web server was susceptible to an old Internet bug: several UNIX C library routines don't check whether they overrun memory buffer areas, permitting a sophisticated programmer to put instructions outside the routine's memory area, which are later executed. This was the basis of the "Internet worm" in 1988, which exploited flaws in the finger server daemon.
- are complex to configure. For example, some sites put the perl language interpreter in the cgi-bin directory, allowing any Internet user to execute an arbitrary perl program on the server!
- require root access at least to connect to port 80. Sometimes Web administrators let their servers run with root privilege all the time, violating the "principle of least privilege." CGI-bin programs executed then had root privilege by default! Instead, servers should start with root privilege but then change to user nobody before servicing document requests from browsers.
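The start-as-root, then drop-to-nobody pattern can be sketched in Python. This is a hypothetical illustration (servers such as NCSA httpd implement the same idea in C around their accept loop); the function name drop_privileges is mine:

```python
import os
import pwd

def drop_privileges(username="nobody"):
    """Switch the process to an unprivileged user after binding port 80.

    Returns True if privileges were dropped, False if the process was
    not running as root (so there was nothing to drop).
    """
    if os.geteuid() != 0:
        return False
    entry = pwd.getpwnam(username)
    os.setgid(entry.pw_gid)   # drop the group id first, while still root
    os.setuid(entry.pw_uid)   # after this, root privilege is gone for good
    return True

print(drop_privileges())
```

Note the ordering: the group id must be dropped before the user id, because once setuid succeeds the process no longer has the privilege to call setgid.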
There is one additional problem in Web servers, with no parallel in sendmail: every Web script (e.g., a script that invokes external commands, perhaps through a system call) on the server is a potential security hole.
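To illustrate why such scripts are dangerous, here is a hypothetical sketch in Python (the finger command and the form-field value are assumptions, not from the notes): interpolating user input into a shell command line lets an attacker smuggle in a second command, while quoting the input, or better, passing an argument vector, keeps it a single harmless argument.

```python
import shlex

user_input = "guest; rm -rf /"   # a malicious value from a Web form field

# UNSAFE: everything after ";" would run as a second shell command.
unsafe_command = "finger " + user_input

# SAFER: quote the value so the shell treats it as one argument...
quoted_command = "finger " + shlex.quote(user_input)

# ...or best, skip the shell entirely and pass an argument vector
# (e.g., to subprocess.run), so no shell metacharacters are interpreted.
argv = ["finger", user_input]

print(shlex.split(quoted_command))   # the whole input stays one argument
```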
Securing a Web Server
- Carefully read the configuration details for the server that you are installing.
- Turn off all non-essential features (e.g., server-side includes).
- Limit access to cgi-bin scripts. Ideally, all programs should be kept in one area. (Users hate this, but it's a big potential security hole.)
- Isolate system services: use separate servers for separate services - Web, mail, WAIS, etc. Then a security breach is (possibly) limited to one service.
- On UNIX systems, run the Web daemon under chroot, which changes the root directory for the Web process and its children. The real root "/" directory is then inaccessible, and the Web server runs in its own protected environment.
- Monitor Web server access logs and logs of login attempts, looking for suspicious access patterns.
- Mount Web documents as read-only. For example, you could NFS-mount a remote file system on the Web server as read-only. Only the remote machine allows writes to the files, so document authors don't use the Web server. However, NFS mounting will reduce performance.
Restricting Access to Documents
Web servers use access control lists to enforce authorization to
use Web documents.
In NCSA's httpd, a user can create a .htaccess file to control access:
<Limit GET>
order deny,allow
deny from all
allow from .cs.vt.edu
allow from .ee.vt.edu
allow from 128.173.40.105
</Limit>
The .htaccess file above allows access only from hostnames that end in .cs.vt.edu or .ee.vt.edu, or from the host with IP address 128.173.40.105.
The lines are processed in order of appearance. Thus the "deny from all" first prohibits access from any Internet host, and the subsequent "allow from" lines override the "deny from all."
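The order deny,allow evaluation above can be sketched as follows. This is a simplified model of the matching, not httpd's actual code; it assumes suffix matching for domain patterns and exact matching for IP addresses:

```python
DENY = ["all"]
ALLOW = [".cs.vt.edu", ".ee.vt.edu", "128.173.40.105"]

def matches(client, pattern):
    """A client matches 'all', a domain suffix, or an exact address."""
    return pattern == "all" or client.endswith(pattern) or client == pattern

def allowed(client):
    # order deny,allow: apply the deny rules first, then let any
    # matching allow rule override, mirroring in-order processing.
    ok = not any(matches(client, p) for p in DENY)
    if any(matches(client, p) for p in ALLOW):
        ok = True
    return ok

print(allowed("csgrad.cs.vt.edu"))   # True
print(allowed("evil.example.com"))   # False
```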
How Password Protected Web Pages Work
NCSA's httpd uses basic authentication: a simple password is used for authentication, and document access is restricted according to user name, group membership, or user agent IP address. Example (for NCSA httpd, in the .htaccess file for a directory) [from Stein, p. 141]:
AuthName Saturn
AuthType Basic
AuthUserFile /usr/local/etc/httpd/conf/passwd
AuthGroupFile /usr/local/etc/httpd/conf/group
<Limit GET>
require user huey dewey louie
require group web-maintainers
</Limit>
The .htaccess file above says that docs in the directory containing the .htaccess file can only be accessed:
- by users huey, dewey, and louie if they are in group web-maintainers (defined in /usr/local/etc/httpd/conf/group)
- if a valid password is given, as listed in file /usr/local/etc/httpd/conf/passwd
When a GET arrives for a document with restricted access:
- The server looks for an Authorization field. If absent, it returns code 401 (Unauthorized). (See p. 294 in Yeager and McGrath for an example.)
- The browser will then prompt the user for a user name and password. The browser then base64-encodes "user:password" and issues a new GET for the document with the Authorization header field set to the encoded characters.
- The Web server receives the GET and base64-decodes "user:password". The server returns the document or, if user:password is illegal, code 403 (Forbidden).
- The browser reuses the password each time the page is re-requested during the browser session, so the user need not enter user id and password every time the page is redisplayed.
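The encode/decode round trip above can be sketched with Python's base64 module (the user name and password here are made-up values). Note that this is encoding, not encryption, which is exactly why a network sniffer can recover the password:

```python
import base64

user, password = "huey", "secret"

# Browser side: encode "user:password" and attach it to the request.
token = base64.b64encode(f"{user}:{password}".encode("ascii")).decode("ascii")
header = "Authorization: Basic " + token

# Server side: take the token after the scheme, decode it, and split
# on the first colon (passwords may themselves contain colons).
decoded = base64.b64decode(header.split()[-1]).decode("ascii")
name, _, pw = decoded.partition(":")

print(header)       # Authorization: Basic aHVleTpzZWNyZXQ=
print(name, pw)     # huey secret
```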
Password Files:
The Web server maintains its own password file, separate from the host password file. For example, in NCSA's httpd, use
htpasswd [-c] password_file user
to create a password file (if -c is present) or to add a user to the password file. The command prompts you for the password. The htpasswd program is in httpd's support directory.
Example:
% htpasswd -c /usr/local/etc/httpd/conf/passwd dave
Adding password for dave.
New password: *****
Re-type new password: *****
Doing the above produces in file /usr/local/etc/httpd/conf/passwd something like:
dave:NVA3234NIITjij
The password is transformed into the string "NVA3234NIITjij" by applying the Unix function crypt, declared in unistd.h (e.g., /usr/include/unistd.h).
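The htpasswd behavior can be sketched as follows. Python's access to crypt(3) varies by platform, so this sketch substitutes a truncated SHA-256 digest purely as a stand-in for the crypt transformation; the part being illustrated is the file layout, one user:hash pair per line:

```python
import hashlib

def add_user(lines, user, password):
    """Append a user:hash line, mimicking htpasswd's file format.

    NOTE: real NCSA htpasswd calls the Unix crypt() function; the
    SHA-256 digest here is only a placeholder for that transformation.
    """
    hashed = hashlib.sha256(password.encode("utf-8")).hexdigest()[:13]
    return lines + [f"{user}:{hashed}"]

passwd_lines = add_user([], "dave", "secret")
print(passwd_lines[0])   # dave:<13-character hash>
```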
Authentication Types (AuthType field):
The above is basic authentication.
There are alternatives to Basic authentication that use cryptography, so a sniffer cannot capture an encoded password and reuse it. (One is public key encryption. Another is to use passwords that are good for only one use, along with a card that the user possesses.)
Group Files:
A group file is created with a text editor. Each line consists of a group name, followed by a colon, followed by a comma-separated list of user names:
web-maintainers: sally,fred
admin: otto,loise
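Parsing this group-file format is straightforward; a minimal sketch (the helper name parse_groups is mine, not part of httpd):

```python
def parse_groups(text):
    """Map each group name to its list of users.

    Each non-blank line has the form 'group: user1,user2,...'
    per the format described above.
    """
    groups = {}
    for line in text.splitlines():
        if not line.strip():
            continue
        name, _, members = line.partition(":")
        groups[name.strip()] = [u.strip() for u in members.split(",") if u.strip()]
    return groups

sample = "web-maintainers: sally,fred\nadmin: otto,loise"
print(parse_groups(sample)["web-maintainers"])   # ['sally', 'fred']
```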
Last modified on 30 September 1999.
Send comments to abrams@vt.edu.
[This is http://ei.cs.vt.edu/~jwww/courseNotes/server-sec.html.]