Approach to Hardening Web Servers
Attack surface reduction at a system level
Publicly accessible web servers receive requests from both legitimate and malicious users. It is important to distinguish between the two and take appropriate action: process only the requests that should be processed, and serve only the content that should be served.
In simpler terms, there are two distinct types of attacks against web servers: one is targeted at a specific organization or domain, while the other is non-targeted and attempts to compromise any publicly accessible web server. A targeted attack might compromise the hosted web application rather than the server software itself.
In a non-targeted attack, the attack vector typically involves exploiting known vulnerabilities and misconfigurations, along with openly available information about the web server software. Hardening the server makes it much more difficult for the attacker to compromise the entire system and limits the progression of an attack.
At a high level, hardening is about limiting the capabilities of the web server and the operating system. The web server may have features that are not relevant to the deployment and can be turned off; this can mean modifying the default configuration files, the loaded modules, the permissions of files and directories, and so on. On the operating system side, there will be tools installed and services enabled by default that are not required in a production environment. They should be removed or disabled.
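As an illustration of tightening file and directory permissions, the following sketch sets directories to mode 750 and files to mode 640, so that an unprivileged web server group can read but never modify content. A temporary directory stands in for a real document root such as /srv/www, which is itself only an example path:

```shell
# Demo: restrictive permissions on a web document root.
# $DOCROOT is a stand-in for the real document root (e.g. /srv/www);
# in production you would also chown the tree to root:www-data or similar.
DOCROOT="$(mktemp -d)"

mkdir -p "$DOCROOT/html"
echo '<h1>hello</h1>' > "$DOCROOT/html/index.html"

# Directories: owner rwx, group rx, others nothing.
find "$DOCROOT" -type d -exec chmod 750 {} +
# Files: owner rw, group r, others nothing.
find "$DOCROOT" -type f -exec chmod 640 {} +

stat -c '%a %n' "$DOCROOT/html" "$DOCROOT/html/index.html"
```

The same idea extends to the server's configuration files, which the web server user should be able to read but not write.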
Hardening web servers will not, by itself, protect an organization from targeted attacks. However, it is the first step towards that goal and will protect the web server from non-targeted attacks.
One reason to restrict the capabilities of the operating system is to deny the attacker the ability to execute custom code, create a reverse shell, or escalate privileges. Note that once the web server process is exploited, other tools on the system can be abused in the next phase of the attack; hence it is important to remove all tools and services that are not required.
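A dry-run sketch of auditing for unneeded services is shown below. The service names are examples only; review the output of `systemctl list-unit-files --state=enabled` on your own system and disable units with `systemctl disable --now <unit>`:

```shell
# Dry run: report services that are often unnecessary on a dedicated
# web server. This only prints candidates; it disables nothing.
CANDIDATES="cups.service avahi-daemon.service bluetooth.service"

COUNT=0
for svc in $CANDIDATES; do
    echo "candidate for removal: $svc"
    COUNT=$((COUNT + 1))
done
```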
Make sure to run the web server process as a restricted user with limited access to the system. Typically, the restricted user needs write access to log files and, by default, also has write access to temporary directories. To prevent executables from being run out of temporary directories, those directories can be mounted with the noexec and nosuid options. This does not stop scripts from being run through an interpreter, though; there is ongoing kernel work to control the execution of file contents, so that may become possible in the future.
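For example, /tmp can be mounted as a tmpfs with these options via a line in /etc/fstab (the size limit here is an arbitrary example):

```
# /etc/fstab: mount /tmp without execute, setuid, or device-file semantics
tmpfs  /tmp  tmpfs  defaults,noexec,nosuid,nodev,size=512m  0  0
```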
However, the best way to isolate the process is by changing its root directory and using kernel namespaces, mount and user namespaces in particular. While the mount namespace changes the process's view of filesystem mount points, the user namespace maps all UIDs inside the namespace, including 0 (root), to a range of unprivileged UIDs outside it. Both can be achieved with the systemd parameters RootDirectory=, MountAPIVFS=, and PrivateUsers=.
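A sketch of how these parameters fit into a systemd service unit follows. The binary path and root directory are placeholders, and the last three directives are additional restrictions commonly combined with them:

```ini
# Fragment of a hardened service unit (paths are examples).
# The directory given to RootDirectory= must contain the minimal
# filesystem tree the server needs.
[Service]
ExecStart=/usr/sbin/my-web-server
RootDirectory=/srv/webroot
MountAPIVFS=yes
PrivateUsers=yes

# Commonly combined restrictions:
NoNewPrivileges=yes
ProtectSystem=strict
PrivateTmp=yes
```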
Like any software, web server software is constantly updated with new features, bug fixes, and security fixes. To make the attacker's task harder, it is therefore important not to leak any information about the type or version of the web server software in use: without that information, it is not practical for an attacker to attempt every known exploit against the server.
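For example, nginx can be told not to advertise its version in response headers and error pages (the Apache httpd equivalents are ServerTokens Prod and ServerSignature Off):

```nginx
# Do not emit the nginx version in the Server header or on error pages
server_tokens off;
```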
Most web server software is modular and supports loadable or statically compiled modules. By loading only the required modules, the attack surface is reduced significantly. Combined with a tailored configuration, such as disabling directory listing, unnecessary file serving, welcome messages, default error documents, and so on, this greatly reduces the chance of compromise.
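The configuration side can be sketched with a minimal nginx server block; the hostname and paths are placeholders:

```nginx
# Sketch: serve only explicitly existing files, with listings disabled.
server {
    listen 443 ssl;
    server_name example.org;

    root /srv/www/html;
    autoindex off;                  # no directory listings
    server_tokens off;              # no version in headers/error pages

    location / {
        try_files $uri $uri/ =404;  # serve only files that exist
    }

    # Never serve dotfiles such as .git or .htaccess
    location ~ /\. {
        deny all;
    }
}
```

The module side depends on the server: nginx modules are chosen largely at build time, while Apache httpd modules are enabled or disabled per deployment via LoadModule directives.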
Scanning and Testing
Even with a hardened server, it is important to test the defenses periodically. A vulnerability scanning system can check whether the web servers are vulnerable, misconfigured, or outdated, while regular penetration testing can help identify gaps in the defenses.
However, vulnerability scanners have significant weaknesses. Generally, they identify only surface vulnerabilities and are unable to assess the overall risk level of a scanned server. And although the scan process itself is highly automated, vulnerability scanners can have a high false positive rate (reporting vulnerabilities where none exist).
Unfortunately, there is no single method that protects web servers entirely. At 0th Root, we provide a solution to secure an organization's internal websites and applications using TLS client certificates. This approach has significant advantages and implements a strong authentication mechanism based on public/private key pairs.
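In nginx, for instance, client-certificate authentication for an internal site can be sketched as follows (the hostname and certificate paths are placeholders; this is an illustration of the general technique, not of the 0th Root product itself):

```nginx
# Sketch: require a TLS client certificate for an internal site.
server {
    listen 443 ssl;
    server_name internal.example.org;

    ssl_certificate         /etc/ssl/server.crt;
    ssl_certificate_key     /etc/ssl/server.key;

    # Only clients presenting a certificate signed by this CA
    # complete the TLS handshake successfully.
    ssl_client_certificate  /etc/ssl/client-ca.crt;
    ssl_verify_client       on;
}
```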