Guide to the Secure Configuration of Red Hat Enterprise Linux 7
with profile Upstream DISA STIG for Red Hat Ansible Tower
This is a *draft* profile for STIG. This profile is being developed under the DoD consensus model to become a STIG in coordination with DISA FSO.
The SCAP content used in this guide is provided by the scap-security-guide package, which is developed at https://www.open-scap.org/security-policies/scap-security-guide.
Providing system administrators with such guidance informs them how to securely configure systems under their control in a variety of network roles. Policy makers and baseline creators can use this catalog of settings, with its associated references to higher-level security control catalogs, in order to assist them in security baseline creation.
This guide is a catalog, not a checklist, and satisfaction of every item is not likely to be possible or sensible in many operational scenarios. However, the XCCDF format enables granular selection and adjustment of settings, and their association with OVAL and OCIL content provides an automated checking capability. Transformations of this document, and its associated automated checking content, are capable of providing baselines that meet a diverse set of policy objectives.
Some example XCCDF Profiles, which are selections of items that form checklists and can be used as baselines, are available with this guide. They can be processed, in an automated fashion, with tools that support the Security Content Automation Protocol (SCAP). The DISA STIG for Red Hat Enterprise Linux 7, which provides required settings for US Department of Defense systems, is one example of a baseline created from this guidance.
This benchmark is a direct port of a SCAP Security Guide benchmark developed for Red Hat Enterprise Linux. It has been modified through an automated process to remove specific dependencies on Red Hat Enterprise Linux and to function with Scientific Linux. The result is a generally useful SCAP Security Guide benchmark with the following caveats:
- Scientific Linux is not an exact copy of Red Hat Enterprise Linux. Scientific Linux is a Linux distribution produced by Fermi National Accelerator Laboratory. It is a free and open source operating system based on Red Hat Enterprise Linux and aims to be "as close to the commercial enterprise distribution as we can get it." There may be configuration differences that produce false positives and/or false negatives. If this occurs, please file a bug report.
- Scientific Linux is derived from the free and open source software made available by Red Hat, but it is not produced, maintained, or supported by Red Hat. Scientific Linux has its own build system, compiler options, and patch sets, and it is a community-supported, non-commercial operating system. Scientific Linux does not inherit certifications or evaluations from Red Hat Enterprise Linux. As such, some configuration rules (such as those requiring FIPS 140-2 encryption) will continue to fail on Scientific Linux.
Members of the Scientific Linux community are invited to participate in OpenSCAP and SCAP Security Guide development. Bug reports and patches can be sent to GitHub: https://github.com/OpenSCAP/scap-security-guide. The mailing list is at https://fedorahosted.org/mailman/listinfo/scap-security-guide.
Profile Title | Upstream DISA STIG for Red Hat Ansible Tower |
---|---|
Profile ID | xccdf_org.ssgproject.content_profile_stig-ansible-tower-upstream |
Revision History
Current version: 0.1.38
- draft (as of 2018-03-05)
Platforms
- cpe:/o:redhat:enterprise_linux:7
- cpe:/o:scientificlinux:scientificlinux:7
- cpe:/o:redhat:enterprise_linux:7::client
- cpe:/o:redhat:enterprise_linux:7::computenode
Table of Contents
Checklist
contains 26 rules |
Services [ref]group
The best protection against vulnerable software is running less software. This section describes how to review
the software which Red Hat Enterprise Linux 7 installs on a system and disable software which is not needed. It
then enumerates the software packages installed on a default Red Hat Enterprise Linux 7 system and provides guidance about which
ones can be safely disabled.
|
contains 26 rules |
Web Server [ref]group
The web server is responsible for providing access to content via the HTTP protocol. Web servers represent a significant security risk because the HTTP port is commonly probed by malicious sources, web server software is complex and has a long history of vulnerabilities, and the HTTP protocol is unencrypted and vulnerable to passive monitoring.
The system's default web server software is Apache 2 and is provided in the RPM package httpd. |
contains 26 rules |
Secure Apache Configuration [ref]group
The Apache configuration file on Red Hat Enterprise Linux 7 is /etc/httpd/conf/httpd.conf; the recommendations in the remainder of this section apply to this file and to the configuration files included from /etc/httpd/conf.d/ and /etc/httpd/conf.modules.d/. |
contains 26 rules |
Minimize Web Server Loadable Modules [ref]group
A default installation of httpd loads a number of dynamically shared modules, each enabled by a LoadModule directive. After removing a module's LoadModule directive, test the configuration with:
$ sudo service httpd configtest
The purpose of each of the modules loaded by default will now be addressed one at a time. If none of a module's directives are being used, remove it. |
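To see which modules are currently loaded before deciding what to remove, a command such as the following can be used (a sketch; httpd -M lists both static and shared modules):
$ sudo httpd -M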
contains 1 rule |
httpd Core Modules [ref]group
These modules comprise a basic subset of modules that are likely needed for base httpd functionality and should remain enabled:
LoadModule auth_basic_module modules/mod_auth_basic.so
LoadModule authn_default_module modules/mod_authn_default.so
LoadModule authz_host_module modules/mod_authz_host.so
LoadModule authz_user_module modules/mod_authz_user.so
LoadModule authz_groupfile_module modules/mod_authz_groupfile.so
LoadModule authz_default_module modules/mod_authz_default.so
LoadModule log_config_module modules/mod_log_config.so
LoadModule logio_module modules/mod_logio.so
LoadModule setenvif_module modules/mod_setenvif.so
LoadModule mime_module modules/mod_mime.so
LoadModule autoindex_module modules/mod_autoindex.so
LoadModule negotiation_module modules/mod_negotiation.so
LoadModule dir_module modules/mod_dir.so
LoadModule alias_module modules/mod_alias.so
Minimizing the number of loadable modules available to the web server reduces risk by limiting the capabilities allowed by the web server. |
contains 1 rule |
Enable log_config_module For HTTPD Logging [ref]rule
The log_config_module provides the logging capability for httpd and must remain loaded in the web server configuration. Rationale: The access and error logs are a major tool for examining web site use, attempted use, unusual conditions, and problems. In the event of a security incident, these logs can provide the SA and the web manager with valuable information. Without these log files, SAs and web managers are seriously hindered in their efforts to respond appropriately to suspicious or criminal actions targeted at the web site. Severity: medium |
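To confirm the module is referenced in the configuration, a recursive grep can be used; if it is missing, a LoadModule directive such as the following re-enables it (the paths are assumptions based on the default RHEL 7 httpd layout):
$ grep -r log_config_module /etc/httpd/
LoadModule log_config_module modules/mod_log_config.so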
Directory Restrictions [ref]group
The Directory tags in the web server configuration file allow finer-grained access control for a specified directory. All web directories should be configured on a case-by-case basis, allowing access only where needed. |
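As an illustration only, the following sketch denies access by default and grants it only to the document root; the paths and access policies are assumptions to be adapted per site:
<Directory "/">
    AllowOverride None
    Require all denied
</Directory>
<Directory "/var/www/html">
    Options None
    AllowOverride None
    Require all granted
</Directory>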
contains 4 rules |
Disable Anonymous FTP Access [ref]rule
If any directories that contain dynamic scripts can be accessed via FTP by any group or user that does not require access, remove permissions to such directories that allow anonymous access. Also, ensure that any such access employs an encrypted connection. Rationale: The directories containing the CGI scripts, such as PERL, must not be accessible to anonymous users via FTP. This applies to all directories that contain scripts that can dynamically produce web pages in an interactive manner (i.e., scripts based upon user-provided input). Such scripts contain information that could be used to compromise a web service, access system resources, or deface a web site. Severity: medium |
Ignore HTTPD .htaccess Files [ref]rule
Set AllowOverride to None within each Directory directive so that .htaccess files are ignored. Rationale: CGI scripts represent one of the most common and exploitable means of compromising a web server. By definition, CGI scripts are executable by the operating system of the host server. While access control is provided via the web service, the execution of CGI programs is not otherwise limited unless the SA or Web Manager takes specific measures. CGI programs can access and alter data files, launch other programs, and use the network. CGI programs can be written in any available programming language; C, PERL, PHP, Javascript, VBScript, and shell (sh, ksh, bash) are popular choices. Severity: medium |
Web Content Directories Must Not Be Shared Anonymously [ref]rule
Web content directories should not be shared anonymously over remote filesystems such as nfs or smb. Rationale: Sharing web content is a security risk when a web server is involved. Users accessing the share anonymously could gain privileged access to the content of such directories. Network-sharable directories expose those directories and their contents to unnecessary access. Any unnecessary exposure increases the risk that someone could exploit that access and either compromise the web content or cause web server performance problems. Severity: medium |
Remove Write Permissions From Filesystem Paths And Server Scripts [ref]rule
Configure permissions for each served filesystem path and script directory by running the following commands:
$ sudo find DIR -type d -exec chmod 755 {} \;
$ sudo find DIR -type f -exec chmod 555 {} \;
Where DIR matches the paths from Alias, ScriptAlias, and ScriptAliasMatch.
Rationale: Excessive permissions for the anonymous web user account are one of the most common faults contributing to the compromise of a web server. If this user is able to upload and execute files on the web server, the organization or owner of the server will no longer have control of the asset. Severity: high |
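For example, assuming the default RHEL 7 ScriptAlias target of /var/www/cgi-bin (an illustrative path only), the commands would be:
$ sudo find /var/www/cgi-bin -type d -exec chmod 755 {} \;
$ sudo find /var/www/cgi-bin -type f -exec chmod 555 {} \;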
Use Appropriate Modules to Improve httpd's Security [ref]group
Among the modules available for httpd are several whose use can improve the security of the web server installation. This section recommends and discusses the deployment of such security-relevant modules. |
contains 3 rules |
Deploy mod_ssl [ref]group
Because HTTP is a plain text protocol, all traffic is susceptible to passive
monitoring. If there is a need for confidentiality, SSL should be configured
and enabled to encrypt content.
|
contains 3 rules |
Enable Transport Layer Security (TLS) Encryption [ref]rule
Disable old SSL and TLS versions and enable the latest TLS encryption by setting the following in the mod_ssl configuration:
SSLProtocol all -SSLv2 -SSLv3 -TLSv1 -TLSv1.1
Make sure to also set SSLEngine to on in /etc/httpd/conf.modules.d/ssl.conf like the following:
SSLEngine on
Rationale: Transport Layer Security (TLS) encryption is a required security setting for a private web server. Encryption of private information is essential to ensuring data confidentiality. If private information is not encrypted, it can be intercepted and easily read by an unauthorized party. A web server must use a FIPS 140-2 approved TLS version, and all non-FIPS-approved SSL versions must be disabled. Severity: medium |
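A minimal sketch of a TLS-enabled virtual host; the certificate and key paths shown are the RHEL 7 self-signed defaults and would be replaced with the site's organizationally issued certificate:
<VirtualHost *:443>
    SSLEngine on
    SSLProtocol all -SSLv2 -SSLv3 -TLSv1 -TLSv1.1
    SSLCertificateFile /etc/pki/tls/certs/localhost.crt
    SSLCertificateKeyFile /etc/pki/tls/private/localhost.key
</VirtualHost>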
Require Client Certificates [ref]rule
Configure the web server to require client certificates by setting the following in the mod_ssl configuration:
SSLVerifyClient require
Rationale: Web sites requiring authentication within the DoD must utilize PKI as an authentication mechanism for web users. Information systems residing behind web servers requiring authorization based on individual identity must use the identity provided by certificate-based authentication to support access control decisions. Severity: medium |
Configure A Valid Server Certificate [ref]rule
Configure the web site to use a valid, organizationally defined certificate. For DoD, this is a DoD server certificate issued by the DoD CA. Rationale: This check verifies that the hosted web site's certificate was issued by a DoD CA, i.e., that the server certificate used by the organization being reviewed is a DoD-issued certificate. This is used to verify the authenticity of the web site to the user. If the certificate is not for the server (Certificate belongs to), if the certificate is not issued by DoD (Certificate was issued by), or if the current date is not included in the valid date range (Certificate is valid from), then there is no assurance that the use of the certificate is valid. The entire purpose of using a certificate is, therefore, compromised. Severity: medium |
Configure PERL Securely [ref]group
PERL (Practical Extraction and Report Language) is an interpreted language optimized for scanning arbitrary text files, extracting information from those text files, and printing reports based on that information. The language is often used in shell scripting and is intended to be a practical, easy to use, and efficient means of generating interactive web pages for the user. |
contains 1 rule |
Configure HTTP PERL Scripts To Use TAINT Option [ref]rule
If the mod_perl module is installed, enable PERL taint checking by adding the following to the mod_perl configuration:
PerlSwitches -T
Rationale: PERL (Practical Extraction and Report Language) is an interpreted language optimized for scanning arbitrary text files, extracting information from those text files, and printing reports based on that information. The language is often used in shell scripting and is intended to be a practical, easy to use, and efficient means of generating interactive web pages for the user. Unfortunately, many widely available freeware PERL programs (scripts) are extremely insecure. This is most readily exploited by a malicious user substituting input to a PERL script during a POST or a GET operation.
Severity: medium |
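A minimal sketch of enabling taint checking, assuming mod_perl is installed and its configuration lives at /etc/httpd/conf.d/perl.conf (the path is an assumption based on the RHEL mod_perl package):
LoadModule perl_module modules/mod_perl.so
PerlSwitches -T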
Configure Operating System to Protect Web Server [ref]group
The following configuration steps should be taken on the system which hosts the web server, in order to provide as safe an environment as possible for the web server. |
contains 5 rules |
Restrict File and Directory Access [ref]group
Minimize access to critical httpd files and directories. |
contains 2 rules |
Set Permissions on the /var/log/httpd/ Directory [ref]rule
Ensure that the permissions on the web server log directory are set to 700:
$ sudo chmod 700 /var/log/httpd/
This is its default setting. Rationale: The access and error logs are a major tool for examining web site use, attempted use, unusual conditions, and problems. In the event of a security incident, these logs can provide the SA and the web manager with valuable information. To ensure the integrity of the log files and protect the SA and the web manager from a conflict of interest related to the maintenance of these files, only the members of the Auditors group will be granted permissions to move, copy, and delete these files in the course of their duties related to the archiving of these files. Severity: medium References: CM-7 |
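To verify the current mode, a command such as the following can be used (a sketch; the expected output begins with 700):
$ stat -c '%a %n' /var/log/httpd/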
HTTPD Log Files Must Be Owned By Root [ref]rule
All httpd log files must be owned by root. To properly set the owner of /var/log/httpd, run the command:
$ sudo chown root /var/log/httpd
To properly set the owner of /var/log/httpd/*, run the command:
$ sudo chown root /var/log/httpd/*
Rationale: The access and error logs are a major tool for examining web site use, attempted use, unusual conditions, and problems. In the event of a security incident, these logs can provide the SA and the web administrator with valuable information. Because of the information that is captured in the logs, it is critical that only authorized individuals have access to the logs. Severity: medium |
Ensure Remote Administrative Access Is Encrypted [ref]rule
Ensure that the SSH server service is enabled. The sshd service can be enabled with the following command:
$ sudo systemctl enable sshd.service
Rationale: Logging into a web server remotely using an unencrypted protocol or service when performing updates and maintenance is a major risk. Data, such as user account credentials, is transmitted in plaintext and can easily be compromised. When performing remote administrative tasks, a protocol or service that encrypts the communication channel must be used.
Severity: high |
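The check above covers enabling the service at boot; if it also needs to be started immediately, the following companion command (not part of the original check text) can be used:
$ sudo systemctl start sshd.service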
Scan All Uploaded Content for Malicious Software [ref]rule
Install anti-virus software on the system and set it to automatically scan new files that are introduced to the web server. Rationale: Remote web authors should not be able to upload files to the DocumentRoot directory structure without virus checking and checking for malicious or mobile code. A remote web user, whose agency has a Memorandum of Agreement (MOA) with the hosting agency and has submitted a DoD form 2875 (System Authorization Access Request (SAAR)) or an equivalent document, will be allowed to post files to a temporary location on the server. All posted files to this temporary location will be scanned for viruses and content checked for malicious or mobile code. Only files free of viruses and malicious or mobile code will be posted to the appropriate DocumentRoot directory. Severity: medium |
Configure firewalld to Allow Access to the Web Server [ref]rule
By default, firewalld may block remote access to the ports used by the web server; configure firewalld to permit access to the HTTP (and, if used, HTTPS) service. Rationale: Failure to comply with DoD ports, protocols, and services (PPS) requirements can result in compromise of enclave boundary protections and/or functionality of the AIS. Severity: low |
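A minimal sketch of permitting web traffic with firewalld, assuming the default zone is in use and that both HTTP and HTTPS are served:
$ sudo firewall-cmd --permanent --add-service=http
$ sudo firewall-cmd --permanent --add-service=https
$ sudo firewall-cmd --reload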
Configure HTTPD-Served Web Content Securely [ref]group
Running httpd inside a chroot jail isolates the web server process so that it can only access files within a designated directory tree. This can be configured with the following directive:
ChrootDir /chroot/apache
This necessitates placing all files required by httpd inside /chroot/apache, including httpd's binaries, modules, configuration files, and served web pages. The details of this configuration are beyond the scope of this guide. This may also require additional SELinux configuration. |
contains 7 rules |
Each Web Content Directory Must Contain An index.html File [ref]rule
Every web content directory must contain an index.html file or an equivalent default document. Rationale: The goal is to completely control the web user's experience in navigating any portion of the web document root directories. Ensuring all web content directories have at least the equivalent of an index.html file is a significant factor to accomplish this end. Also, enumeration techniques, such as URL parameter manipulation, rely upon being able to obtain information about the web server's directory structure by locating directories with default pages. This practice helps ensure that the anonymous web user will not obtain directory browsing information or an error message that reveals the server type and version. Severity: low |
Ensure Web Content Located on Separate Partition [ref]rule
The web content directory should reside on its own partition, separate from system files. Rationale: Application partitioning enables an additional security measure by securing user traffic under one security context, while managing system and application files under another. Web content is accessible to the anonymous web user. For such an account to have access to system files of any type is a major, avoidable security risk. Failure to partition the system files from the web site documents increases the risk of attack via directory traversal and can impede web site availability due to drive space exhaustion. Severity: medium |
Encrypt All File Uploads [ref]rule
Use only secure encrypted logons and connections for uploading files to the web site. Rationale: Logging in to a web server via an unencrypted protocol or service, to upload documents to the web site, is a risk if proper encryption is not utilized to protect the data being transmitted. An encrypted protocol or service must be used for remote access to web administration tasks. Severity: high |
Configure A Banner Page For Each Website [ref]rule
Configure a login banner for each website when authentication is required for user access. Rationale: A consent banner will be in place to make prospective entrants aware that the website they are about to enter is a DoD web site and their activity is subject to monitoring. The document, DoDI 8500.01, establishes the policy on the use of DoD information systems. It requires the use of a standard Notice and Consent Banner and standard text to be included in user agreements. The requirement for the banner is for websites with security and access controls. These are restricted and not publicly accessible. If the website does not require authentication/authorization for use, then the banner does not need to be present. A manual check of the document root directory for a banner page file (such as banner.html) or navigation to the website via a browser can be used to confirm the information provided from interviewing the web staff. Severity: low |
The robots.txt Files Must Not Exist [ref]rule
Remove any robots.txt files from the web content directories:
$ sudo rm -f path/to/robots.txt
Rationale: Search engines are constantly at work on the Internet. Search engines are augmented by agents, often referred to as spiders or bots, which endeavor to capture and catalog web-site content. In turn, these search engines make the content they obtain and catalog available to any public web user.
Severity: medium |
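To locate any robots.txt files under the served content, a command like the following can be used (a sketch assuming content under /var/www):
$ sudo find /var/www -type f -name robots.txt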
Disable Web Content Symbolic Links [ref]rule
For each <Directory> instance in the web server configuration, remove the FollowSymLinks option so that symbolic links are not followed. If symbolic links are allowed, the following can be added for each <Directory> instance:
Options SymLinksIfOwnerMatch
Rationale: A symbolic link allows a file or a directory to be referenced using a symbolic name, raising a potential hazard if symbolic linkage is made to a sensitive area. When web scripts are executed and symbolic links are allowed, the web user could be allowed to access locations on the web server that are outside the scope of the web document root or home directory. Severity: high |
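An illustrative <Directory> block with symbolic link following removed; the path is an assumption to be adapted to the site's document root:
<Directory "/var/www/html">
    Options -FollowSymLinks
</Directory>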
Remove .java And .jpp Files [ref]rule
Remove any .java and .jpp files from the web server content directories. Rationale: From the source code in a .java or a .jpp file, the Java compiler produces a binary file with an extension of .class. The .java or .jpp file would, therefore, reveal sensitive information regarding an application's logic and permissions to resources on the server. By contrast, the .class file, because it is intended to be machine independent, is referred to as bytecode. Bytecodes are run by the Java Virtual Machine (JVM), or the Java Runtime Environment (JRE), via a browser configured to permit Java code. Severity: low |
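To locate such files under the served content, a command like the following can be used (a sketch assuming content under /var/www):
$ sudo find /var/www -type f \( -name '*.java' -o -name '*.jpp' \)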
Enable HTTPD Error Logging [ref]rule
Set the ErrorLog directive in the httpd configuration, for example:
ErrorLog "logs/error_log"
Rationale: The server error logs are invaluable because they can also be used to identify potential problems and enable proactive remediation. Log data can reveal anomalous behavior such as "not found" or "unauthorized" errors that may be evidence of attack attempts. Failure to enable error logging can significantly reduce the ability of Web Administrators to detect or remediate problems. Severity: medium |
Configure Error Log Format [ref]rule
Set the LogFormat directive in the httpd configuration, for example:
LogFormat "a %A %h %H %l %m %s %t %u %U \"%{Referer}i\" \"%{User-Agent}i\"" combined
Rationale: The server error logs are invaluable because they can also be used to identify potential problems and enable proactive remediation. Log data can reveal anomalous behavior such as "not found" or "unauthorized" errors that may be evidence of attack attempts. Failure to enable error logging can significantly reduce the ability of Web Administrators to detect or remediate problems. The LogFormat directive defines the format and information to be included in the access log entries. Severity: medium |
Enable HTTPD System Logging [ref]rule
Set the CustomLog directive in the httpd configuration, for example:
CustomLog "logs/access_log" combined
Rationale: The server error logs are invaluable because they can also be used to identify potential problems and enable proactive remediation. Log data can reveal anomalous behavior such as "not found" or "unauthorized" errors that may be evidence of attack attempts. Failure to enable error logging can significantly reduce the ability of Web Administrators to detect or remediate problems. The CustomLog directive specifies the log file, syslog facility, or piped logging utility. Severity: medium |
Enable HTTPD LogLevel [ref]rule
Set the LogLevel directive in the httpd configuration, for example:
LogLevel warn
Rationale: The server error logs are invaluable because they can also be used to identify potential problems and enable proactive remediation. Log data can reveal anomalous behavior such as "not found" or "unauthorized" errors that may be evidence of attack attempts. Failure to enable error logging can significantly reduce the ability of Web Administrators to detect or remediate problems. While the ErrorLog directive configures the error log file name, the LogLevel directive is used to configure the severity level for the error logs. The log level values are the standard syslog levels: emerg, alert, crit, error, warn, notice, info, and debug. Severity: medium |
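Taken together, a minimal logging configuration sketch using the directives from the rules above; the relative paths assume the default RHEL 7 ServerRoot of /etc/httpd, where logs/ is a symlink to /var/log/httpd:
ErrorLog "logs/error_log"
LogLevel warn
LogFormat "a %A %h %H %l %m %s %t %u %U \"%{Referer}i\" \"%{User-Agent}i\"" combined
CustomLog "logs/access_log" combined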
Configure The Number of Allowed Simultaneous Requests [ref]rule
The MaxKeepAliveRequests directive should be set and configured in the httpd configuration, for example:
MaxKeepAliveRequests 100
Rationale: Resource exhaustion can occur when an unlimited number of concurrent requests are allowed on a web site, facilitating a denial of service attack. Mitigating this kind of attack will include limiting the number of concurrent HTTP/HTTPS requests per IP address and may include, where feasible, limiting parameter values associated with keepalive (i.e., a parameter used to limit the amount of time a connection may be inactive). Severity: medium |