HTTPS management at the MBB platform

Context

I manage many web servers, installed in many different ways.

In this article, I will describe how I deploy, renew and expand Let's Encrypt certificates without knowing how these servers have been installed. Then, I will explain how I scan my web servers from another machine.

Deployment and renewals

After deploying a web server with a configuration management tool (here we are using SaltStack) or any other tool (e.g. Ansible), I deploy the great certbot tool from the EFF.

This tool is also deployed with my configuration management tool.

Here is the script I use to look up the web service, the document root and the server names: letsencrypt.sh.

sudo wget -O /usr/local/sbin/letsencrypt.sh https://gist.githubusercontent.com/remyd1/35fdbcb740fa4fdec1a15727ad7e743c/raw/b3925aef5b7f9d356c233f3ef88d9c4ba76ba6be/letsencrypt.sh
sudo chmod +x /usr/local/sbin/letsencrypt.sh

This script is mainly used for renewals (in a weekly cron job) and for expanding my certs. I use other scripts for the initial ACME challenge. However, this script can also be used for deployment (the initial request is a subpart of this script):

sudo bash letsencrypt.sh initial

Without any option, it will try to renew the cert (you can add it to a cron job). The multidomains option will try to expand the cert by searching for the ServerName, ServerAlias or server_name keywords.
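For instance (the multidomains keyword comes from the script itself; the cron schedule below is only an illustrative choice for the weekly renewal mentioned above):

# try to renew the cert (default behaviour)
sudo bash letsencrypt.sh
# try to expand the cert with the server names found in the configs
sudo bash letsencrypt.sh multidomains

# a weekly root crontab entry for the renewal could then be:
# 0 4 * * 0 /usr/local/sbin/letsencrypt.sh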

As you can see, I check for Apache and Nginx, but also GitLab, Docker and lighttpd. However, those last parts still need to be improved.

Nevertheless, the path to your Let's Encrypt certificates in your web server configuration needs to be edited. This can be done using vi, nano or your favorite editor. It may eventually be optimized later, to be automatically modified using grep and sed (in order to check the SSL lines and modify them if necessary).

Anyway, I am still doing it manually, and the SSL modules need to be loaded:

For instance, for Apache, most of the time it needs something like:

a2enmod ssl

# point the default SSL vhost at the Let's Encrypt live directory
CERTPATH="/etc/letsencrypt/live/"`hostname -f`
sed -ri "s@SSLCertificateFile\s+.+@SSLCertificateFile ${CERTPATH}/fullchain.pem@" /etc/apache2/sites-available/default-ssl.conf
sed -ri "s@SSLCertificateKeyFile\s+.+@SSLCertificateKeyFile ${CERTPATH}/privkey.pem@" /etc/apache2/sites-available/default-ssl.conf

# enable the SSL site, check the config and reload gracefully
a2ensite default-ssl
apache2ctl configtest
apache2ctl graceful
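
For Nginx, which the script also checks, a similar edit could look like the following sketch (it assumes the ssl_certificate directives live in /etc/nginx/sites-available/default; adjust the path to your layout):

CERTPATH="/etc/letsencrypt/live/"`hostname -f`
# rewrite the certificate and key directives in the vhost file
sed -ri "s@ssl_certificate\s+.+@ssl_certificate ${CERTPATH}/fullchain.pem;@" /etc/nginx/sites-available/default
sed -ri "s@ssl_certificate_key\s+.+@ssl_certificate_key ${CERTPATH}/privkey.pem;@" /etc/nginx/sites-available/default
# validate and reload
nginx -t && service nginx reload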

Note [update 2020/04/20]: I found another letsencrypt.sh script which allows you to deploy certificates (here is a French documentation on how to use it). However, it does not scan your install in advance, unlike my script. Nevertheless, you can decide which domain names you need by editing the code, or execute a dry run.

Checking certificates, website URLs and HTTP responses

I use a collection of scripts which are launched through cron. These scripts can be retrieved from this repository.

This project serves 3 main goals: checking the HTTP responses of a list of URLs, detecting website defacement through checksums, and monitoring the HTTPS certificates. To install it:

cd /opt
sudo git clone https://gitlab.mbb.univ-montp2.fr/remy/website_checks

# create a basic bash script /usr/local/sbin/check_websites.sh
cat <<EOF | sudo tee /usr/local/sbin/check_websites.sh
#!/bin/bash

LOG_DIR="/var/log/webreports"
CURMONTH=\`date '+%Y%m'\`
CURDAY=\`date '+%d'\`
DATE=\`date '+%Y%m%d_%H%M'\`
SUBDIR=\$CURMONTH"/"\$CURDAY
mkdir -p \$LOG_DIR/\$SUBDIR

bash /opt/website_checks/check_urls.sh check 2>/dev/null 1>&2

cp /opt/website_checks/workdir/checksums.json \$LOG_DIR/\$SUBDIR/"\$DATE"_websites_checksums.json
cp /opt/website_checks/workdir/status.json \$LOG_DIR/\$SUBDIR/"\$DATE"_websites_status.json
# uncomment the following to keep a copy of each scan
# (backticks escaped so the date is evaluated at run time, not when this file is generated)
#cp -R /opt/website_checks/workdir /opt/website_checks/\`date '+%Y%m%d_%H%M'\`_workdir

# to generate host_https_list.txt using Salt, for minions that already have an 'https' role grain:
# salt -G 'roles:https' cmd.run 'hostname -f' --out=yaml |awk '{print \$2}'

bash /opt/website_checks/check_certs.sh > \$LOG_DIR/\$SUBDIR/"\$DATE"_websites_certs.json

EOF

sudo chmod +x /usr/local/sbin/check_websites.sh
# now edit host_https_list.txt and url_list.txt to fit your needs
cd /opt/website_checks
sudo nano url_list.txt
sudo nano host_https_list.txt
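# both files are plain lists, one entry per line; hypothetical examples:
#   url_list.txt:        https://www.example.org/mypage
#   host_https_list.txt: www.example.org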

# then init your workdir for the website URL checks
sudo bash check_urls.sh init

# and make a backup of your workdir
sudo cp -R workdir `date '+%Y%m%d'`_workdir
# it will be useful later

# we can then launch the script to check that everything works as expected
sudo /usr/local/sbin/check_websites.sh

# now you have your web reports in json format, in /var/log/webreports
ls -lla /var/log/webreports/*/*

sudo crontab -e
# edit your crontab to add
*/30 * * * * /usr/local/sbin/check_websites.sh

sudo service cron reload

Then you can, for instance, analyse your reports using a JSON library, with a PHP or Python display tool.
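
For a quick look from the shell, something like this works too (a sketch, assuming jq is installed; it simply pretty-prints the most recent status report):

jq '.' `ls -t /var/log/webreports/*/*/*_websites_status.json | head -1`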

For website defacement analysis, it uses sha256sum checksums. If there is a modification compared with the previous scan, the Linux diff tool will display the website that changed.

cd /opt/website_checks/
# to check any modifications since the last run:
bash check_urls.sh compare
# to check any modifications since the first initialization:
bash check_urls.sh compare2init

It is also recommended to keep backups of previous runs:

cd /opt/website_checks
sudo cp -R workdir `date '+%Y%m%d'`_workdir

Then, you can launch a new scan and a recursive diff:

sudo /usr/local/sbin/check_websites.sh
diff -rq workdir `date '+%Y%m%d'`_workdir
# on a specific file, ignoring whitespace changes (-Ebw)
diff -Ebw workdir/xxx.index.html `date '+%Y%m%d'`_workdir/xxx.index.html

For dynamic websites, you can add some regexps in the check_urls.sh bash script. These regexps are used by sed to delete the matching lines before computing the new checksum.
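
The idea, roughly sketched (the URL and the regexp below are hypothetical; the real filtering happens inside check_urls.sh):

# drop a dynamic line (e.g. a generated timestamp) before hashing,
# so it does not trigger a false defacement alert
curl -s https://www.example.org/ | sed '/Generated on/d' | sha256sum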