Automating Database Tasks Across Heterogeneous Database Environments

Bhanu Reddy
7 min read · May 24, 2023
Multiple types of Databases.

OBJECTIVE:

If your organization utilizes multiple types of databases such as MSSQL, MySQL, Postgres, and more, managing security, user creation, user tracking, activity monitoring, and user modification or deletion can become complex and challenging. In this blog, we will delve into an effective, automation-driven approach to these scenarios, streamlining the management of diverse databases and ensuring efficient and secure operations.

OVERVIEW:

To streamline these tasks in a diverse database environment, the following high-level steps need to be implemented.

  1. Always create any DB in a private subnet and use a bastion host to connect to it.
  2. Disallow users and applications from connecting to the DB directly, whether via VPN or locally.
  3. All DB connections should go through the bastion host only, and the bastion host itself should be reached over VPN.
  4. Use one single bastion host to connect to all types of DBs (RDS, self-hosted, etc.).
  5. Ensure the bastion host has connectivity to all the DBs.
  6. Use EC2 user credentials to perform SSH tunneling from any DB tool, such as DBeaver.
  7. On the bastion host, keep separate shell scripts to create users in the different DB instances (RDS, self-hosted DBs, etc.).
  8. Create one EC2 user per person (dev, QA, etc.) on the bastion host to perform SSH tunneling to all the DBs they need to access.
  9. Implement monitoring on both the EC2 users on the bastion host and the database users.
  10. Have a Jenkins job disable the EC2 user and DB users in case of any bad queries.
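Item 6 above, SSH tunneling from a desktop DB tool, boils down to a single port-forwarding command. Here is a minimal sketch; the host names, user, and local port are placeholder assumptions, and DBeaver builds the same tunnel internally from its SSH tab:

```shell
# Forward a local port through the bastion host to a private DB endpoint.
# All names below are placeholders -- substitute your own.
LOCAL_PORT=15432
DB_HOST="pg.internal.orgname.com"       # private DB endpoint (assumption)
DB_PORT=5432
BASTION="devuser@bastion.orgname.com"   # EC2 user on the bastion (assumption)

# Printed rather than executed so it can be reviewed and copy-pasted;
# -N opens the tunnel without running a remote command.
TUNNEL_CMD="ssh -N -L ${LOCAL_PORT}:${DB_HOST}:${DB_PORT} ${BASTION}"
echo "$TUNNEL_CMD"
```

Once the tunnel is up, the DB tool connects to localhost:15432 as if it were the private endpoint.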

Here’s a step-by-step guide to implementing the automation for databases:

Architectural Diagram

STEP 1: Create an EC2 instance in the public subnets and enforce VPN to access it.

Choose the EC2 instance type based on the number of users and the number of databases in your organization. Ensure HA and resiliency for the bastion host, and make sure you don't run out of CPU and memory as this approach scales.

Go with the EC2 Graviton instance types to get the best performance at the best price. Here are some sizing suggestions based on the number of users and databases.

  1. t4g.medium (2 vCPUs and 4 GB memory) for 50–100 users and 1–2 DBs.
  2. t4g.xlarge (4 vCPUs and 16 GB memory) for 100–200 users and 3–5 DBs.
  3. t4g.2xlarge (8 vCPUs and 32 GB memory) for 200–400 users and 6–10 DBs.

Ensure users can connect to the bastion host via VPN only by whitelisting the VPN IPs on the SSH port (22). Set up connectivity from the bastion host to all the DBs in the private subnets by allowing the respective DB ports from the bastion host's security group in each DB security group. Do not add any rule to the DB security groups that allows direct access to the DB locally or over the VPN. If the DBs live in a different VPC than the bastion host, you will need a VPC peering connection between the two.
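The security-group wiring described above can be sketched with the AWS CLI. The group IDs and VPN CIDR below are placeholder assumptions, and the commands are printed rather than executed so they can be reviewed first:

```shell
BASTION_SG="sg-0bastion0000000"   # bastion host security group (assumption)
DB_SG="sg-0db0000000000000"       # database security group (assumption)
VPN_CIDR="10.8.0.0/16"            # VPN client CIDR (assumption)

# Allow the bastion's security group to reach the DB port (5432 shown here)
RULE_DB="aws ec2 authorize-security-group-ingress --group-id $DB_SG --protocol tcp --port 5432 --source-group $BASTION_SG"

# Allow SSH (22) to the bastion only from the VPN CIDR
RULE_SSH="aws ec2 authorize-security-group-ingress --group-id $BASTION_SG --protocol tcp --port 22 --cidr $VPN_CIDR"

echo "$RULE_DB"
echo "$RULE_SSH"
```

Using `--source-group` instead of a CIDR is what keeps the rule pinned to the bastion host rather than an IP range.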

STEP 2: Install postfix on the bastion host and configure it to send emails using SES-SMTP.

NOTE: You need to create an SES SMTP user in your AWS account, in the region your company operates in. The credentials will look like this:

IAM User Name,SMTP_USERNAME,SMTP_PASSWORD
"ses-smtp-user.1234567898",JKIKSHDHONIVIRAT,BCoqJg72sCKJcVBJKHjgBofB28f2stjjR45+HitMan
  1. Connect to your EC2 instance using SSH.
  2. Install Postfix by running the following command:
sudo apt-get install postfix

3. During the installation process, you will be prompted to configure Postfix. Select “Internet Site” as the configuration type and enter the fully qualified domain name (FQDN) of your server when prompted.

4. Once Postfix is installed, edit the main configuration file using the following command:

sudo nano /etc/postfix/main.cf

5. Add the following lines to the end of the file:

relayhost = email-smtp.ap-south-1.amazonaws.com:587
smtp_sasl_auth_enable = yes
smtp_sasl_security_options = noanonymous
smtp_sasl_password_maps = hash:/etc/postfix/sasl_passwd
smtp_use_tls = yes
smtp_tls_security_level = encrypt
tls_random_source = dev:/dev/urandom

6. Save and close the file.
7. Create a file called sasl_passwd using the following command:

sudo nano /etc/postfix/sasl_passwd

8. Add the following line to the file, replacing “SMTP_USERNAME” with your SES SMTP username and “SMTP_PASSWORD” with your SES SMTP password:

email-smtp.ap-south-1.amazonaws.com:587 SMTP_USERNAME:SMTP_PASSWORD

9. Save and close the file.
10. Hash the sasl_passwd file using the following command:

sudo postmap /etc/postfix/sasl_passwd

11. Restart Postfix using the following command:

sudo systemctl restart postfix

12. You should now be able to send emails from your EC2 instance using SES-SMTP. You can test this by sending an email using the mail command:

echo "This is a test email" | mail -s "Test Email" -a "From: devops@orgname.com" bhanu.prathap@orgname.com

STEP 3: Write scripts to create EC2 users and DB users, and send credentials via email.

Here is the automation to create hundreds of users at once and send out the credentials via email. For each person it creates an EC2 user (used for SSH tunneling to the DB) and a DB user, reading usernames and email addresses from a text file and generating random passwords inside the script itself, mapped to the respective users. Please change the DB commands according to your needs; the ones below are just an example.

#!/bin/bash

# Path to the file containing the list of usernames
USER_FILE="/home/ubuntu/userlist.txt"

# Each line of userlist.txt is: username,username@orgname.com

# Set the email subject
EMAIL_SUBJECT="Your EC2 and DB credentials have been created"

# Declare an associative array to store the passwords and email for each user
declare -A USER_PASSWORDS

# Loop through the lines in the file
while IFS=',' read -r USERNAME EMAIL
do
    # Create the EC2 user with the given username
    sudo useradd -m "${USERNAME}"

    # Generate a random password for the EC2 credentials
    EC2_PASSWORD=$(openssl rand -base64 12)

    # Set the EC2 user's password to the generated password
    echo "${USERNAME}:${EC2_PASSWORD}" | sudo chpasswd

    # Generate a random password for the DB credentials
    DB_PASSWORD=$(openssl rand -base64 12)

    # Create a new DB user with the given username and random password
    # (CockroachDB's client syntax shown here; adapt for your engine)
    sudo cockroach sql --certs-dir=/home/ubuntu/certs/ --host=database.orgname.com --port=26257 \
        --execute "CREATE USER ${USERNAME} WITH PASSWORD '${DB_PASSWORD}' VALID UNTIL '10/07/2023';"

    # Grant the read_write_role to the user
    sudo cockroach sql --certs-dir=/home/ubuntu/certs/ --host=database.orgname.com --port=26257 \
        --execute "GRANT read_write_role TO ${USERNAME};"

    # Store the EC2 and DB passwords and email for this user in the associative array
    USER_PASSWORDS[$USERNAME]="$EC2_PASSWORD,$DB_PASSWORD,$EMAIL"

    # Force the EC2 user to change their password during their first login
    sudo passwd -e "${USERNAME}"

    # Set the EC2 user password expiry to 90 days
    sudo chage -M 90 "${USERNAME}"
done < "$USER_FILE"

# Send an email to each user with their corresponding passwords
for USERNAME in "${!USER_PASSWORDS[@]}"
do
    PASSWORDS_EMAIL="${USER_PASSWORDS[$USERNAME]}"
    EC2_PASSWORD=$(echo "$PASSWORDS_EMAIL" | cut -d',' -f1)
    DB_PASSWORD=$(echo "$PASSWORDS_EMAIL" | cut -d',' -f2)
    EMAIL=$(echo "$PASSWORDS_EMAIL" | cut -d',' -f3)

    # Build the email body with the username and passwords
    EMAIL_BODY="Dear user,\n\nYour new EC2 and DB credentials for DB SSH tunneling have been created. Please use the following credentials to log in:\n\nEC2 Username: ${USERNAME}\nEC2 Password: ${EC2_PASSWORD}\nDB Username: ${USERNAME}\nDB Password: ${DB_PASSWORD}\n\n"

    # Send the email to the user's email address
    echo -e "${EMAIL_BODY}" | mail -a "From: devops@orgname.com" -s "${EMAIL_SUBJECT}" "${EMAIL}"
done

# Remove the userlist.txt file for security reasons
sudo rm "$USER_FILE"
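As a usage sketch, the script above expects each line of userlist.txt to hold a `username,email` pair. This standalone snippet (the names in it are made up) demonstrates the same comma-split parsing the script relies on, which is handy for validating the file before the real run:

```shell
# Build a sample userlist.txt in a temp file (contents are placeholders)
USER_FILE=$(mktemp)
cat > "$USER_FILE" <<'LIST'
bhanu,bhanu@orgname.com
priya,priya@orgname.com
LIST

# Dry-run the same comma-split loop the creation script uses
PREVIEW=$(while IFS=',' read -r USERNAME EMAIL; do
    echo "would create user '${USERNAME}' and mail credentials to ${EMAIL}"
done < "$USER_FILE")

echo "$PREVIEW"
rm -f "$USER_FILE"
```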

STEP 4: Create a Jenkins job to disable the user in case of bad queries.

Here is the automation to cancel all the bad queries of a specific user and disable that user's SSH-tunneling access to the DB. Create a Jenkins job and make sure the Jenkins instance has connectivity to the bastion host server. Place the shell script below in the Jenkins job and modify it to your needs to disable a user who is running bad queries that impact DB CPU and memory. Please change the DB commands according to your needs; the ones below are just an example.

#!/bin/bash

# This script takes its input from the Jenkins build parameters:
#   $1 = execution flag ("L" to lock/disable, "U" to unlock/enable)
#   $2 = username
#   $3 = database ("selfhosted" or "rds")
execution=$1
username=$2
database=$3

echo "Logging into DB jump server"

cd /home/ubuntu

# The heredoc is unquoted, so $execution, $username, and $database are
# expanded on the Jenkins side before the commands run on the bastion.
ssh ubuntu@10.20.30.40 << EOF

if [[ "$execution" == "L" ]]; then
    if [[ "$database" == "selfhosted" ]]; then
        echo "List of all queries running by the DB user: $username"
        # EX: sudo cockroach sql --certs-dir=/home/ubuntu/certs/ --host=selfhosted.orgname.com --port=26257 --execute "SELECT * FROM [SHOW CLUSTER QUERIES] WHERE user_name = '$username';"

        echo "Killing all the queries of the DB user: $username"
        # sudo cockroach sql --certs-dir=/home/ubuntu/certs/ --host=selfhosted.orgname.com --port=26257 --execute "CANCEL QUERIES SELECT query_id FROM [SHOW CLUSTER QUERIES] WHERE user_name = '$username';"
    elif [[ "$database" == "rds" ]]; then
        echo "List of all queries running by the DB user: $username"
        # Execute RDS-specific commands here
    else
        echo "Unsupported database: $database"
        exit 1
    fi

    echo "Disabling the EC2 user: $username"
    # -L locks the password; -e 1 expires the account immediately
    sudo usermod -L -e 1 $username

    echo "Stopping all the processes owned by the EC2 user: $username"
    sudo killall -STOP -u $username

    echo "Killing all the processes owned by the EC2 user: $username"
    sudo killall -KILL -u $username
else
    echo "Enabling the user: $username"
    # -U unlocks the password; -e -1 clears the account expiry
    sudo usermod -U -e -1 $username

    echo "Now the user can run queries on the DB"
fi

echo "done"

EOF
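The lock/unlock logic above hinges on two usermod invocations. This small helper (the function name is made up for illustration) makes the mapping explicit; it prints the command instead of running it, so no root access is needed to try it:

```shell
# Map the Jenkins "execution" parameter to the usermod action:
#   L -> lock the password and expire the account immediately
#   U -> unlock the password and clear the expiry
disable_or_enable() {
    local execution="$1" username="$2"
    if [[ "$execution" == "L" ]]; then
        echo "usermod -L -e 1 ${username}"
    else
        echo "usermod -U -e -1 ${username}"
    fi
}

disable_or_enable L bhanu   # prints: usermod -L -e 1 bhanu
disable_or_enable U bhanu   # prints: usermod -U -e -1 bhanu
```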

CONCLUSION:

To enhance monitoring and alerting, it is crucial to set up proactive monitoring that identifies and addresses bad queries in the database. By configuring monitoring tools such as Datadog or Prometheus, you can track query performance, execution times, and resource utilization. When an alert is triggered by a potentially problematic query, it is essential to receive timely notifications through channels like Slack or email, so you can promptly investigate and validate the query from the DB console or with query analysis tools.

It is important to note that this approach is primarily intended for users like developers, analytics teams, and QA personnel who require direct database access for troubleshooting or optimization. It is strongly discouraged to grant this level of access to application users, which are typically used by microservices to communicate with the database. Application users should follow a more secure and controlled approach, such as dedicated API endpoints or service accounts, ensuring better security and separation of concerns.
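As a hedged starting point for the query-level monitoring mentioned above, here is the kind of check you might schedule from the bastion host. The SQL targets PostgreSQL's pg_stat_activity view; the host name, user, and the 5-minute threshold are assumptions, and other engines need their own equivalent:

```shell
# SQL to surface queries that have been running longer than 5 minutes
SQL="SELECT pid, usename, now() - query_start AS runtime, query
FROM pg_stat_activity
WHERE state = 'active' AND now() - query_start > interval '5 minutes';"

# From the bastion host you might run (host/user are placeholders):
#   psql --host=database.orgname.com --username=monitor --command "$SQL"
echo "$SQL"
```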
