#lessonslearned


Current #Warhammer40K #LessonsLearned
(Pinned post that will be updated)

1. Don't deploy too aggressively
2. Watch the activation order of units (Immolator => Castigator => Paragons/rest)
3. Don't forget units' special rules (e.g. Litanies of Faith)
4. Focus fire on heavy units like Terminators
5. Don't forget to gain Miracle dice

Continued thread

#Warhammer40K #LessonsLearned

6. Surprisingly, Celestians can be worth something, but only with a Hospitaller. They blocked Deathwing Knights, one of the most dangerous melee units, and according to UnitCrunch simulations they're more durable than Mortifiers.

Customers locked out, transactions missing, and a month of chaos. 🤯 BNF Bank's recent system update in Malta turned into a nightmare. The article analyzes the critical missteps, from potentially inadequate preparation to poor communication, offering vital lessons for anyone in tech or finance.

Find out what happened: medium.com/@chribonn/bnf-banks

Medium · BNF Bank’s Gone Wrong System Update, by Alan C. Bonnici

Okay, so I wanted to share a little incident from a few months back that really hammered home the power of knowing your Linux internals when things go sideways. I got a frantic call, "something weird is going on with our build server, it's acting sluggish and our monitoring is throwing odd network alerts." No fancy EDR on this particular box, just the usual ssh and bash. My heart always sinks a little when it's a Linux box with vague symptoms, because you know it's time to get your hands dirty.

First thing I did, even before reaching for any specific logs, was to get a quick snapshot of the network. Instead of netstat, which honestly feels a bit dated now, I immediately hit ss -tunap. That -p is crucial because it ties each connection to its owning process. What immediately jumped out was an outbound TCP connection on a high port to a sketchy-looking IP, tied to a process that definitely shouldn't have been making external calls. My gut tightened. I quickly followed up with lsof -i just to be super sure no deleted binaries were clinging on to network connections.
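That first network snapshot boils down to a couple of commands. A minimal sketch, assuming iproute2's ss is present and lsof is installed (it isn't always):

```shell
# Snapshot all TCP/UDP sockets: -t TCP, -u UDP, -n numeric addresses,
# -a all states, -p owning process (run as root to see every PID).
ss -tunap

# Narrow to established TCP sessions only - this is where an active
# outbound connection to a suspect IP will show up.
ss -tnp state established

# Cross-check with lsof: it also surfaces processes whose on-disk
# binary has been deleted but which are still running and holding sockets.
command -v lsof >/dev/null 2>&1 && lsof -nP -i || true
```

From there, grepping the ss output for the suspect IP isolates the socket line, and the users:(...) field on that line gives you the process name and PID to chase.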

With that IP and PID in hand, I moved to process investigation. pstree -ap was my next stop. It showed the suspicious process, and more importantly, its parent. It wasn't a child of systemd or a normal service. It was spawned by a build script that shouldn't have been executing anything like this. That hierarchical view was key. Then, to really understand what this thing was doing, I dared to strace -p <PID>. Watching the system calls unfurl was like watching a movie of its malicious intent: it was reading from /etc/passwd, making connect() calls, and trying to write to some odd /tmp directories. Simultaneously, I checked ls -l /proc/<PID>/exe to confirm the actual binary path (it was indeed in /tmp) and /proc/<PID>/cwd to see its working directory. No doubt, this was a rogue process.
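The process-side checks can be sketched like this. The demo runs against the current shell's own PID ($$) as a stand-in for the suspect PID, since obviously the real one is long gone; pstree comes from the psmisc package, so ps is used here as a portable fallback:

```shell
pid=$$   # stand-in for the suspect PID

# Ancestry: who spawned it? (pstree -ap <pid> gives the same view as a tree.)
ps -o pid,ppid,user,comm -p "$pid"

# The binary actually backing the PID; the link target shows "(deleted)"
# if the file was removed after launch.
ls -l "/proc/$pid/exe"

# Its current working directory.
ls -l "/proc/$pid/cwd"

# Attach a live syscall trace (needs ptrace permission; noisy, so filter):
# strace -f -e trace=network,file -p "$pid"
```

Everything under /proc/<PID>/ is read straight from the kernel, which is why it's trustworthy even when the on-disk binary has been tampered with or deleted.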

Knowing it was a fresh infection, I immediately shifted to the filesystem. My go-to is always find / -type f -newermt '2 days ago' -print0 | xargs -0 ls -latr. This quickly pulls up any files modified in the last 48 hours, sorted by modification time. It's often where you find dropped payloads, modified configuration files, or suspicious scripts. Sure enough, there were a few more binaries in /tmp and even a suspicious .sh script in a developer's home directory. I also scanned for SUID/SGID binaries with find / -perm /6000 just in case they'd dropped something for privilege escalation. And while stat's timestamps can be tampered with, I always glance at atime, mtime, and ctime on suspicious files; sometimes, a subtle mismatch offers a tiny clue if the attacker wasn't meticulous.
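Here is that filesystem sweep in runnable form. It demos against a scratch directory with a planted file so it's safe to paste; on a real box you'd point find at / (adding -xdev to stay on one filesystem):

```shell
demo=$(mktemp -d)                     # scratch dir standing in for /
touch "$demo/dropped_payload.sh"      # planted file for the demo

# Everything modified in the last 48 hours, oldest first, full metadata.
find "$demo" -type f -newermt '2 days ago' -print0 | xargs -0 ls -latr

# SUID/SGID sweep: -perm /6000 matches if EITHER bit is set.
chmod 4755 "$demo/dropped_payload.sh"
find "$demo" -type f -perm /6000

# All three timestamps on one suspect file: Access, Modify, Change.
stat "$demo/dropped_payload.sh"

rm -rf "$demo"
```

Note the slash in -perm /6000: it means "any of these bits", whereas -perm 6000 would demand an exact match and miss a 4755 binary.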

The final piece of the puzzle, and often the trickiest, is persistence. I checked the usual suspects: crontab -l for root and every other user account I could find. Then I cast a wider net with grep -r "suspect_domain_or_ip" /etc/cron.* /etc/systemd/system/ /etc/rc.d/ and similar common boot directories. Sure enough, a new systemd timer unit had been added that was scheduled to execute the /tmp binary periodically. Finally, I didn't forget the user dotfiles (~/.bashrc, ~/.profile, etc.). It’s surprising how often an attacker will drop a malicious alias or command in there, assuming you won't dig deep into a developer's setup.
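The persistence sweep is really just grepping the usual autostart locations for an indicator. A sketch, with the documentation-range IP 203.0.113.66 standing in for the real address:

```shell
ioc="203.0.113.66"    # hypothetical indicator; substitute the real IP/domain

# Per-user crontabs (crontab -l -u needs root to read other users' tabs).
for u in $(cut -d: -f1 /etc/passwd); do
    if crontab -l -u "$u" 2>/dev/null | grep -q "$ioc"; then
        echo "persistence hit: crontab of $u"
    fi
done

# System cron dirs, systemd units/timers, legacy rc scripts.
grep -r "$ioc" /etc/cron* /etc/systemd/system /etc/rc*.d 2>/dev/null || true

# Recently added timers are worth eyeballing directly too.
systemctl list-timers --all 2>/dev/null | head -15 || true

# Shell init files per user: a malicious alias or one-liner hides well here.
grep -s "$ioc" /root/.bashrc /root/.profile /home/*/.bashrc /home/*/.profile || true
```

This only catches persistence that references the indicator you already know; a clean binary scheduled under an innocent name still needs the manual review of timers and dotfiles described above.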

Long story short, we quickly identified the ingress vector, isolated the compromise, and cleaned up the persistence. But what really stuck with me is how quickly you can triage and understand an incident if you're comfortable with these fundamental Linux commands. There's no substitute for getting your hands dirty and really understanding what strace is showing you or why ss is superior to netstat in a high-pressure situation. These tools are your best friends in a firefight.

I had a bit of an IT meltdown in the last couple of weeks.

I upgraded my home NAS with two 12 TB drives. Didn't take a backup of the config - just the data. Spent a few weeks rebuilding that and restoring accounts.

My home network, driven by a Ubiquiti DM, shat itself for no reason I can see. My son helped me re-cable, reset and reconfigure everything - including all the IoT devices in the house! It's incredible how many devices are connected!

Finally, I gave up LastPass and tried...something else. Now I'm back on LastPass, manually re-adding hundreds of unique passwords. I've forgiven them the hack and want to come back.

Overall, the three problems haven't been a lot of fun and, as the house IT admin, it's been good to have my son's skills and advice.

Note to self: back up settings as well as data.