Another day, another brunch, another #IrnBru “legend edition”. Today it’s the #unicorn, the national animal of #Scotland, and #UnicornTears flavour (some kind of berry flavour, possibly raspberry). Hopefully no unicorns were harmed or exploited in its production? #Slurm
At the risk of doing unpaid marketing for them, interesting to see that #IrnBru (a fizzy drink from #Scotland, one of very few countries in the world where a local drink outsells Big Cola) currently has a #NessieNectar flavour (vaguely fruity, less iron-y/ferric than normal). Hopefully no loch monsters were harmed or used in its production? #Slurm
A little tutorial on speeding up bioinformatic work on a computing cluster by converting your loops to array jobs
https://plantarum.ca/2025/01/24/array-jobs/
[edit: fixed link]
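For the gist of it, here's a minimal sketch of the loop-to-array conversion (the file paths and the `process` command are placeholders, not taken from the tutorial):

```bash
#!/bin/bash
# Before: a single job that works through every file serially, e.g.
#   for f in inputs/*.txt; do process "$f"; done

#SBATCH --array=0-99          # one task per input file (assumes 100 files)
#SBATCH --time=01:00:00

FILES=(inputs/*.txt)                       # expand the file list in each task
process "${FILES[$SLURM_ARRAY_TASK_ID]}"   # each task handles exactly one file
```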
On Linux, Slurm is also the name of a real-time, command-line network monitoring tool. It lets you monitor traffic on your network and displays the statistics as an ASCII graph, with three different graph types to choose from.
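A quick sketch of what that looks like (package name and flag are from memory, so double-check against your distro's docs):

```bash
sudo apt install slurm    # the network monitor, not the workload manager (slurm-wlm)
slurm -i eth0             # real-time ASCII graph of traffic on interface eth0
```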
Does anyone have a working #GitHub action to create a fake #SLURM cluster within a CI action?
(ours keeps failing, e.g. https://github.com/snakemake/snakemake-executor-plugin-slurm/actions/runs/12235923998/job/34193736191?pr=159)
The challenge today is to make #genAI convert a Dockerfile to an Apptainer/Singularity file.
I think I could abuse it quite heavily
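For reference, a hand-written sketch of what the conversion amounts to (the Dockerfile and the samtools package are made-up examples, not from the original post):

```
# Dockerfile (input):
#   FROM ubuntu:22.04
#   RUN apt-get update && apt-get install -y samtools
#   CMD ["samtools", "--version"]

# Rough Apptainer/Singularity definition equivalent:
Bootstrap: docker
From: ubuntu:22.04

%post
    apt-get update && apt-get install -y samtools

%runscript
    exec samtools --version "$@"
```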
@K_REY_C @doctormo As I read this I was literally just on a work call where somebody suggested that if our company were to financially support open source development, our employees would contribute to #foss and could design a GUI for #slurm to rival the GUI for #jenkins.
I laughed out loud.
The #payforfreesoftware movement can't just be about individuals donating coffee money to GitHub projects. If it doesn't incentivize corporate middle managers to pay for free software, the needle won't move.
I've been submitting single #HPC jobs that loop over long lists of #fastq sequence files in my #GBS pipeline. Using this approach, my current project was going to take 50+ days to get through just #ustacks.
Just figured out how to use #slurm job arrays, and it looks like I'll be done inside 24 hours!
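Roughly what that looks like for a list of fastq files (the array size, resources, list file and ustacks flags below are illustrative; check the Stacks manual for your own run):

```bash
#!/bin/bash
#SBATCH --array=1-96              # one task per sample in the list
#SBATCH --cpus-per-task=8
#SBATCH --mem=16G
#SBATCH --time=04:00:00

# Pull the Nth fastq path from a pre-built list, one file per array task.
FQ=$(sed -n "${SLURM_ARRAY_TASK_ID}p" fastq_list.txt)

ustacks -f "$FQ" -o stacks_out -i "$SLURM_ARRAY_TASK_ID" -p "$SLURM_CPUS_PER_TASK"
```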
Just made a release of the #SLURM executor plugin for #Snakemake. I added some stability features.
#HPC #ReproducibleResearch #dataanalysis #bioinformatics
1/n
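If you want to try it, the invocation looks roughly like this (the account/partition names and resource values are placeholders; see the plugin docs for the full set of options):

```bash
pip install snakemake-executor-plugin-slurm

snakemake --executor slurm --jobs 100 \
    --default-resources slurm_account=myaccount slurm_partition=compute mem_mb=4000 runtime=60
```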
#SLURM sometimes drives me crazy. Does anyone know a place to find old changelogs?
#Bioinformatics question: does #snakemake actually do anything that can't be done with a well-structured #bash script? Or, in my case, with a single #emacs #orgmode file that contains all my scripts, #slurm submissions and #rstats analysis, interleaved with discussion and links to sources?
Our #HPC cluster folks seem to think #snakemake is 'next level' for reproducibility, but I'm starting to think that's because they don't realize how much you can do with existing tools like Bash and Emacs.
@johnzajac Exactly the reason I snapped at a moron that invaded my space as I was refilling my #Slurm the other day at the convenience store. My only regret is there were a couple of kids in the store that heard me calling him an f-ing moron.
If anyone is interested in testing a backport of #SLURM 23.02 for Ubuntu 22.04 LTS, I published a PPA yesterday that contains the necessary Debian packages (I had to backport rocm-smi-lib too):
```bash
sudo add-apt-repository ppa:ubuntu-hpc/slurm-wlm-23.02
sudo apt install <slurm package>
```
I'm keen on making sure that it works for other people too, not just me.
Job keeps running out of memory and I keep increasing the memory request ... until I notice I've got a typo, so it's been running on the default 4GB every time ...
#slurm #programming #analysis
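A quick way to catch that sooner (the job ID below is a placeholder): compare the memory you thought you requested with what SLURM actually recorded. A typo in the `#SBATCH` prefix itself turns the directive into an ordinary comment, so the request silently falls back to the cluster default.

```bash
sacct -j <jobid> --format=JobID,ReqMem,MaxRSS,State   # requested vs. peak memory actually used
```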