Homelab, summer 2025
Picking up from the prelude, here’s where my homelab is at as of late summer 2025.
Hardware
The donated server motherboards I have are SuperMicro X10SLM-F or X9SCL+-F. They each have a 4-core Intel Xeon E3-1231 3.4GHz processor and 32GB of DDR3 RAM. I also got a handful of miscellaneous 500GB–2TB Western Digital hard disks.
These are dated enough that I get why they're no longer beefy enough to run a modern colo business on, so it makes sense to decommission them. And this was a generous donation, to be sure; I feel very fortunate to have this stuff.
But on the other hand, it is driving me absolutely bonkers that the economies of scale are such that it was probably not worth the colo owner’s time to try and sell these boards. I needed to buy a 2U case, a power supply, and 4 quiet Noctua fans to use them. For the same price as those parts you can find whole servers in your local marketplace app. So you don’t even need the donation.
We're in a weird and disappointing window, where we're drowning in hardware that is perfectly usable (and complete overkill for my use case), yet the previous owner was being entirely rational in saying "meh, I was gonna toss 'em, you can have 'em if you want".
My product and design brain is chewing on what kind of opportunity this presents, but we’re here to nerd out about homelab stuff.
Here’s my current setup, sitting just to the left of my desk in my office. With the above-mentioned fans it’s not noisy or distracting at all. I can hear a slight hum, about equivalent to the sound of crickets/wind/traffic coming through that window. And of course I love das blinkenlights.

Top to bottom:
- Retro ThinkCentre monitor (found at the side of the road)
- Eero Pro 6E mesh wifi router (hardwired Cat6 into house)
- 5- and 8-port Netgear switches
- Frankenbox
- Empty spot where a Raspberry Pi Zero 2 W used to be (moved it indoors)
- Backup server
- Blank
- Main server
Feel free to poke fun at my good-enough-for-now cable management, and the 3D printed shelf that’s not quite cutting it.
To keep this short, I'll save details on the individual servers and the networking for separate posts, coming soon. The rest of this post is about what's running on the rack.
Operating system
After briefly considering doing everything bare-metal on top of Debian, I landed on installing Proxmox VE.
This is also somewhat ridiculous, as Proxmox is full of enterprise IT deployment features. The words “datacenter”, “cluster”, “distributed storage”, and “high availability failover” belong nowhere near what I’m doing.
But, you can just ignore all that. Proxmox was quite simple to install and get started with, and it makes spinning up VMs trivial to do. It also gives nice overviews of network and hardware utilization, and makes managing disks easier (I bought a 240GB SSD for Proxmox to live on, and configured 4 x 1TB drives into a 2.8TB RAID).
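For what it's worth, 2.8TB usable out of 4 x 1TB drives is consistent with a single-parity array. Proxmox can set all of this up from the web UI, but as a sketch of what's happening underneath (assuming ZFS RAIDZ1, which is one of the options Proxmox offers; the pool name and device paths here are placeholders, not my actual config):

```shell
# Hypothetical: create a single-parity ZFS pool ("tank") from four 1TB disks.
# Device paths are placeholders; use your real /dev/disk/by-id/ entries.
zpool create -f tank raidz1 \
  /dev/disk/by-id/disk1 /dev/disk/by-id/disk2 \
  /dev/disk/by-id/disk3 /dev/disk/by-id/disk4

# Register the pool with Proxmox as storage for VM disks and containers
# (the storage ID "tank" is arbitrary).
pvesm add zfspool tank --pool tank --content images,rootdir
```

The nice part is that once the pool is registered, new VM disks land on it automatically; you never think about the individual drives again.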
Services
I'll spare you the details on my entire config (happy to share, though) and just summarize what I'm running at the moment.
Split across several VMs (each running Debian, with Docker and Portainer for easy installation and management) I have:
- Docmost — Wiki/document sharing
- Forgejo — Project/issue tracker (not git hosting yet)
- Opengist — Snippet storage
- Miniflux — RSS feed aggregator/reader
- Jellyfin — Media server
- Mafl — Dashboard
- Calibre — E-book management
- Radicale — Contact & calendar management
- Home Assistant — Smart home stuff (not really using)
- Raw SMB share to mount storage in Finder
- I briefly ran an LLM server with Ollama (worked fine as a proof of concept, but was of limited utility inside an underpowered VM)
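Each of these follows the same pattern: a Debian VM, Docker, and a compose file (deployed as a stack in Portainer). As an illustration, here's roughly what a Miniflux stack looks like; this is a generic sketch based on Miniflux's documented environment variables, and the passwords and host port are placeholders rather than my actual config:

```yaml
# Hypothetical compose file for Miniflux plus its Postgres database.
# Passwords and the host port are placeholders.
services:
  miniflux:
    image: miniflux/miniflux:latest
    ports:
      - "8080:8080"
    depends_on:
      - db
    environment:
      - DATABASE_URL=postgres://miniflux:changeme@db/miniflux?sslmode=disable
      - RUN_MIGRATIONS=1
      - CREATE_ADMIN=1
      - ADMIN_USERNAME=admin
      - ADMIN_PASSWORD=changeme-too
  db:
    image: postgres:16
    environment:
      - POSTGRES_USER=miniflux
      - POSTGRES_PASSWORD=changeme
      - POSTGRES_DB=miniflux
    volumes:
      - miniflux-db:/var/lib/postgresql/data

volumes:
  miniflux-db:
```

Once one of these is working, standing up the next service is mostly a matter of swapping images and environment variables, which is a big part of why the per-VM Docker + Portainer approach has been so low-friction.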
Thoughts so far
I've been chipping away at this, very much on the side as a series of projects that occasionally drop off my radar, for probably a little over a year.
I’m not fully running off this yet. There’s more I want to do before it’s usable and I feel confident, and I have a bunch of stuff to migrate and/or organize. But it’s getting very close to that point.
My takeaways so far, FWIW:
This has been 90% really fun. It's great to be learning new skills, and reconnecting to and leveraging old ones. If only for the renewed sense of empowerment and possibility, it has been well worth it.
The remaining 10% is my awareness that I am, essentially, voluntarily acting as a devops engineer and a network engineer, two whole separate jobs that most people consider work, for "fun". It does occasionally feel like: why am I cosplaying as a sysadmin? Why am I doing more work in my spare time? Why am I staring at a screen and a missing semicolon again? Maybe go outside for a bit? I am reminded that past this proof-of-concept point there is not just work, but risk and obligation. The idea of self-hosting feels empowering, even rebellious, but doing it for realz involves a lot more thought. That's probably a whole other post, but in the meantime, Drew Lyton's The Future is NOT Self-Hosted is full of good things to consider.
But I’m not going to let that stop me, so don’t let it stop you either. If you have some coding or command-line experience and can follow a decently-written “how to install x on y” article, then grab whatever spare bit of cheap or legacy hardware you can get your hands on and get started.